KR20150027634A - Method and Apparatus for scoring test - Google Patents

Method and Apparatus for scoring test Download PDF

Info

Publication number
KR20150027634A
KR20150027634A (application KR20130106303A)
Authority
KR
South Korea
Prior art keywords
evaluation
information
answer
item
image
Prior art date
Application number
KR20130106303A
Other languages
Korean (ko)
Inventor
임정환
이재준
권대형
Original Assignee
삼성전자주식회사
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 삼성전자주식회사 filed Critical 삼성전자주식회사
Priority to KR20130106303A priority Critical patent/KR20150027634A/en
Publication of KR20150027634A publication Critical patent/KR20150027634A/en

Links

Images

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06Q - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00 - Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q50/10 - Services
    • G06Q50/20 - Education
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis

Landscapes

  • Business, Economics & Management (AREA)
  • Tourism & Hospitality (AREA)
  • Engineering & Computer Science (AREA)
  • Human Resources & Organizations (AREA)
  • Primary Health Care (AREA)
  • Health & Medical Sciences (AREA)
  • Economics (AREA)
  • General Health & Medical Sciences (AREA)
  • Educational Administration (AREA)
  • Marketing (AREA)
  • Educational Technology (AREA)
  • Strategic Management (AREA)
  • Physics & Mathematics (AREA)
  • General Business, Economics & Management (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Electrically Operated Instructional Devices (AREA)

Abstract

A method for calculating an evaluation score is disclosed. The method comprises the steps of: acquiring a test paper image including at least one evaluation item, together with correct answer data for the at least one evaluation item; extracting, based on the test paper image, answer information for each evaluation item included in an evaluation target image; comparing the correct answer data with the answer information; and calculating an evaluation score for the evaluation target image based on the comparison result.

Description

METHOD AND APPARATUS FOR SCORING A TEST

The present invention relates to a method and an apparatus for calculating an evaluation score by analyzing scanned images of evaluation content.

In a typical scoring method, a candidate marks answers with a water-based pen in the designated boxes of an OMR (Optical Mark Recognition) card, subjective (short-answer or essay) answers are graded offline and the resulting scores are entered into a scoring system, and the final score is output by combining the subjective-answer scores with the multiple-choice scores.

Conventionally, the grader must manually grade the answers and then enter the results into the scoring system with a keyboard or mouse, which is time-consuming and prone to input errors.

An object of the present invention is to provide a method and an apparatus for calculating an evaluation score by analyzing scanned images of evaluation content.

An embodiment of the present invention provides a method for a device to calculate an evaluation score, comprising: obtaining a test paper image including at least one evaluation item, together with correct answer data for the at least one evaluation item; extracting, based on the test paper image, answer information for each evaluation item included in an evaluation target image; comparing the correct answer data with the answer information; and calculating an evaluation score for the evaluation target image based on the comparison result.

In the method, the extracting step may comprise: obtaining text information from the evaluation target image and reading information about each evaluation item; and obtaining handwriting information from the evaluation target image and reading the answer information for each evaluation item.

The method may further comprise: comparing the test paper image and the evaluation target image; and identifying, in the evaluation target image and based on the comparison result, at least one of identification information for each evaluation item included in the evaluation target image, view item information for each evaluation item, and answer area information for each evaluation item.

In the method, the comparing step may determine that the answer information is a correct answer if the correct answer data and the answer information match, and may determine that the answer information is an incorrect answer if they do not match.

The method may further comprise displaying the correct answer data for any evaluation item whose answer information is determined to be an incorrect answer.

The method may further comprise obtaining an evaluation score for each evaluation item based on the comparison result and a weight score set in advance for each evaluation item.

The method may further comprise extracting the candidate's identification information from the evaluation target image and transmitting the calculated evaluation score to the candidate's device based on the extracted identification information.

The method may further comprise providing the user with a user interface for inputting correct answer data for the at least one evaluation item included in the test paper image.

In the method, providing the user interface may comprise: displaying, on the test paper image and using the text information contained in the test paper image, a view item for the at least one evaluation item and an answer area for the at least one evaluation item; and receiving from the user an input signal for at least one of the view item and the answer area, wherein the correct answer data is generated based on the received input signal.

The method may further comprise acquiring at least one of the test paper image and the evaluation target image using a mobile device.

Another embodiment of the present invention provides an evaluation score calculating device comprising: an acquiring unit that acquires a test paper image including at least one evaluation item, together with correct answer data for the at least one evaluation item; a control unit that extracts, based on the test paper image, answer information for each evaluation item included in an evaluation target image; and a determination unit that compares the correct answer data with the answer information and calculates an evaluation score for the evaluation target image based on the comparison result.

In the device, the control unit may obtain text information from the evaluation target image to read information about each evaluation item, and may obtain handwriting information from the evaluation target image to read the answer information for each evaluation item.

In the device, the control unit may compare the test paper image and the evaluation target image and, based on the comparison result, identify in the evaluation target image at least one of identification information for each evaluation item, view item information for each evaluation item, and answer area information for each evaluation item.

In the device, the determination unit may determine that the answer information is a correct answer when the correct answer data and the answer information match, and determine that the answer information is an incorrect answer when they do not match.

In the device, the determination unit may display the correct answer data for any evaluation item whose answer information is determined to be an incorrect answer.

In the device, the determination unit may obtain an evaluation score for each evaluation item based on the comparison result and a weight score set in advance for each evaluation item.

In the device, the control unit may extract the candidate's identification information from the evaluation target image and transmit the calculated evaluation score to the candidate's device based on the extracted identification information.

In the device, the acquiring unit may further include a display unit that provides a user interface through which the user can input correct answer data for the at least one evaluation item included in the test paper image.

In the device, the display unit may display the view item for the at least one evaluation item and the answer area for the at least one evaluation item on the test paper image using the text information included in the test paper image, may receive from the user an input signal for at least one of the view item and the answer area, and the correct answer data may be generated based on the received input signal.

The evaluation score calculating device may acquire at least one of the test paper image and the evaluation target image using a mobile device.

FIG. 1 is a flowchart illustrating a method by which a device calculates an evaluation score according to an embodiment of the present invention.
FIG. 2A is a diagram illustrating a test paper image according to an embodiment of the present invention.
FIG. 2B is a diagram illustrating a process of acquiring correct answer data according to an embodiment of the present invention.
FIG. 3 is a diagram illustrating a process of acquiring correct answer data according to another embodiment of the present invention.
FIG. 4 is a diagram illustrating an evaluation target image according to an embodiment of the present invention.
FIG. 5 is a diagram illustrating an evaluation target image according to another embodiment of the present invention.
FIG. 6 shows a test paper scored using an evaluation score calculation method according to an embodiment of the present invention.
FIG. 7 is a diagram illustrating an evaluation target image according to an embodiment of the present invention.
FIG. 8 is a diagram illustrating a process of acquiring an image using an evaluation score calculating device according to an embodiment of the present invention.
FIG. 9 is a block diagram illustrating an evaluation score calculating device according to an embodiment of the present invention.

Hereinafter, embodiments of the present invention will be described in detail with reference to the accompanying drawings so that those skilled in the art can readily practice the invention. The present invention may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein. Parts not related to the description are omitted for clarity, and like parts are denoted by like reference numerals throughout the specification.

Throughout the specification, when a part is referred to as being "connected" to another part, this includes being "directly connected" as well as being "electrically connected" with another part in between. Also, when a part is described as "comprising" an element, this means that it may further include other elements rather than excluding them, unless specifically stated otherwise.

Hereinafter, the present invention will be described in detail with reference to the accompanying drawings.

FIG. 1 is a flowchart illustrating a method by which a device calculates an evaluation score according to an embodiment of the present invention. Here, the device may be an evaluation score calculating device capable of calculating an evaluation score.

In step 110, the device may obtain a test paper image containing at least one evaluation item, together with correct answer data for the at least one evaluation item.

Here, the test paper is an example of a printed matter used for an evaluation. Therefore, the evaluation score calculation method according to the embodiment of the present invention can be applied to all printed materials used for evaluation purposes.

According to one embodiment of the present invention, the device may provide a user interface through which the user can enter correct answer data for the at least one evaluation item contained in the test paper image. The user interface can be provided to the user in various forms.

For example, one of the user interfaces for inputting correct answer data is an input window in which an identification number for the at least one evaluation item and a correct answer display area corresponding to each evaluation item are displayed. The user can input correct answer data into the correct answer display area using an input device such as a keyboard.

Alternatively, the user interface may present the test paper image itself so that the user can mark the correct answers directly on it. Specifically, the device can acquire the text information contained in the test paper image by applying an OCR (Optical Character Recognition) function to the acquired test paper image. Here, the text information may include characters, numbers, and symbols.
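As an illustration of this OCR step (not part of the application itself), the Python sketch below extracts the page text and word-level bounding boxes from a scanned test paper; the pytesseract and Pillow dependencies, the "kor+eng" language setting, and the file name are assumptions, not something stated in the application.

    from PIL import Image
    import pytesseract

    def extract_text_with_boxes(image_path):
        """Return full OCR text plus word-level bounding boxes for a scanned test paper."""
        image = Image.open(image_path)
        # Plain text of the whole page (characters, numbers, symbols).
        full_text = pytesseract.image_to_string(image, lang="kor+eng")
        # Word-level boxes let later steps locate item numbers, view items, and answer areas.
        boxes = pytesseract.image_to_data(image, lang="kor+eng",
                                          output_type=pytesseract.Output.DICT)
        return full_text, boxes

    if __name__ == "__main__":
        text, data = extract_text_with_boxes("test_paper.png")  # hypothetical file name
        print(text[:200])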

Based on the text information obtained from the test paper image, the device can recognize at least one of the view items (the multiple-choice options) for the at least one evaluation item and the answer area for the at least one evaluation item. The device may display at least one of the recognized view items and the answer area on the test paper image to form a user interface for inputting correct answer data.

The device may receive from the user an input signal for at least one of a view item for the at least one evaluation item and an answer area for the at least one evaluation item. According to an embodiment of the present invention, the correct answer data may be generated based on the received input signal.

The user can mark the correct answer on a view item of an evaluation item included in the test paper image provided through the user interface. In addition, the user may write the correct answer in the answer area of the test paper image. The device can generate the correct answer data based on at least one of the view item marked by the user and the correct answer written by the user.

According to another embodiment of the present invention, the device can obtain the correct answer data from an external device in which the correct answer data is stored. The device may obtain the correct answer data from the external device through a wireless communication interface or a wired communication interface. For example, wireless communication interfaces such as an infrared communication interface and a Bluetooth communication interface may be used.

In step 120, the device can extract, based on the test paper image, answer information for each evaluation item included in the evaluation target image. Here, the answer information may include the information that the candidate writes as an answer to each evaluation item.

The device can obtain text information from the image to be evaluated and read information about each evaluation item to extract the answer information. The information on each evaluation item according to an embodiment of the present invention may include identification information on the evaluation item, view item information on each evaluation item, and answer area information on each evaluation item.

For example, the device can extract text information for each evaluation item contained in the evaluation target image by using the OCR function. Based on the text information, the device can identify, in the evaluation target image, at least one of identification information for each evaluation item, view item information for each evaluation item, and answer area information for each evaluation item.

Here, the identification information for an evaluation item may include the item number of the evaluation item. However, the item number is only one example of the identification information, and the identification information is not limited thereto. For example, using the text information for each evaluation item, words of a predetermined length included in the evaluation item may be used as its identification information.
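For instance, item numbers could be pulled out of the OCR text with a simple pattern match, as in the illustrative sketch below; the assumption that item numbers appear at the start of a line followed by "." or ")" is not stated in the application.

    import re

    # Assumed convention: item numbers such as "20." or "20)" begin a line of OCR text.
    ITEM_PATTERN = re.compile(r"^\s*(\d{1,3})[.)]", re.MULTILINE)

    def find_item_numbers(ocr_text):
        """Return the question numbers recognized in the OCR text, in page order."""
        return [int(m.group(1)) for m in ITEM_PATTERN.finditer(ocr_text)]

    # Example: find_item_numbers("20. Choose the correct answer ...") -> [20]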

Meanwhile, according to an embodiment of the present invention, the device can acquire the handwriting information from the evaluation target image and read the answer information for each evaluation item. Specifically, the device can recognize the answer information created by the candidate for each evaluation item using the handwriting recognition function.

Based on the read evaluation item information and the recognized handwriting, the device can generate the answer information that the candidate wrote for the evaluation target image.

For example, assume that the candidate wrote answer 3 for multiple-choice item 20. Through the text information extracted from the evaluation target image using the OCR function, the device can confirm that the number of this evaluation item is 20. For a multiple-choice item, the candidate may mark the answer on a view item or write the answer directly in the answer field. Using the handwriting recognition function, the device can recognize at least one of the answer marked on a view item and the answer written in the answer area. In this example, the device can confirm through the handwriting recognition function that the answer the candidate wrote for evaluation item 20 is 3.
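A minimal sketch of how the two recognition results might be combined into per-item answer records follows; `recognize_handwriting` is a placeholder for whatever handwriting-recognition engine is used, not a real API, and the data layout is an assumption for illustration.

    from dataclasses import dataclass

    @dataclass
    class AnswerRecord:
        item_number: int   # read from the printed question number via OCR
        answer: str        # read from the candidate's marking via handwriting recognition

    def recognize_handwriting(region_image):
        """Placeholder for a handwriting-recognition call on one answer region (not a real API)."""
        raise NotImplementedError

    def build_answer_records(item_regions):
        """item_regions maps an OCR-identified item number to its cropped answer-region image."""
        return [AnswerRecord(number, recognize_handwriting(region))
                for number, region in item_regions.items()]

    # e.g. build_answer_records({20: region_20}) would yield [AnswerRecord(20, "3")]
    # once a real recognizer is plugged in.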

According to another embodiment of the present invention, when the evaluation item is a subjective (short-answer or essay) type, the device can recognize the answer written in the answer area of the subjective evaluation item using the handwriting recognition function.

Meanwhile, the device can extract more accurate answer information by also using the test paper image obtained in step 110. For example, the candidate may write the answer on top of the evaluation item or one of its view items. If the device applies the OCR function or the handwriting recognition function only to the evaluation target image, it may be difficult to extract answer information that is superimposed on the evaluation item or a view item.

For example, if the candidate marks an answer by filling in the number of a view item for a multiple-choice item, it may be difficult to read that view item number even when the OCR function is applied. The device can read the number of the filled-in view item by comparing the evaluation target image with the test paper image, which has no answers marked on it. For example, if view item 3 of item 20 is filled in on the evaluation target image, the device can read the text at the same position in the test paper image and thereby determine that the number of the filled-in view item is 3.

In other words, if evaluation item information or answer information cannot be recognized directly, the device can still obtain it by comparing the test paper image with the evaluation target image.
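One plausible way to realize this blank-versus-answered comparison is a per-region image difference, sketched below with OpenCV; it assumes the two scans are already aligned and that the choice-region boxes were located in the earlier layout step, neither of which is specified in the application.

    import cv2
    import numpy as np

    def most_marked_choice(blank_path, answered_path, choice_regions):
        """choice_regions: {choice_number: (x, y, w, h)} boxes around each printed view item."""
        blank = cv2.imread(blank_path, cv2.IMREAD_GRAYSCALE)
        answered = cv2.imread(answered_path, cv2.IMREAD_GRAYSCALE)
        diff = cv2.absdiff(blank, answered)                   # pixels added by the candidate
        _, diff = cv2.threshold(diff, 40, 255, cv2.THRESH_BINARY)
        marked_pixels = {}
        for choice, (x, y, w, h) in choice_regions.items():
            marked_pixels[choice] = int(np.count_nonzero(diff[y:y + h, x:x + w]))
        # The view item whose region changed the most is taken as the marked answer.
        return max(marked_pixels, key=marked_pixels.get)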

In step 130, the device may compare the correct answer data obtained in step 110 with the answer information obtained in step 120. If the correct answer data and the answer information for a given evaluation item match, the device can determine that the answer to that item is correct. If they do not match, the device can determine that the answer to that item is incorrect.

In step 140, the device may calculate an evaluation score for the evaluation target image based on the comparison result of step 130. The device may also display the correct answer data for any evaluation item whose answer was determined to be incorrect in step 130.

According to one embodiment of the present invention, the device can display the correct answer in an area designated by the user for the evaluation item. For example, the user can designate different areas for writing the correct answer depending on whether the evaluation item is a multiple-choice item or a subjective item.

In addition, by extracting correct/incorrect results for a given evaluation item across one or more evaluation target images, the device can calculate the percentage of correct answers for that evaluation item.

According to one embodiment of the present invention, the device may obtain an evaluation score for each evaluation item based on the comparison result of step 130 and a weight score set in advance for each evaluation item. According to an embodiment of the present invention, each evaluation item included in the evaluation target image may have a different weight score. For example, the weight score may differ according to the difficulty of the evaluation item, or according to whether the evaluation item is a multiple-choice item or a subjective item.

The device can calculate the evaluation score for the evaluation target image by adding up the evaluation scores obtained for each evaluation item included in the evaluation target image.
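The comparison and weighted summation of steps 130 and 140 reduce to a few lines, sketched below; the weight table and the default weight of one point per item are illustrative assumptions.

    def score_paper(answer_key, answers, weights):
        """answer_key/answers: {item_number: answer string}; weights: {item_number: points}."""
        per_item = {}
        total = 0.0
        for item, correct in answer_key.items():
            is_correct = answers.get(item) == correct
            per_item[item] = is_correct
            if is_correct:
                total += weights.get(item, 1.0)  # default of 1 point per item is an assumption
        return total, per_item

    # Example mirroring items 20 and 21 above: item 20 correct, item 21 incorrect.
    total, per_item = score_paper({20: "4", 21: "4"}, {20: "4", 21: "3"}, {20: 2.0, 21: 3.0})
    # total == 2.0, per_item == {20: True, 21: False}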

According to an embodiment of the present invention, the device may extract the candidate's identification information from the evaluation target image. The device can transmit the calculated evaluation score to the candidate's device based on the extracted identification information of the candidate.

For example, the device may store address information for candidates' devices in advance. The device can look up the address information corresponding to the extracted identification information of the candidate. Here, the address information may include MAC (Media Access Control) address information. The device can provide the evaluation score to the candidate by transmitting the calculated evaluation score to the looked-up address.

Meanwhile, according to an embodiment of the present invention, the device performing the evaluation score calculation method may be a mobile device. The mobile device can scan both the blank test paper and the test paper on which the candidate has written answers, thereby obtaining the test paper image and the evaluation target image.

FIG. 2A is a diagram illustrating a test paper image 200 according to an embodiment of the present invention. Referring to FIG. 2A, the test paper image 200 displays at least one evaluation item, view items 220, 225, 230, 235, 240, 255, 260, 265, 270, and 275 for the at least one evaluation item, and answer areas 245 and 280 for the at least one evaluation item.

The evaluation score calculating device according to an embodiment of the present invention can identify, in the test paper image 200, identification information for the at least one evaluation item, view item information for the at least one evaluation item, and answer area information for the at least one evaluation item. For example, OCR may be applied to the test paper image 200, and each piece of information may be identified in the test paper image 200 using the extracted text information.

FIG. 2B is a diagram for explaining a process of acquiring correct answer data according to an embodiment of the present invention. The score calculating device can obtain correct answer data for at least one evaluation item indicated in the test paper image. According to an embodiment of the present invention, the evaluation score calculation device may provide a user interface by which the user can input correct answer data for at least one evaluation item included in the test paper image. The user interface may be provided in various forms.

Referring to FIG. 2B, one example of the user interface according to an embodiment of the present invention is an input window 290 displaying an identification number area 292 for the at least one evaluation item and a correct answer notation area 294 for the at least one evaluation item. The user can input the correct answer into the correct answer notation area 294 using an input device such as a keyboard. The correct answers input by the user into the correct answer notation area 294 can be stored in the evaluation score calculating device as correct answer data.

FIG. 3 is a diagram illustrating a process of obtaining correct answer data according to another embodiment of the present invention. The user interface for obtaining correct answer data may present the test paper image itself so that the user can mark the correct answers directly on it. Specifically, the evaluation score calculating device can acquire the text information contained in the test paper image by applying the OCR function to the obtained test paper image. Here, the text information may include characters, numbers, and symbols.

Based on the text information acquired from the test paper image, the evaluation score calculating device can recognize at least one of the identification information for the at least one evaluation item contained in the test paper image, the view items for the at least one evaluation item, and the answer area for the at least one evaluation item. The evaluation score calculating device may display at least one of the recognized identification information, view items, and answer area on the test paper image to form a user interface for inputting the correct answer data.

Specifically, the evaluation score calculating device can display, on the test paper image 300, evaluation item information 210 and 250, view items 220, 225, 230, 235, 240, 255, 260, 265, 270, and 275, and answer areas 245 and 280. Here, the test paper image 300 is the same as the test paper image 200 of FIG. 2A.

The evaluation score calculating device may receive from the user an input signal for at least one of the view items 220, 225, 230, 235, 240, 255, 260, 265, 270, and 275 and the answer areas 245 and 280 for the at least one evaluation item. According to an embodiment of the present invention, the correct answer data may be generated based on the received input signal.

The user can transmit an input signal for generating the correct answer data to the evaluation score calculating device by marking the correct answer on the test paper image 300 through the user interface. The user can mark the correct answer on the test paper image 300 using an input device such as a keyboard or a mouse. Alternatively, the user may provide the input signal through a touch input on the screen of the evaluation score calculating device.

Specifically, the user can write the correct answer, 4, in the answer area 245 for evaluation item 20 included in the test paper image 300. Alternatively, the user can input the correct answer data by marking the fourth view item 235 among the view items 220, 225, 230, 235, and 240 for evaluation item 20.

Similarly, the user can write the correct answer, 4, in the answer area 280 for evaluation item 21 included in the test paper image 300, or can input the correct answer by marking the fourth view item 270 among the view items 255, 260, 265, 270, and 275 for evaluation item 21.

The evaluation score calculating device can store the correct answers input by the user as correct answer data and use them as the basis for calculating the evaluation score of the evaluation target image.

FIG. 4 is a diagram illustrating an evaluation target image 400 according to an embodiment of the present invention.

Referring to FIG. 4, the evaluation target image 400 displays at least one evaluation item, view items 415, 420, 425, 430, 435, 455, 460, 465, 470, and 475 for the at least one evaluation item, and answer areas 440 and 480 for the at least one evaluation item.

In the evaluation target image 400, 4 is marked as the answer to evaluation item 20 (410), and 3 is marked as the answer to evaluation item 21 (450). The answer information displayed on the evaluation target image 400 was written by the candidate.

The evaluation score calculating method according to an embodiment of the present invention will be described in detail with reference to Figs. 2A, 2B, 3 and 4.

Referring to FIG. 2A, the evaluation score calculating device can identify, using the text information included in the test paper image 200, identification information for the at least one evaluation item, view item information for the at least one evaluation item, and answer area information for the at least one evaluation item. For example, the evaluation score calculating device can identify that the item number of a given evaluation item is 20, and can identify the view items of evaluation item 20 (view 1 (220), view 2 (225), view 3 (230), view 4 (235), and view 5 (240)). In addition, the evaluation score calculating device can identify the answer area 245 of evaluation item 20.

The evaluation score calculating device can obtain from the user the correct answer data indicating that the answer to evaluation item 20 is 4. To do so, the evaluation score calculating device may provide at least one of a test paper image on which the view items and answer areas are displayed to the user, and an input window 290 with an identification number area 292 and a correct answer notation area 294 into which the correct answer for the at least one evaluation item can be entered.

The evaluation score calculating device can compare the correct answer data acquired from the user with the answer information extracted from the evaluation target image 400. Referring to FIG. 4, by applying the OCR function and the handwriting recognition function to the evaluation target image 400, the evaluation score calculating device can obtain answer information indicating that the candidate's answer to evaluation item 20 is 4 and that the candidate's answer to evaluation item 21 is 3.

According to an embodiment of the present invention, the evaluation score calculating device can compare the correct answer data and the answer information. As a result of the comparison, since the correct answer data and the answer information for evaluation item 20 are both 4, the evaluation score calculating device can determine that the answer written by the candidate for evaluation item 20 is correct.

On the other hand, since the correct answer data for evaluation item 21 is 4 while the answer information is 3, the correct answer data and the answer information do not match; therefore, the evaluation score calculating device can determine that the answer written by the candidate for evaluation item 21 is incorrect.

FIG. 5 is a diagram showing an image 500 to be evaluated according to another embodiment of the present invention.

The evaluation score calculating device according to an embodiment of the present invention can acquire text information from the evaluation target image and read information about each evaluation item included in the evaluation target image. The information about each evaluation item may include identification information for the evaluation item, view item information for the evaluation item, and answer area information for the evaluation item. For example, the evaluation score calculating device can acquire the text information of the evaluation target image using the OCR function.

Referring to FIG. 5, the evaluation score calculating device can obtain the item numbers 2 (505) and 3 (525) by using the OCR function. Further, based on the acquired text information, the evaluation score calculating device can read information about the view items and the answer areas in the evaluation target image.

More specifically, using preset text information, the evaluation score calculating device can determine that a number written inside a rectangular figure is the number of a view item. Likewise, according to the preset text information, when parentheses are present the evaluation score calculating device can determine that the area between the parentheses is an answer area.

According to an embodiment of the present invention, the user can change the preset text information to match the characteristics of the test paper. For example, if the view item numbers on a given test paper are printed as numbers inside circles, the preset information for view items can be changed accordingly. With the changed information, when the evaluation score calculating device recognizes a number written inside a circle in the test paper image, it can determine that the recognized area is a view item.
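Such user-adjustable layout rules could be represented as a small pattern table applied to the OCR output, as in the sketch below; the specific regular expressions for boxed numbers and empty parentheses are illustrative assumptions, not patterns given in the application.

    import re

    DEFAULT_LAYOUT_RULES = {
        # Illustrative assumption: OCR renders a boxed choice number as "[3]" and an
        # empty answer blank as "(   )".
        "view_item": re.compile(r"\[(\d)\]"),
        "answer_area": re.compile(r"\(\s*\)"),
    }

    def find_layout(ocr_line, rules=DEFAULT_LAYOUT_RULES):
        """Return the view-item numbers and whether an answer area appears on one OCR line."""
        choices = [int(m.group(1)) for m in rules["view_item"].finditer(ocr_line)]
        has_answer_area = rules["answer_area"].search(ocr_line) is not None
        return choices, has_answer_area

    # A paper that prints circled numbers could override rules["view_item"] with a
    # pattern matching however the OCR engine renders those circles.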

The evaluation score calculating device can determine, based on the acquired text information, that the item number 505 of the evaluation item displayed on the evaluation target image 500 is 2. In addition, the evaluation score calculating device can identify the answer area 510 of evaluation item 2 in the evaluation target image 500. Specifically, because the area between parentheses is preset to be an answer area, the evaluation score calculating device can identify the parenthesized area as the answer area 510 of the evaluation item.

By applying the handwriting recognition function to the answer area 510 of evaluation item 2, the evaluation score calculating device can extract answer information 522 from the answer 520 written by the candidate for evaluation item 2.

In addition, the evaluation score calculating device can identify the answer area 530 of evaluation item 3 in the evaluation target image 500, as well as the view items 550, 555, 560, 567, and 570 of evaluation item 3. Specifically, the evaluation score calculating device can determine that a number written inside a square figure is a view item number.

By applying the handwriting recognition function to the answer area 530 and the view items 550, 555, 560, 567, and 570 of evaluation item 3, the evaluation score calculating device can extract answer information 542 from the answer 540 written by the candidate for evaluation item 3.

FIG. 6 is a test paper scored using an evaluation score calculation method according to an embodiment of the present invention.

Referring to FIG. 6, the test paper is marked with grading marks and per-item evaluation scores.

According to an embodiment of the present invention, the evaluation score calculating device can automatically calculate the evaluation score from an evaluation target image 600 on which grading marks are written. Here, a grading mark may be written on the evaluation target image 600 for each evaluation item according to whether the answer to that item is correct.

The evaluation score calculating device can use preset information about grading marks. For example, the user can set information indicating that a circle mark denotes a correct answer and a slash mark denotes an incorrect answer.

The evaluation score calculating device can detect, in the extracted regions, marks whose similarity to a preset grading mark exceeds a predetermined value, where the similarity indicates the degree of match between the predefined grading mark and a mark appearing in an extracted region. The evaluation score calculating device can then calculate the evaluation score for the evaluation target image based on the number of detected marks.

Using the preset grading mark information, the evaluation score calculating device can confirm that four circle marks 610 and one slash mark 620 are displayed in the evaluation target image 600 shown in FIG. 6. The evaluation score calculating device can therefore conclude that the answers to four evaluation items are correct and the answer to one evaluation item is incorrect.
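A mark-counting step along these lines could be sketched with template matching, as below; the circle and slash template images, the 0.7 similarity threshold, the 10-pixel de-duplication radius, and the equal points-per-item value are all assumptions made for illustration.

    import cv2
    import numpy as np

    def count_marks(sheet_path, template_path, threshold=0.7):
        """Count regions of the scanned sheet that closely match a grading-mark template."""
        sheet = cv2.imread(sheet_path, cv2.IMREAD_GRAYSCALE)
        template = cv2.imread(template_path, cv2.IMREAD_GRAYSCALE)
        result = cv2.matchTemplate(sheet, template, cv2.TM_CCOEFF_NORMED)
        ys, xs = np.where(result >= threshold)
        hits = []
        for y, x in zip(ys, xs):
            # Crude de-duplication: hits within 10 px of an earlier hit count as the same mark.
            if all(abs(y - hy) > 10 or abs(x - hx) > 10 for hy, hx in hits):
                hits.append((y, x))
        return len(hits)

    def score_from_marks(sheet_path, points_per_item=20):
        correct = count_marks(sheet_path, "circle_mark.png")   # hypothetical template images
        incorrect = count_marks(sheet_path, "slash_mark.png")
        return correct * points_per_item, correct, incorrect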

FIG. 7 is a diagram showing an evaluation target image 700 according to an embodiment of the present invention.

Referring to FIG. 7, an examination number area 710 and a name area 720 are displayed in the evaluation target image 700. By applying the OCR function and the handwriting recognition function to the evaluation target image 700, the evaluation score calculating device can extract the candidate's identification information written in the examination number area 710 and the name area 720.

The evaluation score calculating device can transmit the calculated evaluation score to the candidate's device based on the extracted identification information of the candidate. For example, the evaluation score calculating device may store the address information of the candidate's device in advance and can look up the address information corresponding to the extracted identification information. Here, the address information may include MAC address information. The evaluation score calculating device can provide the evaluation result to the candidate by transmitting the calculated evaluation score to the looked-up address.

FIG. 8 is a diagram illustrating a process of acquiring an image using an evaluation score calculating device according to an embodiment of the present invention.

Referring to FIG. 8, the user can easily obtain the test paper image and the evaluation target image by using the evaluation score calculating device 820 together with the holder 830.

When the user scans the test paper while holding the evaluation score calculating device 820 by hand, the device may shake or the scanning direction may be unstable, so that it may be difficult to obtain a clear image.

According to one embodiment of the present invention, the holder 830 can adjust the direction, angle, and distance of the evaluation score calculating device 820 relative to the test paper.

Meanwhile, according to an embodiment of the present invention, the evaluation score calculating device 820 may be a mobile device. The mobile evaluation score calculating device can scan both the blank test paper and the test paper on which the candidate has written answers, thereby obtaining the test paper image and the evaluation target image.

FIG. 9 is a block diagram illustrating an evaluation score calculating device 900 according to an embodiment of the present invention.

Referring to FIG. 9, the evaluation score calculating device 900 according to an embodiment of the present invention may include an acquisition unit 910, a control unit 920, and a determination unit 930.

Only the components related to the present embodiment are shown for the evaluation score calculating device 900 in FIG. 9. Those skilled in the art will understand that other general-purpose components may be further included in addition to the components shown in FIG. 9.

The acquisition unit 910 may acquire a test paper image including at least one evaluation item, together with correct answer data for the at least one evaluation item. According to an embodiment of the present invention, the acquisition unit 910 may further include a display unit that provides a user interface through which the user can input correct answer data for the at least one evaluation item included in the test paper image.

Using the text information included in the test paper image, the display unit may display a view item for the at least one evaluation item and an answer area for the at least one evaluation item on the test paper image. The display unit may also receive from the user an input signal for at least one of the view item and the answer area. Here, the correct answer data may be generated based on the received input signal.

The control unit 920 can extract answer information for each evaluation item included in the evaluation target image to be evaluated based on the test paper image. The control unit 920 may acquire text information from the evaluation target image, read information about each evaluation item, acquire handwriting information from the evaluation target image, and read answer information for each evaluation item.

The control unit 920 according to an embodiment of the present invention can compare the test paper image and the evaluation target image and, based on the comparison result, identify in the evaluation target image at least one of the identification information for each evaluation item included in the evaluation target image, the view item information for each evaluation item, and the answer area information for each evaluation item.

Further, the control unit 920 can extract the candidate's identification information from the evaluation target image. The control unit 920 can transmit the calculated evaluation score to the candidate's device based on the extracted identification information of the candidate.

The determination unit 930 may compare the correct answer data and the answer information, and may calculate the evaluation score for the evaluation target image based on the comparison result. The determination unit 930 determines that the answer information is a correct answer if the correct answer data and the answer information match, and determines that the answer information is an incorrect answer if they do not match.

The determination unit 930 can display the correct answer data for any evaluation item whose answer information is determined to be an incorrect answer.

The determination unit 930 according to an embodiment of the present invention can obtain the evaluation score for each evaluation item based on the comparison result between the correct answer data and the answer information and a weight score set in advance for each evaluation item. The determination unit 930 may display the obtained evaluation score on the evaluation target image.

Meanwhile, the evaluation score calculating device 900 according to an embodiment of the present invention can acquire at least one of the test paper image and the evaluation target image using a mobile device.

An apparatus according to the present invention may include a processor, a memory for storing and executing program data, permanent storage such as a disk drive, a communication port for communicating with an external device, and user interface devices such as a touch panel, keys, and buttons. Methods implemented as software modules or algorithms may be stored on a computer-readable recording medium as computer-readable code or program instructions executable on the processor. The computer-readable recording medium may be a magnetic storage medium (e.g., ROM, RAM, floppy disk, or hard disk) or an optically readable medium (e.g., CD-ROM or DVD (Digital Versatile Disc)). The computer-readable recording medium may also be distributed over networked computer systems so that the computer-readable code is stored and executed in a distributed manner. The medium is readable by a computer, can be stored in memory, and can be executed on a processor.

All documents cited in the present invention, including publications, patent applications, and patents, are incorporated herein by reference to the same extent as if each cited document were individually and specifically indicated to be incorporated by reference and were set forth in its entirety herein.

To facilitate understanding of the present invention, reference has been made to the preferred embodiments shown in the drawings, and specific terminology has been used to describe them. However, the present invention is not limited by this specific terminology, and may include all elements commonly conceivable by those skilled in the art.

The present invention may be described in terms of functional block configurations and various processing steps. These functional blocks may be implemented by any number of hardware and/or software configurations that perform particular functions. For example, the present invention may employ integrated circuit configurations, such as memory, processing, logic, and look-up tables, that can perform various functions under the control of one or more microprocessors or other control devices. Just as the components of the present invention may be implemented with software programming or software components, the present invention may be implemented in a programming or scripting language such as C, C++, Java, or assembler, including various algorithms implemented with data structures, processes, routines, or other combinations of programming constructs. Functional aspects may be implemented as algorithms running on one or more processors. The present invention may also employ conventional techniques for electronic environment setup, signal processing, and/or data processing. Terms such as "mechanism", "element", "means", and "configuration" are used broadly and are not limited to mechanical or physical configurations; they may include a series of software routines in conjunction with a processor or the like.

The specific implementations described in the present invention are examples and do not limit the scope of the invention in any way. For brevity, descriptions of conventional electronic configurations, control systems, software, and other functional aspects of such systems may be omitted. The lines or connecting members between the components shown in the figures illustrate functional connections and/or physical or circuit connections; in an actual apparatus they may be replaced by, or supplemented with, various alternative or additional functional, physical, or circuit connections. Furthermore, unless an element is specifically described with terms such as "essential" or "important", it may not be a necessary component for the application of the present invention.

In the specification of the present invention (and particularly in the claims), the term "the" and similar referential terms may cover both the singular and the plural. When a range is described, the invention includes each individual value falling within that range (unless otherwise contradicted), as if each individual value were set forth in the detailed description. The steps constituting a method according to the invention may be performed in any suitable order unless an order is explicitly stated or is otherwise contrary to the description; the invention is not limited to the described order of the steps. The use of any and all examples or exemplary language herein is intended merely to describe the invention in detail and does not limit the scope of the invention unless otherwise claimed. Those skilled in the art will appreciate that various modifications, combinations, and changes can be made according to design conditions and factors within the scope of the appended claims or their equivalents.

900: Evaluation score calculating device
910: Acquisition unit
920: Control unit
930: Determination unit

Claims (21)

1. A method for a device to calculate an evaluation score, the method comprising:
obtaining a test paper image including at least one evaluation item and correct answer data for the at least one evaluation item;
extracting, based on the test paper image, answer information for each evaluation item included in an evaluation target image to be evaluated;
comparing the correct answer data with the answer information; and
calculating an evaluation score for the evaluation target image based on the comparison result.
2. The method of claim 1, wherein the extracting comprises:
obtaining text information from the evaluation target image and reading information about each evaluation item; and
obtaining handwriting information from the evaluation target image and reading the answer information for each evaluation item.
3. The method of claim 1, further comprising:
comparing the test paper image and the evaluation target image; and
identifying, in the evaluation target image based on the comparison result, at least one of identification information for each evaluation item included in the evaluation target image, view item information for each evaluation item, and answer area information for each evaluation item.
4. The method of claim 1, wherein the comparing comprises:
determining that the answer information is a correct answer if the correct answer data and the answer information match, and determining that the answer information is an incorrect answer if the correct answer data and the answer information do not match.
5. The method of claim 4, further comprising:
displaying the correct answer data for a predetermined evaluation item for which the answer information is determined to be an incorrect answer.
6. The method of claim 1, further comprising:
obtaining an evaluation score for each evaluation item based on the comparison result and a weight score set in advance for each evaluation item.
7. The method of claim 1, further comprising:
extracting the candidate's identification information from the evaluation target image; and
transmitting the calculated evaluation score to the candidate's device based on the extracted identification information of the candidate.
8. The method of claim 1, further comprising:
providing a user with a user interface for inputting correct answer data for the at least one evaluation item included in the test paper image.
9. The method of claim 8, wherein providing the user interface comprises:
displaying, on the test paper image and using text information included in the test paper image, a view item for the at least one evaluation item and an answer area for the at least one evaluation item; and
receiving from the user an input signal for at least one of the view item for the at least one evaluation item and the answer area for the at least one evaluation item,
wherein the correct answer data is generated based on the received input signal.
10. The method of claim 1, further comprising:
obtaining at least one of the test paper image and the evaluation target image using a mobile device.
11. An evaluation score calculating device comprising:
an acquiring unit configured to obtain a test paper image including at least one evaluation item and correct answer data for the at least one evaluation item;
a control unit configured to extract, based on the test paper image, answer information for each evaluation item included in an evaluation target image to be evaluated; and
a determination unit configured to compare the correct answer data with the answer information and calculate an evaluation score for the evaluation target image based on the comparison result.
12. The device of claim 11, wherein the control unit obtains text information from the evaluation target image to read information about each evaluation item, and obtains handwriting information from the evaluation target image to read the answer information for each evaluation item.
13. The device of claim 11, wherein the control unit compares the test paper image and the evaluation target image and, based on the comparison result, identifies in the evaluation target image at least one of identification information for each evaluation item included in the evaluation target image, view item information for each evaluation item, and answer area information for each evaluation item.
14. The device of claim 11, wherein the determination unit determines that the answer information is a correct answer when the correct answer data and the answer information match, and determines that the answer information is an incorrect answer when the correct answer data and the answer information do not match.
15. The device of claim 14, wherein the determination unit displays the correct answer data for a predetermined evaluation item for which the answer information is determined to be an incorrect answer.
16. The device of claim 11, wherein the determination unit obtains an evaluation score for each evaluation item based on the comparison result and a weight score set in advance for each evaluation item.
17. The device of claim 11, wherein the control unit extracts the candidate's identification information from the evaluation target image and transmits the calculated evaluation score to the candidate's device based on the extracted identification information of the candidate.
18. The device of claim 11, further comprising a display unit configured to provide a user with a user interface for inputting correct answer data for the at least one evaluation item included in the test paper image.
19. The device of claim 18, wherein the display unit displays, on the test paper image and using text information included in the test paper image, a view item for the at least one evaluation item and an answer area for the at least one evaluation item, and receives from the user an input signal for at least one of the view item for the at least one evaluation item and the answer area for the at least one evaluation item,
wherein the correct answer data is generated based on the received input signal.
20. The device of claim 11, wherein the device obtains at least one of the test paper image and the evaluation target image using a mobile device.
21. A computer-readable recording medium having recorded thereon a program for causing a computer to execute the method of any one of claims 1 to 10.
KR20130106303A 2013-09-04 2013-09-04 Method and Apparatus for scoring test KR20150027634A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR20130106303A KR20150027634A (en) 2013-09-04 2013-09-04 Method and Apparatus for scoring test

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR20130106303A KR20150027634A (en) 2013-09-04 2013-09-04 Method and Apparatus for scoring test

Publications (1)

Publication Number Publication Date
KR20150027634A true KR20150027634A (en) 2015-03-12

Family

ID=53022921

Family Applications (1)

Application Number Title Priority Date Filing Date
KR20130106303A KR20150027634A (en) 2013-09-04 2013-09-04 Method and Apparatus for scoring test

Country Status (1)

Country Link
KR (1) KR20150027634A (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111767883A (en) * 2020-07-07 2020-10-13 北京猿力未来科技有限公司 Title correction method and device
CN111767883B (en) * 2020-07-07 2024-04-12 北京猿力未来科技有限公司 Question correction method and device
KR102285665B1 (en) * 2020-11-23 2021-08-04 주식회사 제로원파트너스 A method, system and apparatus for providing education curriculum
KR102396308B1 (en) * 2021-12-07 2022-05-10 조윤정 Method and device for providing grading result information
KR102430505B1 (en) * 2021-12-17 2022-08-08 멘토알고 주식회사 Method for providing user interface for scoring problem and device for performing the same

Similar Documents

Publication Publication Date Title
CN110008933B (en) Universal intelligent marking system and method
KR101648756B1 (en) Examination paper recognition and scoring system
US10713528B2 (en) System for determining alignment of a user-marked document and method thereof
CN109426835B (en) Information processing apparatus, control method of information processing apparatus, and storage medium
KR101663311B1 (en) Studying evaluation service system
CN110956138B (en) Auxiliary learning method based on home education equipment and home education equipment
JP6985856B2 (en) Information processing equipment, control methods and programs for information processing equipment
CN108960149A (en) Paper reads and appraises method, apparatus and electronic equipment automatically
US20150279220A1 (en) Method and system for analyzing exam-taking behavior and improving exam-taking skills
CN109086336A (en) Paper date storage method, device and electronic equipment
KR20150027634A (en) Method and Apparatus for scoring test
CN110879965A (en) Automatic reading and amending method of test paper objective questions, electronic device, equipment and storage medium
KR20180013777A (en) Apparatus and method for analyzing irregular data, a recording medium on which a program / application for implementing the same
JP6217407B2 (en) Information processing system, information processing apparatus, and program
CN110210465A (en) A kind of method and system of data acquisition
JP6915611B2 (en) Information processing equipment, information processing methods and programs
JP2014071489A (en) Grade result acquisition method, program, and device
CN210038810U (en) Intelligent evaluation equipment and system
JP5860488B2 (en) Answer determination system
KR100795951B1 (en) System for grading examination paper and control method
CN115294573A (en) Job correction method, device, equipment and medium
JP7180161B2 (en) Information processing device and program
KR101139765B1 (en) Marking recognition method for omr card using image pattern
CN112184660A (en) Design image evaluation method and device and electronic equipment
JP7342655B2 (en) Information processing device, control method and program

Legal Events

Date Code Title Description
A201 Request for examination
E902 Notification of reason for refusal
E90F Notification of reason for final refusal
E601 Decision to refuse application