WO2020153698A1 - Method and device for selecting an annotator using an association condition - Google Patents

Method and device for selecting an annotator using an association condition

Info

Publication number
WO2020153698A1
Authority
WO
WIPO (PCT)
Prior art keywords
annotator
task
candidate
data annotation
evaluation
Prior art date
Application number
PCT/KR2020/000986
Other languages
English (en)
Korean (ko)
Inventor
박민우
김주영
Original Assignee
주식회사 크라우드웍스
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 주식회사 크라우드웍스
Publication of WO2020153698A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00 Administration; Management
    • G06Q10/06 Resources, workflows, human or project management; Enterprise or organisation planning; Enterprise or organisation modelling
    • G06Q10/063 Operations research, analysis or management
    • G06Q10/0631 Resource planning, allocation, distributing or scheduling for enterprises or organisations
    • G06Q10/06311 Scheduling, planning or task assignment for a person or group
    • G06Q10/063112 Skill-based matching of a person or a group to a task
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00 Administration; Management
    • G06Q10/06 Resources, workflows, human or project management; Enterprise or organisation planning; Enterprise or organisation modelling
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00 Administration; Management
    • G06Q10/10 Office automation; Time management
    • G06Q10/105 Human resources

Definitions

  • The present invention relates to a method and apparatus for selecting an annotator using association conditions.
  • Tutorial questions, such as OX quizzes and sample tests, lead the general public to the correct answer by continuing to provide hints that are not given in actual work whenever an answer is wrong.
  • Because the existing training method does not correspond to actual work, members of the general public who pass the existing training and then participate in actual work cannot produce proper results when processing data.
  • the problem to be solved by the present invention is to provide a method and apparatus for selecting an annotator using associated conditions.
  • A method for selecting an annotator using association conditions according to an aspect of the present invention, performed by a computer, comprises: before transmitting an actual-use data annotation task, transmitting to the terminal of one or more candidate annotators one or more evaluation data annotation tasks, each including a first input item for data annotation and a second input item for one or more association conditions corresponding to the data annotation; receiving, from the terminal of the one or more candidate annotators, the results of the candidate annotators' performance of the evaluation data annotation tasks; evaluating the annotation ability of each candidate annotator using those results; and selecting one or more actual annotators from among the one or more candidate annotators using the evaluation results, wherein the one or more association conditions have different priorities and are given different weights according to those priorities when the annotation ability of the candidate annotators is evaluated.
  • All of the pre-prepared evaluation data annotation tasks are transmitted to the terminal of the candidate annotator, and evaluating the annotation ability of the candidate annotator comprises summing the candidate annotator's scores for the results of all the pre-prepared evaluation data annotation tasks; if the total is less than a predetermined reference score, the candidate annotator's annotation ability is evaluated as unreliable, and if the total is greater than or equal to the predetermined reference score, it is evaluated as reliable.
  • If the score for the result of performing the evaluation data annotation task of a first task step is less than the predetermined reference score, the first task step is failed and the evaluation data annotation task of a second task step, the step following the first task step, is not transmitted; if the score is greater than or equal to the predetermined reference score, the first task step is passed and the evaluation data annotation task of the second task step is transmitted. Evaluating the annotation ability of the candidate annotator comprises evaluating the annotation ability as reliable when the candidate annotator passes the final task step, and as unreliable when the candidate annotator fails at the final task step or at any step before it.
  • The evaluation data annotation task of the first task step includes second input items for K (K being a natural number of 2 or more) association conditions, and the evaluation data annotation task of the second task step, the step following the first task step, includes second input items for the K-1 association conditions that remain after the association condition with the highest priority among the K association conditions is excluded.
  • Selecting one or more actual annotators from the one or more candidate annotators using the evaluation results comprises selecting as actual annotators only the candidate annotators whose annotation ability is evaluated as reliable based on the evaluation results; a candidate annotator whose annotation ability is evaluated as unreliable is not selected as an actual annotator.
  • A method of selecting an annotator using association conditions according to another aspect of the present invention further comprises transmitting one or more actual-use data annotation tasks to the terminal of one or more actual annotators, and receiving from the terminal of the actual annotators the results of their performance of the actual-use data annotation tasks; the actual-use data annotation task includes the first input item for data annotation but does not include the second input item for the one or more association conditions corresponding to the data annotation.
  • The one or more evaluation data annotation tasks are generated based on a portion of a source data set, in a raw state, for which data annotation is requested, and the one or more actual-use data annotation tasks are generated based on a different portion of the source data set.
  • A target reliability is defined for the results of performing the actual-use data annotation task, and the number of the one or more association conditions is determined in correspondence with the target reliability.
  • An apparatus for selecting an annotator using association conditions according to another aspect of the present invention comprises: a transmission unit that transmits one or more evaluation data annotation tasks, each including a first input item for data annotation and a second input item for one or more association conditions corresponding to the data annotation, to the terminals of one or more candidate annotators; a receiving unit that receives the results of the one or more candidate annotators' performance of the evaluation data annotation tasks from their terminals; an evaluation unit that evaluates the annotation ability of the candidate annotators using those results; and a selection unit that selects one or more actual annotators from among the candidate annotators using the evaluation results, wherein the one or more association conditions have different priorities, and different weights according to those priorities are applied when the annotation ability of the candidate annotators is evaluated.
  • Because input items for the association conditions are included in the evaluation data annotation task, the evaluation result makes it possible to select annotators who sufficiently understand the association conditions and can therefore annotate data accurately.
  • An annotator is provided with association conditions corresponding to the characteristics of the evaluation data annotation task to be performed, together with a guide to those association conditions, so that the annotator performs data annotation while grasping the characteristics of the task.
  • FIG. 1 is a schematic flowchart of a method of selecting an annotator using association conditions according to an embodiment of the present invention.
  • FIG. 2 is an exemplary view of a work screen of an evaluation data annotation task including input items for a plurality of association conditions.
  • FIG. 3 is an exemplary view of an execution-result screen of an evaluation data annotation task including input items for a plurality of association conditions.
  • FIG. 4 is a schematic flowchart of a method of selecting an annotator and processing an actual-use data annotation task using association conditions according to another embodiment of the present invention.
  • FIG. 5 is an exemplary diagram schematically illustrating a method of evaluating the annotation ability of a candidate annotator according to an embodiment.
  • FIG. 6 is another exemplary diagram schematically illustrating a method of evaluating the annotation ability of a candidate annotator according to an embodiment.
  • FIG. 7 is an exemplary view of a work screen of an evaluation data annotation task from which an input item for one association condition is excluded.
  • FIG. 8 is an exemplary diagram of an actual-use data annotation task.
  • FIG. 9 is a block diagram of an apparatus for selecting an annotator using association conditions according to another embodiment of the present invention.
  • FIG. 1 is a schematic flowchart of a method of selecting an annotator using association conditions according to an embodiment of the present invention.
  • Referring to FIG. 1, a method of selecting an annotator using association conditions includes transmitting an evaluation data annotation task to the terminal of a candidate annotator (S110), receiving the result of the candidate annotator's performance of the evaluation data annotation task from the terminal (S120), evaluating the annotation ability of the candidate annotator (S130), and selecting actual annotators (S140).
  • In step S110, at least one evaluation data annotation task, which includes a first input item 10 for data annotation and a second input item 20 for one or more association conditions corresponding to the data annotation, is transmitted to the terminal of one or more candidate annotators.
  • the data annotation task for evaluation is provided on a website operated by the administrator of the annotator selection, and a candidate annotator can access the website and perform the data annotation task for evaluation.
  • the administrator of the annotator screening pre-creates correct answer data for the data annotation task for evaluation. Correct answer data is then used to evaluate the results of the performance of the data annotation task for evaluation by candidate annotators.
  • The annotator selection period is determined identically or differently for each actual-use data annotation task described later.
  • The annotator selection period can be predetermined by the administrator. The evaluation data annotation task is provided to candidate annotators only within the annotator selection period, and is not provided after the period ends. A customer requesting data annotation can offer an opinion on this period so that excellent annotators are selected through competition among more candidate annotators.
  • Data annotation refers to the act of entering annotation data in relation to source data.
  • Data annotation refers to the act of inputting annotation data for source data of various types, such as text, image, audio, and video, in a manner corresponding to the administrator's instructions.
  • Data annotation may include, but is not limited to, finding an entity in a given text passage, making a simple voice recording, finding similar sentences, collecting photos of a specific person, and the like.
  • Data annotation can be classified into general data annotation and premium data annotation according to the difficulty.
  • the annotator may perform only general data annotation or both general data annotation and premium data annotation according to the evaluation level of his annotation ability.
  • the first input item 10 includes the problem 11 of data annotation.
  • the problem of data annotation 11 is generated based on a portion of the raw source data set provided by the customer requesting the data annotation.
  • the data annotation problem 11 may be one of the source data sets.
  • The candidate annotator performs the data annotation problem 11 in the first input item 10. For example, when performing an annotation to find an entity in a given text passage, the candidate annotator tags the entity in the passage of the data annotation problem 11 according to a predetermined instruction.
  • The association condition is a condition corresponding to the data annotation. For example, the association condition may correspond to basic knowledge or matters that must be understood to perform the data annotation.
  • An entity refers to a person or thing that can actually be distinguished, or a concept that can be specified. Accordingly, an input for an association condition may include a selection among several representative types of entities, or a selection of the degree to which the entity is a proper noun, based on the characteristic that most entities are proper nouns.
  • One or more association conditions may be disposed in the second input item 20, and the one or more association conditions have different priorities. Association conditions with different priorities have different weights when evaluating the annotation ability of the annotator.
  • association conditions with a relatively high priority have a relatively high weight
  • association conditions with a relatively low priority have a relatively low weight
  • the weight may be a weight for a score for evaluating an annotator.
  • An annotator who wrongly answers an association condition with a relatively high weight receives a relatively low score, while an annotator who wrongly answers an association condition with a relatively low weight receives a relatively high score.
  • An association condition with a higher priority may be a condition that a candidate annotator can more easily understand; therefore, a candidate annotator who misses a high-priority association condition receives a low evaluation score.
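As an illustrative sketch of the priority-weighted scoring described above (not part of the disclosure; the weight values, condition names, and function names are assumptions):

```python
# Sketch of priority-weighted scoring for association conditions.
# Weights, condition names, and data layout are illustrative assumptions.

def score_task(answers, correct, weights):
    """Score one evaluation task: deduct the condition's weight for each
    wrongly answered association condition. Higher-priority conditions
    carry higher weights, so missing them costs more."""
    total = sum(weights.values())
    penalty = sum(weights[c] for c in correct if answers.get(c) != correct[c])
    return max(total - penalty, 0)

# Assume the entity-type condition has higher priority (weight 3) than
# the proper-noun-degree condition (weight 1).
weights = {"entity_type": 3, "proper_noun_degree": 1}
correct = {"entity_type": "institution", "proper_noun_degree": 2}

# Annotator A misses the high-priority condition, B the low-priority one.
a = score_task({"entity_type": "person", "proper_noun_degree": 2}, correct, weights)
b = score_task({"entity_type": "institution", "proper_noun_degree": 3}, correct, weights)
print(a, b)  # 1 3 -- the high-priority mistake is penalized more
```

The asymmetry of the two scores reflects the rule that an annotator who misses a high-priority association condition is evaluated lower than one who misses a low-priority condition.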
  • The number of association conditions is determined in correspondence with a target reliability for the results of performing the actual-use data annotation task described later.
  • For example, the customer who requested the data annotation can request from the administrator a target reliability for the results of the actual-use data annotation task.
  • The higher the target reliability, the larger the number of association conditions may be, or the more difficult the association conditions may be.
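A minimal sketch of determining the number of association conditions from the target reliability (the thresholds and counts are illustrative assumptions, not values from the disclosure):

```python
# Illustrative mapping from a customer's target reliability to the number
# of association conditions; thresholds are assumptions for demonstration.

def num_association_conditions(target_reliability: float) -> int:
    """Higher target reliability -> more association conditions."""
    if target_reliability >= 0.99:
        return 4
    if target_reliability >= 0.95:
        return 3
    return 2

print(num_association_conditions(0.99))  # 4
print(num_association_conditions(0.90))  # 2
```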
  • the association condition is placed on the second input item 20.
  • the candidate annotator performs a task corresponding to the association condition on the second input item 20. For example, the candidate annotator performs a task of selecting one of selection items on the association condition of the second input item 20.
  • the data annotation task for evaluation is a data annotation task for evaluating the annotation ability of a candidate annotator.
  • the evaluation data annotation task includes input items related to one or more association conditions, and is used to evaluate the annotation ability of the annotator by reflecting the weight of the association conditions.
  • A raw state means a state before data annotation is performed.
  • The evaluation data annotation task is generated based on a portion of the source data set, in its raw state, for which data annotation is requested.
  • The actual-use data annotation task described later is generated based on another portion of the source data set that is not used to generate the evaluation data annotation task. That is, the administrator of the annotator selection directly performs data annotation in advance on a portion of the raw-state source data received from the customer.
  • Based on the result of that work, the administrator prepares the evaluation data annotation task and its correct answer data in advance.
  • A candidate annotator is an annotator whose annotation ability has not yet been verified and who performs the evaluation data annotation task.
  • step S120 results of performing the data annotation task for evaluation by one or more candidate annotators are received from terminals of the one or more candidate annotators.
  • One or more candidate annotators may perform a data annotation task for evaluation on a website operated by an administrator and store the results of the execution.
  • step S130 the annotation ability of the candidate annotator is evaluated using the result of performing the evaluation data annotation task.
  • the ability of the candidate annotator can be evaluated as reliable or unreliable, depending on the result of performing the data annotation task for evaluation.
  • step S140 one or more actual annotators are selected from the one or more candidate annotators using the evaluation result of the annotation ability of the candidate annotators.
  • a candidate annotator whose annotation ability is evaluated as reliable according to the evaluation result of the annotation ability of the candidate annotator is selected as an actual annotator.
  • the actual annotator may perform a dedicated data annotation task described later.
  • a candidate annotator whose annotation ability is evaluated to be unreliable according to the evaluation result of the annotation ability of the candidate annotator is not selected as an actual annotator.
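Steps S130 and S140 above can be sketched as a simple filter over the evaluation results (a minimal illustration; the identifiers and data layout are assumptions, not from the disclosure):

```python
# Sketch of step S140: keep only candidates whose annotation ability
# was evaluated as reliable in step S130. Names are illustrative.

def select_actual_annotators(evaluations: dict) -> list:
    """evaluations maps candidate id -> True (reliable) / False (unreliable).
    Returns the ids selected as actual annotators."""
    return [cid for cid, reliable in evaluations.items() if reliable]

print(select_actual_annotators({"c1": True, "c2": False, "c3": True}))
# ['c1', 'c3'] -- the unreliable candidate 'c2' is not selected
```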
  • FIG. 2 is an exemplary view of an operation screen of an evaluation data annotation task including input items related to a plurality of association conditions.
  • the data annotation task for evaluation includes a first input item 10 for one data annotation and a second input item 20 for one or more associated conditions.
  • the first input item 10 includes a data annotation problem 11 and a highlight keyword 12.
  • A text passage in which an entity is to be found is placed in the data annotation problem 11.
  • The candidate annotator performs entity-search data annotation by tagging one word in the passage of the data annotation problem 11.
  • The candidate annotator can confirm whether the word he or she is trying to tag has been tagged properly by checking the word recorded in the highlight keyword 12. Alternatively, the candidate annotator can directly enter the word to be tagged in the highlight keyword 12 instead of tagging the word in the data annotation problem 11.
  • The second input item 20 includes an association condition 21 for making a selection among several representative types of entities and an association condition 22 for making a selection regarding the degree of proper noun.
  • The administrator provides candidate annotators with a guide to each association condition. For example, the administrator provides a description that defines the type of entity for each category, such as person, discipline, theory, artifact, and institution.
  • For the association condition 22, the administrator provides data explaining that selecting 3 stars means the entity is a complete proper noun, selecting 2 stars means it is a combination of a common noun and a proper noun, and selecting 1 star means it is not an entity by itself but can be judged to be an entity in context.
  • association conditions have different priorities. Association conditions with different priorities have different weights when evaluating the annotation ability of the annotator.
  • The association condition 21 for selecting a representative type of entity has a higher priority than the association condition 22 for selecting a degree of proper noun. Therefore, when a candidate annotator wrongly enters the association condition 21 for selecting a representative type of entity, the annotation ability can be evaluated relatively lower than when the candidate annotator wrongly enters the association condition 22 for selecting a degree of proper noun.
  • The candidate annotator can complete the entity-search operation for one entity by clicking the “Add” button, and the completed entity can be displayed on the evaluation data annotation task.
  • the candidate annotator can complete the execution of a data annotation task for evaluation by clicking the “Save and Next” button.
  • a candidate annotator who has completed the execution of one evaluation data annotation task may perform the next evaluation data annotation task.
  • FIG. 3 is an exemplary view of an execution result screen of an evaluation data annotation task including input items related to a plurality of association conditions.
  • Referring to FIG. 3, an example of an execution-result screen of a data annotation task for finding an entity in a given text passage will be used to describe the execution result of an evaluation data annotation task.
  • As a result of performing the evaluation data annotation task, an entity in the first input item 10 is tagged and an association condition in the second input item 20 is selected.
  • The candidate annotator tags “map the soul”, an entity in the passage of the data annotation problem 11.
  • The tagged “map the soul” is recorded as-is in the highlight keyword 12.
  • The candidate annotator can determine that “map the soul” is an entity through common-sense knowledge of it, or can determine that it is an entity from the context of the passage.
  • Having determined that “map the soul” is an entity corresponding to a specific institution, the candidate annotator may select “institution” on the association condition 21 to make the selection regarding the representative type of entity in the second input item 20.
  • Having determined that “map the soul” is a combination of a common noun and a proper noun, the candidate annotator selects two stars on the association condition 22 to select the degree of proper noun in the second input item 20.
  • FIG. 4 is a schematic flowchart of a method of selecting an annotator and processing an actual-use data annotation task using association conditions according to another embodiment of the present invention.
  • Referring to FIG. 4, the method further includes transmitting the actual-use data annotation task to the terminal of the actual annotator (S150) and receiving the result of performing the actual-use data annotation task from the terminal of the actual annotator (S160).
  • In step S150, one or more actual-use data annotation tasks are transmitted to the terminals of the one or more actual annotators.
  • The actual-use data annotation task is provided on the website operated by the administrator, and the actual annotator may access the website and perform the task.
  • The one or more actual-use data annotation tasks are generated based on a portion of the source data set that is not used to generate the evaluation data annotation tasks.
  • The actual-use data annotation task includes the first input item 10 for data annotation and does not include the second input item 20 for the one or more association conditions corresponding to the data annotation.
  • The configuration of the actual-use data annotation task is illustrated in FIG. 8.
  • In step S160, the results of the one or more actual annotators' performance of the actual-use data annotation tasks are received from the terminals of the actual annotators.
  • The one or more actual annotators may perform the actual-use data annotation task on the website operated by the administrator and store the results of the performance.
  • FIG. 5 is an exemplary diagram schematically illustrating a method of evaluating an annotation ability of a candidate annotator according to an embodiment.
  • a candidate annotator's annotation ability is evaluated by summing the scores for the results of all evaluation data annotation tasks.
  • All of the pre-prepared evaluation data annotation tasks are sequentially transmitted to the terminals of the candidate annotators, and the results of the candidate annotators' performance of each evaluation data annotation task are sequentially received.
  • For example, n (n being a natural number of 1 or more) evaluation data annotation tasks are sequentially transmitted to the terminal of a candidate annotator, and the result of performing each task is received.
  • The score of each evaluation data annotation task is calculated in correspondence with the association conditions arranged in its second input item 20. For example, when a high-priority association condition is answered wrongly, the score is calculated to be relatively lower than when a low-priority association condition is answered wrongly.
  • If the sum of the scores for the results of all the evaluation data annotation tasks is less than a predetermined reference score, the annotator's annotation ability is evaluated as unreliable; if the sum is greater than or equal to the predetermined reference score, it is evaluated as reliable.
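The sum-and-threshold evaluation of FIG. 5 can be sketched as follows (an illustration only; the scores and reference value are assumptions, not from the disclosure):

```python
# Sketch of the FIG. 5 evaluation method: sum the scores of all n
# evaluation tasks and compare the total against a predetermined
# reference score. All numeric values are illustrative.

def evaluate_by_total(scores, reference_score):
    """Return True ('reliable') iff the summed score meets the reference."""
    return sum(scores) >= reference_score

print(evaluate_by_total([8, 9, 7], 20))  # True  (24 >= 20)
print(evaluate_by_total([5, 6, 4], 20))  # False (15 < 20)
```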
  • FIG. 6 is another exemplary diagram schematically illustrating a method of evaluating an annotation ability of a candidate annotator according to an embodiment.
  • One or more evaluation data annotation tasks are divided into multiple task steps.
  • The evaluation data annotation tasks divided into the plurality of task steps may include different numbers of association conditions.
  • For example, the candidate annotator may perform evaluation data annotation tasks in which the number of association conditions decreases as the task step increases; association conditions are excluded in order of priority, highest first, as the task steps progress.
  • The evaluation data annotation task of the first task step includes second input items 20 for K (K being a natural number of 2 or more) association conditions.
  • the K association conditions have different priorities and different weights depending on the priority. Therefore, the association condition having the highest priority has a higher weight than other association conditions.
  • An association condition having a higher priority corresponds to an association condition that a candidate annotator is relatively easier to select than other association conditions.
  • The evaluation data annotation task of the second task step, the step following the first task step, includes second input items 20 for the K-1 association conditions that remain after the association condition with the highest priority among the K association conditions is excluded. Thus, as the task step increases, the association condition with the highest priority at that step is excluded.
  • In each subsequent step, the second input item 20 is configured in the same manner, by excluding the association condition that had the highest priority in the previous step; as a result, the final task step is composed of the association conditions with the relatively lowest priority. Therefore, in the final task step, the evaluation data annotation task composed of the association conditions most difficult for the annotator to answer is provided to the annotator.
  • The candidate annotator passes or fails each of the plurality of task steps. If the score for the result of performing the evaluation data annotation task of the first task step is less than a predetermined reference score, that task step is failed and the evaluation data annotation task of the second task step, the next step, is not transmitted. On the other hand, if the score is greater than or equal to the predetermined reference score, that task step is passed and the evaluation data annotation task of the second task step is transmitted.
  • When the candidate annotator passes the final task step, the annotation ability of the candidate annotator is evaluated as reliable.
  • If the candidate annotator fails at the final task step or at a step before it, the annotation ability of the candidate annotator is evaluated as unreliable.
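The stage-gated evaluation of FIG. 6 can be sketched as follows (a minimal illustration; the condition names, the scoring callback, and the reference score are assumptions, not from the disclosure):

```python
# Sketch of the FIG. 6 evaluation method: at each task step the
# highest-priority remaining association condition is dropped, and a
# candidate who scores below the reference at any step stops there.

def evaluate_by_steps(conditions, score_step, reference_score):
    """conditions: condition names sorted by descending priority.
    score_step(conds): candidate's score for a step using exactly `conds`.
    Returns True only if every step, down to the final step that uses
    the lowest-priority condition, is passed."""
    remaining = list(conditions)
    while remaining:
        if score_step(remaining) < reference_score:
            return False           # failed at or before the final step
        remaining = remaining[1:]  # drop the highest-priority condition
    return True                    # passed the final task step

conds = ["entity_type", "proper_noun_degree"]  # descending priority
always_good = lambda c: 10
print(evaluate_by_steps(conds, always_good, 7))  # True
```

Note that the loop mirrors the claim: failing any step short-circuits the evaluation as unreliable, and only passing the final (hardest) step yields a reliable evaluation.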
  • The administrator selects one of the method of FIG. 5 and the method of FIG. 6 described above to evaluate the annotation ability of the candidate annotators.
  • The choice between the method of FIG. 5 and the method of FIG. 6 may be determined according to the amount of the source data set provided by the customer who requested the data annotation. For example, when the amount of the source data set is less than a predetermined criterion, the annotation ability of the candidate annotators is evaluated using the method of FIG. 5, and when the amount is greater than or equal to the predetermined criterion, it is evaluated using the method of FIG. 6.
  • the administrator When the amount of the source data set is less than a predetermined criterion, the administrator is forced to make the amount of data annotation tasks for evaluation relatively small. On the other hand, when the amount of the source data set is greater than or equal to a predetermined criterion, the administrator can make a relatively large amount of data annotation task for evaluation. If the amount of data annotation task for evaluation is small, it is difficult to configure the steps of evaluating the annotation ability in multiple steps, so the administrator performs candidates for all evaluation data annotation tasks, and then sums the scores for the results. Annotators can be evaluated.
  • the manager determines whether to pass the candidate annotator for several steps, and the candidate annotation that has passed the final step Data can be evaluated.
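A minimal sketch of the two evaluation modes just described, assuming a hypothetical size criterion and pass score (neither number appears in the source):

```python
SIZE_CRITERION = 10_000  # assumed threshold on the source data set size
REFERENCE_SCORE = 80     # assumed pass score per evaluation task

def evaluate_by_mode(source_data_size, step_scores):
    """Return True if the candidate annotator is judged reliable."""
    if source_data_size < SIZE_CRITERION:
        # FIG. 5 style: few evaluation tasks, so run them all and sum the scores.
        return sum(step_scores) >= REFERENCE_SCORE * len(step_scores)
    # FIG. 6 style: enough tasks to gate step by step; every step must pass.
    return all(score >= REFERENCE_SCORE for score in step_scores)

print(evaluate_by_mode(5_000, [70, 95]))   # True: summed 165 >= 160
print(evaluate_by_mode(50_000, [70, 95]))  # False: the first step fails the gate
```

The same scores can thus pass in the summed mode yet fail in the gated mode, which is why the choice of mode matters.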
  • FIG. 7 is an exemplary view of a work screen of an evaluation data annotation task from which the input item for one association condition has been excluded.
  • Referring to FIG. 7, a work screen of an evaluation data annotation task for finding an object in a given text passage is used as an example to describe a work screen from which one association condition has been excluded.
  • FIG. 7 illustrates a work screen of an evaluation data annotation task in which one of the association conditions of the second input item 20 has been excluded.
  • Specifically, FIG. 7 shows the work screen of the evaluation data annotation task performed in the second task step, from which one association condition has been excluded.
  • The evaluation data annotation task of the second task step may exclude some of the association conditions of the second input item 20. Since the association condition 21 for selecting the representative type of the object has a higher priority than the association condition 22 for selecting the degree of proper noun, the evaluation data annotation task of the second task step excludes the association condition 21 for selecting the representative type of the object, and the second input item 20 is configured only with the association condition 22 for selecting the degree of proper noun.
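The priority-based exclusion in the FIG. 7 example can be sketched as below; the priority numbering and the helper name `conditions_for_step` are assumptions for illustration.

```python
# Association conditions of the second input item 20 (priority 1 = highest).
CONDITIONS = [
    {"id": 21, "name": "representative type of object", "priority": 1},
    {"id": 22, "name": "degree of proper noun", "priority": 2},
]

def conditions_for_step(step, conditions=CONDITIONS):
    """Step 1 shows every association condition; each later step drops one
    more condition, highest priority first, as in the FIG. 7 example where
    condition 21 is excluded at the second task step."""
    ordered = sorted(conditions, key=lambda c: c["priority"])
    return ordered[step - 1:]

print([c["id"] for c in conditions_for_step(1)])  # [21, 22]
print([c["id"] for c in conditions_for_step(2)])  # [22]
```

Dropping the highest-priority hint first makes each successive step harder, so passing a later step is stronger evidence of annotation ability.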
  • FIG. 8 is an exemplary diagram of a work screen of an actual data annotation task.
  • The actual data annotation task includes the first input item 10 and does not include the second input item 20.
  • The actual data annotation task is composed only of the first input item 10, without the second input item 20. Since the actual annotator is skilled in the corresponding data annotation, the actual annotator can recognize what the object is and perform the object search simply by looking at the text passage of the first input item 10, even without the association conditions of the second input item 20.
  • The actual annotator performs the actual data annotation task of tagging the object in the problem 11 and the highlighted keyword 12 of the first input item 10, in the same manner as in the evaluation data annotation task.
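The difference between the two task kinds (evaluation vs. actual) can be captured in simple data structures; every field name here is an illustrative assumption, not a term defined by the patent.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class FirstInputItem:
    """Input item 10: what every annotator sees."""
    passage: str            # the text passage to search
    problem: str            # problem 11
    highlight_keyword: str  # highlighted keyword 12

@dataclass
class SecondInputItem:
    """Input item 20: association conditions, shown only during evaluation."""
    association_conditions: List[str]  # e.g. conditions 21 and 22

@dataclass
class EvaluationTask:
    """Sent to candidate annotators; conditions may be partly excluded per step."""
    first: FirstInputItem
    second: SecondInputItem

@dataclass
class ActualTask:
    """Sent to actual annotators; there is no second input item at all."""
    first: FirstInputItem

first = FirstInputItem("<passage text>", "Find the object.", "<keyword>")
eval_task = EvaluationTask(first, SecondInputItem(
    ["representative type of object", "degree of proper noun"]))
actual_task = ActualTask(first)
print(hasattr(actual_task, "second"))  # False: the actual task omits item 20
```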
  • FIG. 9 is a block diagram of an apparatus for selecting an annotator using associated conditions according to another embodiment of the present invention.
  • An apparatus 200 for selecting an annotator using association conditions includes a transmitter 210, a receiver 220, an evaluation unit 230, and a selection unit 240.
  • A plurality of workers 300 perform data annotation by communicating over wired/wireless communication with the apparatus for selecting an annotator using association conditions.
  • The components of the apparatus 200 for selecting an annotator using association conditions correspond to the respective steps of the method described with reference to FIGS. 1 to 8.
  • Redundant descriptions of the functions or operations of each component of the apparatus are therefore omitted.
  • The transmitter 210 transmits, to the terminals of one or more candidate annotators, one or more evaluation data annotation tasks, each including a first input item 10 for data annotation and a second input item 20 for one or more association conditions corresponding to the data annotation.
  • The transmitter 210 also transmits one or more actual data annotation tasks to the terminals of the one or more actual annotators.
  • The receiver 220 receives, from the terminals of the one or more candidate annotators, the results of the evaluation data annotation tasks performed by the candidate annotators.
  • The receiver 220 also receives, from the terminals of the one or more actual annotators, the results of the actual data annotation tasks performed by the actual annotators.
  • The evaluation unit 230 evaluates the annotation ability of each candidate annotator using the results of the evaluation data annotation tasks.
  • The selection unit 240 selects one or more actual annotators from among the one or more candidate annotators using the evaluation results of the candidate annotators' annotation ability.
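How the four components might cooperate can be sketched as one class with one method per component; the class name, method names, and the averaging/threshold logic are all assumptions for illustration, not the patent's specification.

```python
class AnnotatorSelectionApparatus:
    """Toy stand-in for apparatus 200 and its four components."""

    def __init__(self, reference_score=80):
        self.reference_score = reference_score  # assumed pass threshold

    def transmit(self, candidates, tasks):
        """Transmitter 210: send the evaluation tasks to each candidate."""
        return {candidate: list(tasks) for candidate in candidates}

    def receive(self, performed):
        """Receiver 220: collect per-candidate results (scores here)."""
        return performed

    def evaluate(self, results):
        """Evaluation unit 230: score each candidate's annotation ability."""
        return {c: sum(s) / len(s) for c, s in results.items()}

    def select(self, abilities):
        """Selection unit 240: keep candidates at or above the threshold."""
        return [c for c, score in abilities.items() if score >= self.reference_score]

apparatus = AnnotatorSelectionApparatus()
apparatus.transmit(["alice", "bob"], ["task 1", "task 2"])
results = apparatus.receive({"alice": [90, 85], "bob": [60, 95]})
actual_annotators = apparatus.select(apparatus.evaluate(results))
print(actual_annotators)  # ['alice']: bob's average of 77.5 is below 80
```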
  • the steps of a method or algorithm described in connection with an embodiment of the present invention may be implemented directly in hardware, a software module executed by hardware, or a combination thereof.
  • The software module may reside in random access memory (RAM), read-only memory (ROM), erasable programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), flash memory, a hard disk, a removable disk, a CD-ROM, or any other type of computer-readable recording medium well known in the art.

Landscapes

  • Business, Economics & Management (AREA)
  • Human Resources & Organizations (AREA)
  • Engineering & Computer Science (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Strategic Management (AREA)
  • Economics (AREA)
  • Operations Research (AREA)
  • Marketing (AREA)
  • Quality & Reliability (AREA)
  • Tourism & Hospitality (AREA)
  • Physics & Mathematics (AREA)
  • General Business, Economics & Management (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Educational Administration (AREA)
  • Development Economics (AREA)
  • Game Theory and Decision Science (AREA)
  • Data Mining & Analysis (AREA)
  • Document Processing Apparatus (AREA)

Abstract

The present invention relates to a method and device for selecting an annotator using an association condition. The method comprises the steps of: transmitting one or more evaluation data annotation tasks to the terminals of one or more candidate annotators before transmitting an actual data annotation task, each of the evaluation data annotation tasks including a first input item relating to a data annotation and a second input item relating to one or more association conditions corresponding to the data annotation; receiving, from the terminals of the one or more candidate annotators, the results of the evaluation data annotation tasks performed by the one or more candidate annotators; evaluating the annotation ability of each of the candidate annotators using the results of the evaluation data annotation tasks; and selecting one or more actual annotators from among the one or more candidate annotators using the evaluation results.
PCT/KR2020/000986 2019-01-24 2020-01-21 Method and device for selecting an annotator using an association condition WO2020153698A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1020190009278A KR102138573B1 (ko) 2019-01-24 2019-01-24 Method and device for selecting an annotator using association conditions
KR10-2019-0009278 2019-01-24

Publications (1)

Publication Number Publication Date
WO2020153698A1 true WO2020153698A1 (fr) 2020-07-30

Family

ID=71735385

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2020/000986 WO2020153698A1 (fr) 2019-01-24 2020-01-21 Procédé et dispositif pour sélectionner un annotateur en utilisant une condition d'association

Country Status (2)

Country Link
KR (1) KR102138573B1 (fr)
WO (1) WO2020153698A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114446431A (zh) * 2022-01-30 2022-05-06 中国医学科学院医学信息研究所 Method, device, and electronic equipment for selecting annotators for specialized data

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20050050627A (ko) * 2005-05-09 2005-05-31 주식회사 이디스넷 통신망을 통한 능력평가 시스템 및 그 방법
JP2007241910A (ja) * 2006-03-13 2007-09-20 National Institute Of Information & Communication Technology 機械翻訳評価装置及び方法
KR20140066921A (ko) * 2012-11-23 2014-06-03 삼성전자주식회사 번역 평가 장치 및 번역 평가 방법
JP2015200985A (ja) * 2014-04-04 2015-11-12 Kddi株式会社 クラウドソーシングにおける作業者のスキルを評価するスキル評価装置、プログラム及び方法
KR101811211B1 (ko) * 2016-12-30 2017-12-21 (주)씽크포비엘 빅데이터 기반의 사용성 테스트 방법 및 장치

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20140095956A (ko) 2013-01-25 2014-08-04 한국전자통신연구원 크라우드 소싱기반 영상 지식 콘텐츠 생성 시스템 및 방법


Also Published As

Publication number Publication date
KR102138573B1 (ko) 2020-07-28

Similar Documents

Publication Publication Date Title
WO2017146344A1 Method, apparatus, and computer program for providing personalized educational content
KR101671693B1 (ko) Customized learning service method through reprocessing of incorrect-answer and question analysis information
WO2020116914A1 Method and device for evaluating web accessibility and openness
WO2017026850A1 Method for outputting a customized UI/UX for elderly people through evaluation of physical and cognitive abilities
WO2018212396A1 Method, device, and computer program for analyzing data
WO2019093675A1 Data fusion device and method for big data analysis
WO2021112463A1 Apparatus and method for providing company information
WO2021149913A1 Method and device for selecting a disease-related gene in NGS analysis
WO2020004749A1 Apparatus and method for enabling equipment to learn using a video file
WO2023153863A1 Online testing and evaluation system
WO2020153698A1 Method and device for selecting an annotator using an association condition
WO2011074714A1 Method for a personalized intelligent learning service
WO2012165761A2 Collective intelligence service system and method therefor
WO2020242108A1 Method for selecting workers according to the characteristics of a crowdsourcing-based project
WO2021133076A1 Method and device for managing the work unit price of a crowdsourcing-based project for generating artificial intelligence training data
WO2024101754A1 System for providing an AI-based mathematics tutoring service capable of automatic topic and difficulty classification and re-editing of mathematics questions, and method for applying same
WO2020149541A1 Method and device for automatically generating a question-answer dataset for a specific topic
WO2022245009A1 Metacognitive ability evaluation method and evaluation system therefor
WO2022215773A1 AI learning device and method for providing an AI learning plan
WO2022050624A1 Gut microbiome analysis and evaluation system and evaluation method therefor
WO2021112308A1 Method and server for providing a job interview service
KR20150031521A (ko) Method for providing related questions associated with incorrect answers
WO2021112527A1 Apparatus and method for determining entrepreneur type
WO2021049700A1 Application and server for managing service personnel
WO2020096135A1 Method and system for optimizing a business creation process according to business types

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20744932

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 15/11/2021)

122 Ep: pct application non-entry in european phase

Ref document number: 20744932

Country of ref document: EP

Kind code of ref document: A1