WO2023118669A1 - User-specific quizzes based on digital learning material - Google Patents


Info

Publication number
WO2023118669A1
WO2023118669A1 (application PCT/FI2022/050869)
Authority
WO
WIPO (PCT)
Prior art keywords
user
specific
processor
answers
quiz
Prior art date
Application number
PCT/FI2022/050869
Other languages
French (fr)
Inventor
Christopher PETRIE
Masnad NEHITH
Janne Jormalainen
Original Assignee
New Nordic School Oy
Priority date
Filing date
Publication date
Application filed by New Nordic School Oy filed Critical New Nordic School Oy
Publication of WO2023118669A1 publication Critical patent/WO2023118669A1/en

Classifications

    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B7/00 Electrically-operated teaching apparatus or devices working with questions and answers
    • G09B7/02 Electrically-operated teaching apparatus or devices working with questions and answers of the type wherein the student is expected to construct an answer to the question which is presented or wherein the machine gives an answer to the question presented by a student
    • G09B7/04 Electrically-operated teaching apparatus or devices working with questions and answers of the type wherein the student is expected to construct an answer to the question which is presented or wherein the machine gives an answer to the question presented by a student characterised by modifying the teaching programme in response to a wrong answer, e.g. repeating the question, supplying a further explanation

Definitions

  • the present invention relates to user-specific quizzes based on digital learning material.
  • a computer program comprising instructions which, when the program is executed by a processor, cause the processor to perform at least the method according to the first aspect.
  • a non-transitory computer readable medium comprising program instructions for causing an apparatus, arrangement or a smart learning system, to perform at least the method according to the first aspect.
  • Fig. 1 , Fig. 2 and Fig. 3 illustrate examples of methods according to at least some embodiments
  • Fig. 4 illustrates an example of labeled quiz-data in accordance with at least some examples of the present invention
  • Fig. 5 illustrates an example of user interface for generating a user-specific quiz based on key concepts in accordance with at least some examples of the present invention
  • Fig. 6 illustrates an example of user interface for generating weighted answers to questions for a user-specific quiz in accordance with at least some examples of the present invention
  • Fig. 7 illustrates an example of user interface for setting quiz conditions for a user-specific quiz in accordance with at least some examples of the present invention
  • Fig. 8 illustrates an example of user interface for setting recommendations based on results of user-specific quizzes in accordance with at least some examples of the present invention
  • Fig. 9 illustrates an example of user interface for a user for taking a user-specific quiz in accordance with at least some examples of the present invention.
  • Fig. 10 illustrates an example of a system in accordance with at least some examples of the present invention.
  • a quiz, or a user-specific quiz, referred to herein may be a computerized quiz comprising questions and associated weighted answers.
  • the user-specific quiz may be provided by computer program code that is executed by at least one processor.
  • the execution of the computer program code causes providing the user-specific quiz on a user interface of a user device.
  • the user interface of the user device is controlled to provide the quiz.
  • the computer program code may be executed in a cloud environment that is connected over a data communications connection with the user device, whereby the user interface may be provided based on controlling the user interface of the user device over a control connection over the data communications connection.
  • the cloud environment may host a machine learning component.
  • the computer program code is configured to input data and/or process data output by the machine learning component for providing the user-specific quiz.
  • the user-specific quiz may be provided based on a skill level of the user of the user device.
  • the skill level of the user may initially be set to a default value for a given topic; after the user has taken at least one user-specific quiz for that topic, the skill level may be re-evaluated and the skill level of the user may be updated.
  • the machine learning component may check user inputs of the user to the user-specific quiz, adapt, based on the received user-specific performance data, the distribution of weighted answers of the user-specific quiz corresponding to the user, and determine, based on the adapted distribution of weighted answers, one or more subsequent questions and associated weighted answers of the user-specific quiz for the user.
  • the input data to the machine learning component may comprise user inputs of the user to the user-specific quiz, user-specific performance data and the distribution of weighted answers of the user-specific quiz corresponding to the user.
  • the output data of the machine learning component may comprise an adapted distribution of weighted answers, and one or more subsequent questions and associated weighted answers of the user-specific quiz for the user.
  • the computer program code and the machine learning component may serve a plurality of users and provide the user-specific quizzes on the user devices of the users.
  • Fig. 1 illustrates an example of a method according to at least some embodiments.
  • Phase 102 comprises generating, by at least one processor connected to a memory storing computer program code and operatively connected to one or more user devices, based on at least one digital learning material, user-specific quizzes comprising questions and associated weighted answers selectable via user interfaces of the one or more user devices.
  • Phase 102 comprises controlling, by the at least one processor, each of the one or more user devices to provide, on a user interface of each user device, a user-specific quiz of the generated user-specific quizzes corresponding to a user of the user device.
  • Phase 104 comprises receiving, by the at least one processor, from each of the one or more user devices user-specific performance data of the user of the user device answering at least one question of the user-specific quiz corresponding to the user, wherein the user-specific performance data comprises at least one of: a time duration for the user answering the at least one question, at least one weighted answer selected by the user and monitoring data of the user answering the at least one question.
  • Phase 106 comprises adapting, by the at least one processor, based on the received user-specific performance data, distributions of weighted answers of the user-specific quizzes corresponding to each user.
  • Phase 108 comprises controlling, by the at least one processor, each of the one or more user devices to provide, on the user interface of the user device, at least one subsequent question and associated weighted answers of the user-specific quiz corresponding to the user based on the adapted distribution of weighted answers of the user-specific quiz corresponding to the user.
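Phases 102-108 form a feedback loop: generate a quiz, present a question, measure performance, adapt the answer distribution, present the next question. A minimal sketch of that loop in Python (all function names and the adaptation rule are illustrative assumptions, not taken from the application):

```python
# Illustrative sketch of the quiz loop in phases 102-108; names are hypothetical.

def run_quiz(questions, initial_distribution, get_performance):
    """Present questions one by one, adapting the weighted-answer
    distribution after each response (phases 104-108)."""
    distribution = dict(initial_distribution)
    results = []
    for question in questions:
        performance = get_performance(question, distribution)  # phase 104
        results.append(performance)
        distribution = adapt_distribution(distribution, performance)  # phase 106
    return results, distribution

def adapt_distribution(distribution, performance):
    """Phase 106 (hypothetical rule): shift weight toward harder
    distractors when the user answers quickly and correctly, and
    toward easier ones otherwise."""
    adapted = dict(distribution)
    if performance["correct"] and performance["duration_s"] < 30:
        adapted["nearly correct"] = adapted.get("nearly correct", 0) + 1
    else:
        adapted["extremely incorrect"] = adapted.get("extremely incorrect", 0) + 1
    return adapted
```

The `get_performance` callback stands in for the user device returning the selected answer, the time duration and monitoring data described in phase 104.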
  • the at least one digital learning material comprises textual digital learning material.
  • the textual digital learning material may include human-readable text in a machine-readable file format. Examples of file formats that support structured text comprise XML (Extensible Markup Language), HTML (Hypertext Markup Language), PDF (Portable Document Format) and DOCX. Non-structured text, e.g. ASCII text, may also be used for the digital learning material.
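As an illustration, human-readable text can be pulled out of an HTML learning material with Python's standard library alone; the class and function names here are illustrative, not part of the application:

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collect the human-readable text of an HTML learning material,
    skipping script and style content."""
    def __init__(self):
        super().__init__()
        self.parts = []
        self._skip = 0

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip:
            self._skip -= 1

    def handle_data(self, data):
        # Keep only non-empty text outside script/style elements.
        if not self._skip and data.strip():
            self.parts.append(data.strip())

def extract_text(html: str) -> str:
    parser = TextExtractor()
    parser.feed(html)
    return " ".join(parser.parts)
```

Structured formats such as PDF or DOCX would need third-party parsers, but the principle of reducing the material to plain text before topic derivation is the same.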
  • the method may be performed by a system described in an embodiment.
  • the system may be connected to one or more user devices for providing the user-specific quizzes on user interfaces of the user devices.
  • phase 102 comprises generating, by the at least one processor, the user-specific quizzes based on a machine learning component trained based on labeled questions and answers associated with at least one topic of the at least one digital learning material.
  • the machine learning component may be connected to a system described in an embodiment, whereby the user-specific quizzes may be provided.
  • phase 102 comprises generating, by the at least one processor, the at least one subsequent question and associated weighted answers of the user-specific quiz corresponding to the user by feeding the user-specific performance data to the machine learning component trained based on labeled questions and weighted answers associated with at least one topic of the at least one digital learning material.
  • difficulty levels of the questions, distributions of weighted answers and numbers of weighted answers of the user-specific quizzes are determined by the machine learning component.
  • the distributions of weighted answers and numbers of weighted answers may be different for each skill level. For example, if a question is presented to a novice, the distribution of weighted answers for the question and the number of weighted answers may be different than if the question is presented to an expert.
  • the labeling for each question and associated weighted answers comprises at least part of information indicating performance of a user on answering the question, a grade of the question, a difficulty level of the question and a distribution of weighted answers.
  • the labeling may be performed by a user on a user interface provided on a user device.
  • phase 102 comprises deriving, by the at least one processor, the at least one topic from the at least one digital learning material, and generating, by the at least one processor, the questions and associated weighted answers based on the derived at least one topic.
  • phase 108 comprises generating, by the at least one processor, a grade, or skill level, for the user, after a completion of the user-specific quiz corresponding to the user on the user device.
  • the generated skill level may be used by the system for updating a backend system.
  • the backend system may maintain information of skill levels of users and the backend system may provide an initial skill level for generating a user-specific quiz for a given user in phase 102.
  • the weighted answers comprise one or more answers in each of two or more categories for possible answers, wherein the two or more categories comprise a correct answer and at least one of a nearly correct answer, a somewhat incorrect answer, a mostly incorrect answer and extremely incorrect answer.
  • each of the distributions of weighted answers of the user-specific quizzes corresponding to each user comprise a configuration of the two or more categories and a number of answers in each of the two or more categories.
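The categories and per-category counts described above can be modelled with a small data structure; this sketch uses hypothetical names:

```python
from dataclasses import dataclass

# The five answer categories named in the text.
CATEGORIES = ("correct", "nearly correct", "somewhat incorrect",
              "mostly incorrect", "extremely incorrect")

@dataclass(frozen=True)
class WeightedAnswer:
    text: str
    category: str  # one of CATEGORIES

    def __post_init__(self):
        if self.category not in CATEGORIES:
            raise ValueError(f"unknown category: {self.category}")

def distribution(answers):
    """Return the configuration of categories and the number of
    answers in each, as described for a user-specific quiz."""
    counts = {}
    for a in answers:
        counts[a.category] = counts.get(a.category, 0) + 1
    return counts
```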
  • Fig. 2 illustrates an example of a method in accordance with at least some embodiments.
  • the method may be performed in connection with the method of Fig. 1 , for example in phase 108.
  • the method provides that a skill level of the user, for example an improvement of the skill level of the user in a topic, can be updated to a backend system.
  • the backend system may maintain information of skill level of users and associated topics.
  • Phase 202 comprises determining, by the at least one processor, a skill level of the user based on the user-specific performance data of the user of the user device answering at least one question of the user-specific quiz corresponding to the user.
  • Phase 204 comprises determining whether the determined skill level meets a threshold for teaching the user the at least one topic.
  • if it is determined in phase 204 that the determined skill level meets the threshold, the method proceeds to updating, by the at least one processor, the skill level and the at least one topic to a backend system for tracking learning results of the user. If it is determined in phase 204 that the determined skill level does not meet the threshold, the method proceeds to phase 208 comprising determining, by the at least one processor, a further user-specific quiz to the user; and controlling, by the at least one processor, to provide the further user-specific quiz on the user interface of the user device of the user.
  • the phase 204 may comprise comparing the skill level of the user to a target skill level, e.g. novice, apprentice, proficient or expert.
  • phase 204 comprises that the threshold may be a percentage value formed based on a proportion of a number of correct answers of the user to a total number of questions of the user-specific quiz.
  • phase 204 comprises that the threshold may be associated with a target skill level of the user. If the determined skill level of the user meets the threshold, e.g. is equal to the threshold or exceeds the threshold, the user may be determined to have reached the target skill level and the target skill level may be set as the skill level of the user.
  • phase 204 comprises that the determined skill level is compared with more than one threshold, i.e. a plurality of thresholds or ranges of threshold values, for teaching the user the at least one topic.
  • Each of the thresholds or ranges of threshold values may be associated with a target skill level of the user. If the determined skill level of the user meets one of the thresholds, e.g. is equal to the threshold or exceeds the threshold, or falls within one of the ranges of threshold values, the user may be determined to have reached the target skill level associated with the met threshold or the range of threshold values, and the reached target skill level may be set as the skill level of the user.
  • the plurality of thresholds, or ranges of threshold values, may comprise two or more thresholds, or two or more ranges of threshold values.
  • the thresholds, or ranges of threshold values may be associated with target skill levels examples of which comprise Expert, Proficient, Apprentice and Novice.
  • the target skill levels may be associated with the thresholds, or ranges of threshold values, that may be represented as percentage values formed based on a proportion of a number of correct answers of the user to a total number of questions of the user-specific quiz.
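A sketch of mapping the percentage of correct answers onto the named target skill levels; the percentage cut-offs are illustrative assumptions, since the application does not fix them:

```python
# Hypothetical percentage thresholds mapping a quiz score to a target
# skill level; the cut-off values are illustrative, not from the patent.
SKILL_THRESHOLDS = [
    (90.0, "Expert"),
    (75.0, "Proficient"),
    (50.0, "Apprentice"),
    (0.0,  "Novice"),
]

def skill_level(correct: int, total: int) -> str:
    """Form the percentage of correct answers and return the first
    target skill level whose threshold it meets or exceeds."""
    if total <= 0:
        raise ValueError("total must be positive")
    percentage = 100.0 * correct / total
    for threshold, level in SKILL_THRESHOLDS:
        if percentage >= threshold:
            return level
    return "Novice"
```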
  • Fig. 3 illustrates an example of a method for a system in accordance with at least some embodiments.
  • the method describes operation of the system for providing a user-specific quiz for a single student.
  • the operation of the system may be applied for each user, or student, in a group of users, or students, in a similar way.
  • Phase 302 comprises receiving labeled quiz data.
  • the labeled quiz data comprises labeled questions and answers associated with at least one topic of the at least one digital learning material.
  • the topic may be determined based on the digital learning material using natural language processing.
  • the answers may be weighted answers.
  • the quiz-data may be generated based on one or more digital learning materials.
  • the digital learning material may be an online resource. Accordingly, the one or more digital learning materials may be available over a data network connection.
  • An application programming interface may be used for accessing the one or more digital learning materials.
  • the digital learning material is the Encyclopedia Britannica available at https://www.britannica.com/.
  • the labeling for each question and associated weighted answers comprises at least part of information indicating performance of a user on answering the question, a grade of the question (difficulty level of the question), or skill level of the user, and a distribution of weighted answers.
  • Phase 304 comprises training a machine learning component based on the labeled quiz-data received in phase 302.
  • the training comprises training the Generative Pre-trained Transformer 3 (GPT-3) model, for example the Davinci model described at “https://beta.openai.com/docs/engines/davinci”.
  • GPT-3 model and its applications are described in “https://openai.com/blog/gpt-3-apps/”.
  • the GPT-3 model may be trained based on giving text prompts to the GPT-3 model. Examples of the text prompts comprise phrases and/or sentences. In response to each prompt, the GPT-3 model returns a text completion in natural language.
  • the training is performed based on generating one or more prompts for the GPT-3 model based on the labeled quiz-data received in phase 302, whereby the labeled quiz-data is fed to the GPT-3 model and the GPT-3 model may output quiz-data comprising one or more user-specific quizzes.
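The application does not disclose the exact prompt format, but a prompt built from labeled quiz-data might look like the following sketch; all field names and the layout are hypothetical:

```python
def quiz_prompt(topic, labeled_examples, skill, grade):
    """Build a text prompt for the language model from labeled
    quiz-data. The format is illustrative; the application does not
    specify one."""
    lines = [f"Topic: {topic}", f"Skill level: {skill}",
             f"Difficulty: {grade}", ""]
    for ex in labeled_examples:
        lines.append(f"Q: {ex['question']}")
        for answer, category in ex["answers"]:
            lines.append(f"A ({category}): {answer}")
        lines.append("")
    lines.append("Q:")  # ask the model to complete a new question
    return "\n".join(lines)
```

The model's text completion would then be parsed back into a question and categorised answers before being shown to a teacher for validation.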
  • the quiz-data may comprise one or more questions and for each question associated weighted answers.
  • the weighted answers for each question comprise one or more answers in each of two or more categories for possible answers, wherein the two or more categories comprise a correct answer and at least one of a nearly correct answer, a somewhat incorrect answer, a mostly incorrect answer and extremely incorrect answer.
  • the GPT-3 may be trained for different skill levels of the users.
  • the GPT-3 may output quiz-data comprising 7 weighted answers and the user-specific quiz, according to the skill level of the user, may be determined from the output of the GPT-3 model based on a distribution of weighted answers that is specific to the skill level of the user.
  • Each of the distributions of weighted answers of the user-specific quizzes corresponding to the skill level of the user may comprise a configuration of the two or more categories and a number of answers in each of the two or more categories. Examples of the two or more categories may comprise a correct answer and at least one of a nearly correct answer, a somewhat incorrect answer, a mostly incorrect answer and extremely incorrect answer. Examples of the skill levels comprise novice, apprentice, proficient and expert.
  • the quiz-data output by the GPT-3 model may be evaluated, e.g. by a user such as a teacher, to determine whether the GPT-3 model has been trained sufficiently.
  • sufficient training may be determined based on a proportion of acceptable quiz-data from the total quiz-data output by the GPT-3 model.
  • a suitable proportion of acceptable quiz-data may be e.g. 80%.
  • the quiz-data, e.g. questions and answers, may be displayed to the teacher using the user interface described with Fig. 6. In this way the teacher may assist and/or train the machine learning component.
  • the proportion of acceptable quiz-data may be evaluated and updated continuously based on user input of the teacher. In this way the proportion of acceptable quiz-data may be used in addition to the user input of the teacher to validate the user-specific quizzes.
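Tracking the proportion of acceptable quiz-data against the 80% example threshold could look like this sketch; the class name and method names are illustrative:

```python
class TrainingValidator:
    """Track the proportion of quiz-data a teacher accepts; training is
    deemed sufficient once the proportion meets a target (80% here, per
    the example in the text)."""
    def __init__(self, target=0.8):
        self.target = target
        self.accepted = 0
        self.total = 0

    def review(self, acceptable: bool):
        """Record one teacher verdict on a piece of quiz-data."""
        self.total += 1
        if acceptable:
            self.accepted += 1

    @property
    def proportion(self):
        return self.accepted / self.total if self.total else 0.0

    def sufficiently_trained(self):
        return self.total > 0 and self.proportion >= self.target
```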
  • Phase 306 comprises the machine learning component outputting a user-specific quiz for a user, or student.
  • the user-specific quiz comprises questions that are associated with multiple answers of different categories.
  • the user-specific quiz comprises weighted answers of at least two different categories for each question. Examples of the categories of weighted answers comprise a correct answer and at least one of a nearly correct answer, a somewhat incorrect answer, a mostly incorrect answer and extremely incorrect answer.
  • phase 306 comprises the machine learning component outputting questions that have more than one grade.
  • each question may have a grade, or a difficulty level of the question, and the question may be associated with weighted answers.
  • the difficulty level of the question may be easy, medium or hard.
  • the grade of the question may be determined based on Flesch-Kincaid readability tests described in https://en.wikipedia.org/wiki/Flesch%E2%80%93Kincaid_readability_tests.
  • the Flesch-Kincaid readability tests comprise Flesch reading ease and Flesch-Kincaid grade level. Both may be used for determining a grade of a question, or only one of them may be used.
  • the Flesch reading ease test is defined based on the following formula that gives the Flesch reading ease score: 206.835 - 1.015 × (total words / total sentences) - 84.6 × (total syllables / total words).
  • the Flesch-Kincaid grade level is defined based on the following formula that gives the Flesch-Kincaid grade level: 0.39 × (total words / total sentences) + 11.8 × (total syllables / total words) - 15.59.
  • Values obtained by the Flesch reading ease score and the Flesch-Kincaid grade level may be compared with value ranges that correspond with the difficulty levels of the questions, e.g. easy, medium or hard.
  • the different difficulty levels may be determined based on value ranges, whereby the values obtained by the Flesch reading ease score and/or the Flesch-Kincaid grade level falling within a given value range may be used to determine the difficulty level of the question.
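A self-contained sketch of the two standard Flesch-Kincaid formulas with a crude syllable heuristic; the value ranges used for easy/medium/hard are illustrative assumptions, since the application does not fix them:

```python
import re

def count_syllables(word: str) -> int:
    """Crude vowel-group heuristic; accurate syllable counting needs a
    pronunciation dictionary."""
    groups = re.findall(r"[aeiouy]+", word.lower())
    return max(1, len(groups))

def flesch_scores(text: str):
    """Return (reading ease, grade level) using the standard formulas."""
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z]+", text)
    syllables = sum(count_syllables(w) for w in words)
    wps = len(words) / sentences   # words per sentence
    spw = syllables / len(words)   # syllables per word
    ease = 206.835 - 1.015 * wps - 84.6 * spw
    grade = 0.39 * wps + 11.8 * spw - 15.59
    return ease, grade

def difficulty(ease: float) -> str:
    """Map the reading-ease score onto easy/medium/hard; the cut-off
    values here are illustrative, not taken from the application."""
    if ease >= 70:
        return "easy"
    if ease >= 50:
        return "medium"
    return "hard"
```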
  • the Flesch-Kincaid readability tests may be accessed by the system based on an Application Programming Interface (API) provided at “https://github.com/words/flesch-kincaid”.
  • Phase 310 comprises providing the generated user-specific quiz to the user.
  • the user may take the user-specific quiz and the user may be evaluated based on the user-specific quiz.
  • the generated user-specific quiz may be displayed to the user on a user interface of the user device of the user. User input of the user on the user interface may be used to determine answers to the questions of the user-specific quiz.
  • Phases 312, 314 and 316 comprise receiving user-specific performance data of the user answering at least one question of the user-specific quiz corresponding to the user, wherein the user-specific performance data comprises at least one of: a time duration 314 for the user answering the at least one question, at least one weighted answer 312 selected by the user and monitoring data 316 of the user answering the at least one question.
  • the monitoring data may comprise sensor data received from one or more sensors that have been configured for monitoring the user.
  • the one or more sensors may comprise video and/or still cameras, and/or microphones, and/or audio recording means.
  • Phase 318 comprises adapting, based on the received user-specific performance data, distributions of weighted answers of the user-specific quiz of the user and controlling the user device of the user to provide, on the user interface of the user device, at least one subsequent question and associated weighted answers of the user-specific quiz corresponding to the user based on the adapted distribution of weighted answers of the user-specific quiz corresponding to the user.
  • the machine learning component may check user inputs of the user to the user-specific-quiz and determine based on the received user-specific performance data, a distribution of weighted answers of the user-specific quiz corresponding to the user, and determine, based on the adapted distribution of weighted answers, one or more subsequent questions and associated weighted answers of the user-specific quiz to the user.
  • Phase 320 comprises displaying the user-specific quiz with the one or more subsequent questions to the user. In this way the user may provide his/her answers to the one or more subsequent questions that have been adapted based on the user-specific performance data. If the quiz is completed after the user provides his/her answers to the one or more adapted subsequent questions, the method proceeds to phase 322. Otherwise the method may return to phase 312.
  • Phase 322 comprises generating, by the at least one processor, a grade, or skill level, for the user, after a completion of the user-specific quiz corresponding to the user on the user device.
  • the grade, or skill level, of the user indicates a skill level of the user in the at least one topic.
  • at least one of phases 320 and 322 may provide that a skill level of the user, for example an improvement of the skill level of the user in a topic, can be updated to a backend system.
  • the skill level of the user may be updated in accordance with the method described in Fig. 2.
  • Fig. 4 illustrates an example of labeled quiz-data in accordance with at least some examples of the present invention.
  • the labeled quiz-data may be used for training a machine learning component e.g. in phase 304 of Fig. 3.
  • the labeled quiz-data may be generated by a human, e.g. a teacher.
  • the labeled quiz-data is illustrated in a table to facilitate understanding of the relationships between different parts of the quiz-data. However, it should be noted that other presentations are also viable.
  • the quiz-data comprises, for each key concept 402, one or more questions 404. Each question is presented in a row that defines further quiz-data associated with the question. Accordingly, each question is associated with a set of weighted answers 406, 408, 410, 412, 414, 416, 418.
  • the question and associated weighted answers have a grade of the question, or difficulty level, 420 of the question.
  • the question and associated weighted answers are associated with a skill level 422 that indicates the skill level of the user at which the question and associated weighted answers are targeted.
  • weighted answers 406, 408, 410, 412, 414, 416, 418 may comprise one or more answers in two or more categories comprising a correct answer and at least one of a nearly correct answer, a somewhat incorrect answer, a mostly incorrect answer and extremely incorrect answer.
  • a distribution of categories may be adapted for user-specific quizzes.
  • the distribution of categories of weighted answers may be associated with a skill level of a user. The skill levels may be for example novice, apprentice, proficient and expert. The distribution of categories for one skill level may have a different portion of weighted answers in a specific category than a distribution of categories for another skill level.
  • a portion of correct answers of the weighted answers may be the highest for a novice skill level and the lowest for an expert skill level.
  • a question for novice skill level may be associated with altogether 3 answers comprising the following distribution of categories: 1 correct answer, 1 nearly correct answer and 1 extremely incorrect answer.
  • a question for apprentice skill level may be associated with altogether 4 answers comprising the following distribution of categories: 1 correct answer, 1 nearly correct answer, 1 mostly incorrect answer and 1 extremely incorrect answer.
  • a question for proficient skill level may be associated with altogether 6 answers comprising the following distribution of categories: 1 correct answer, 2 nearly correct answers, 2 mostly incorrect answers and 1 extremely incorrect answer.
  • a question for expert skill level may be associated with altogether 8 answers comprising the following distribution of categories: 1 correct answer, 4 nearly correct answers, 2 mostly incorrect answers and 1 extremely incorrect answer.
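The four example distributions above can be transcribed into a lookup table keyed by skill level; the function name is illustrative:

```python
# Example distributions of weighted-answer categories per skill level,
# transcribed from the text (category -> number of answers).
DISTRIBUTIONS = {
    "novice":     {"correct": 1, "nearly correct": 1,
                   "extremely incorrect": 1},
    "apprentice": {"correct": 1, "nearly correct": 1,
                   "mostly incorrect": 1, "extremely incorrect": 1},
    "proficient": {"correct": 1, "nearly correct": 2,
                   "mostly incorrect": 2, "extremely incorrect": 1},
    "expert":     {"correct": 1, "nearly correct": 4,
                   "mostly incorrect": 2, "extremely incorrect": 1},
}

def answers_for(skill: str) -> int:
    """Total number of answers presented at the given skill level."""
    return sum(DISTRIBUTIONS[skill].values())
```

Note how only the single correct answer is constant; higher skill levels add more near-miss distractors, which matches the idea that the portion of easy-to-eliminate answers shrinks as the user's skill grows.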
  • Fig. 5 illustrates an example of user interface for generating a user-specific quiz based on one or more key concepts, or topics, in accordance with at least some examples of the present invention.
  • the user interface may be displayed on a user device of a teacher, or a teacher user device.
  • the user interface enables the teacher to assist and/or train a machine learning component in generating user-specific quizzes, for example in phase 304 of Fig. 3.
  • the user interface comprises one or more views 500, 501 that each may comprise user interface elements. At least part of the user interface elements may be selectable by the user, based on the user entering user input that the user device interprets as a selection of a user interface element displayed on the user interface.
  • the first view comprises a user interface element 502 that causes deriving, one or more topics, from the one or more digital learning materials.
  • the one or more topics may be derived and displayed in the second view 501.
  • the second view may display the topics to the user.
  • the second view may allow the user to select one or more of the topics. Editing of the topics may also be permitted.
  • the second view 501 illustrates topics 504 selected by the user and topics 505 not selected by the user.
  • the second view may comprise a user interface element 506 that when selected causes the system to generate questions and associated weighted answers for each of the selected topics.
  • the views 500, 501 may be used for example in connection with phase 304 and/or 306 of Fig. 3 for assisting the machine learning component to generate user-specific quizzes.
  • the user-specific quizzes may be generated based on the topics selected by the user.
  • Fig. 6 illustrates an example of user interface for generating weighted answers to questions for a user-specific quiz in accordance with at least some examples of the present invention.
  • the user interface may be displayed after the machine learning component has generated one or more user-specific quizzes in accordance with phase 304 and/or 306 in Fig. 3.
  • the user interface 600 may display answers 604, 606, 608 associated with each question 602 of the generated quiz.
  • the answers may comprise one or more answers in two or more categories comprising a correct answer 604 and at least one of a nearly correct answer 606, a somewhat incorrect answer, a mostly incorrect answer and extremely incorrect answer 608.
  • the answers may be weighted, whereby a number of answers in each category may be controlled by the user via the user interface, for example by way of one or more user interface elements for setting a number of answers in each category.
  • the weighted answers may be specific to a skill level of the quiz.
  • the skill level of the quiz may be e.g. novice, apprentice, proficient and expert.
  • the user interface may further comprise a user interface element 610 for validating the question and/or one or more further user interface elements for setting conditions of the quiz.
  • the conditions of the quiz enable controlling the quiz.
  • the user interface described with Fig. 6 may be used for assisting training of the machine learning component in phase 304 of Fig. 3.
  • the question 602 and associated answers 604, 606, 608 may be obtained from the output of the machine learning component, e.g. during training of the machine learning component, e.g. the GPT-3 model.
  • the question 602 and associated answers 604, 606, 608 of the quiz-data may be displayed to the teacher on the user interface.
  • the teacher may review all questions and answers of user-specific quizzes of different skill levels via the user interface. Accordingly, the teacher can review via the user interface the content of the questions and answers, distribution of weighted answers, numbers of weighted answers for questions and difficulty of the questions.
  • the training of the machine-learning component may be determined to be completed based on user input received from the teacher to the questions and associated answers for different skill levels displayed on the user interface. If the teacher enters user input to the user interface for validating the user-specific quizzes of each skill level, the machine learning component may be determined to be sufficiently trained.
  • user input of the teacher to the user interface element 610 may serve for validating the user-specific quizzes of different skill levels or a further user interface element may be provided on the user interface for validating one or more of the user-specific quizzes. Consequently, the user interface to the teacher assists in training of the machine learning component at least by allowing the teacher to validate questions and associated weighted answers directed to the different skill levels of users.
  • the machine learning component has been configured to generate user-specific quizzes based on more than one skill level of users.
  • Fig. 7 illustrates an example of user interface for setting quiz conditions for a user-specific quiz in accordance with at least some examples of the present invention.
  • the user interface 700 may form a part of the user interface described with reference to Fig. 6, for example.
  • the quiz conditions may comprise time parameters 704, skill level 708 and users assigned to take the quiz.
  • the time parameters may comprise a start time of the quiz, a due date of the quiz and a time limit to answer each question. Examples of further parameters comprise a setting that determines whether the quiz is a closed book exam. If the quiz is set to be a closed book exam, monitoring data of the users taking the quiz may be activated during the quiz and used to determine that the users do not use extra materials during the quiz.
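The quiz conditions above (start time, due date, per-question time limit, skill level 708, assigned users and the closed-book setting) might be collected into a configuration object along these lines; the field names are illustrative assumptions:

```python
from dataclasses import dataclass, field
from datetime import datetime, timedelta

@dataclass
class QuizConditions:
    """Conditions a teacher may set for a user-specific quiz (illustrative)."""
    start_time: datetime
    due_date: datetime
    time_limit_per_question: timedelta
    skill_level: str = "novice"
    assigned_users: list = field(default_factory=list)
    closed_book: bool = False  # if True, monitoring data is activated

    def monitoring_required(self) -> bool:
        """Monitoring of users is activated only for closed-book exams."""
        return self.closed_book
```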
  • Fig. 8 illustrates an example of user interface for setting recommendations based on results of user-specific quizzes in accordance with at least some examples of the present invention.
  • the user interface 802 may be displayed on a teacher user device in connection with phase 322.
  • the user interface enables a teacher to determine recommendations to users based on the grades, or skill levels, of the users after taking the user-specific quizzes.
  • the user interface comprises skill-level based recommendations 804, 806, 808, 810.
  • the skill-level based recommendations may be determined based on a machine learning component and/or the teacher.
  • the user interface may comprise user interface elements corresponding to the skill-level based recommendations, whereby the teacher may edit them.
  • the machine learning component has determined the skill-level based recommendations, whereby the teacher may review the recommendations on the user interface and edit the recommendations.
  • the recommendations comprise a recommendation to review a specific material and a recommendation to have a session with the teacher.
  • the session with the teacher may be an online meeting or an in-person meeting in a classroom for example.
  • the user interface may comprise a user interface element 814 for validating the recommendations, whereby the recommendations may be sent to the users according to their skill levels. Accordingly, users that have a specific skill level are sent the recommendations for the specific skill level.
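Sending each user the recommendation matching their skill level, as described above, might be sketched as follows; the recommendation texts and function name are illustrative assumptions:

```python
# Hypothetical skill-level based recommendations (cf. elements 804-810).
RECOMMENDATIONS = {
    "novice":     "Review the chapter material and book a session with the teacher.",
    "apprentice": "Review the key-concept summary for this topic.",
    "proficient": "Take the next quiz at the expert skill level.",
    "expert":     "Move on to the next topic.",
}

def dispatch_recommendations(users_by_skill: dict) -> dict:
    """After the teacher validates the recommendations (cf. element 814),
    send each user the recommendation that matches their skill level.
    `users_by_skill` maps a skill level to a list of user identifiers."""
    sent = {}
    for skill, users in users_by_skill.items():
        for user in users:
            sent[user] = RECOMMENDATIONS[skill]
    return sent
```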
  • Fig. 9 illustrates an example of user interface for a user for taking a user-specific quiz in accordance with at least some examples of the present invention.
  • the user interface may be displayed on a user device for example in phase 310 of Fig. 3.
  • the user interface 900 may comprise one or more questions 902 of a user-specific quiz and weighted answers 904 associated with the one or more questions.
  • the weighted answers may be selectable by the user either directly or separate user interface elements 906 may be provided for selecting the weighted answers.
  • Figs 5 to 9 describe features of user interfaces.
  • the features may be implemented by user interface elements that may be graphical objects that are displayed on a user interface of a user device, whereby the user of the user device may be provided information.
  • Functionalities of the features implemented by user interface elements may be provided at least in response to a user input of the user.
  • a command may be generated which causes a system according to an embodiment to perform the functionality associated with the user interface element.
  • the user interface may be implemented using a touch screen, a display device and/or one or more computer peripheral devices.
  • Fig. 10 illustrates an example of a system in accordance with at least some embodiments of the present invention.
  • the system 1000 may be a user device, such as a student user device or a teacher user device. Examples of the user devices comprise a smart phone, a tablet device, a laptop computer, a personal computer or any other similar device, or a part or module thereof.
  • Comprised in system 1000 is processor 1010, which may comprise, for example, a single- or multi-core processor, wherein a single-core processor comprises one processing core and a multi-core processor comprises more than one processing core.
  • Processor 1010 may comprise, in general, a control device.
  • Processor 1010 may comprise more than one processor.
  • Processor 1010 may be a control device.
  • a processing core may comprise, for example, a Cortex-A8 processing core manufactured by ARM Holdings or a Zen processing core designed by Advanced Micro Devices Corporation.
  • Processor 1010 may comprise at least one Qualcomm Snapdragon and/or Intel Atom processor.
  • Processor 1010 may comprise at least one application-specific integrated circuit, ASIC.
  • Processor 1010 may comprise at least one field-programmable gate array, FPGA.
  • Processor 1010 may be means for performing method steps in system 1000, such as generating, controlling, receiving, adapting, deriving and determining.
  • Processor 1010 may be configured, at least in part by computer instructions, to perform actions.
  • a processor may comprise circuitry, or be constituted as circuitry or circuitries, the circuitry or circuitries being configured to perform phases of methods in accordance with embodiments described herein.
  • circuitry may refer to one or more or all of the following: (a) hardware-only circuit implementations, such as implementations in only analogue and/or digital circuitry, (b) combinations of hardware circuits and software, such as, as applicable: (i) a combination of analogue and/or digital hardware circuit(s) with software/firmware and (ii) any portions of hardware processor(s) with software (including digital signal processor(s)), software, and memory(ies) that work together to cause a system, such as a server, to perform various functions, and (c) hardware circuit(s) and/or processor(s), such as a microprocessor(s) or a portion of a microprocessor(s), that requires software (e.g., firmware) for operation, but the software may not be present when it is not needed for operation.
  • circuitry also covers an implementation of merely a hardware circuit or processor (or multiple processors) or portion of a hardware circuit or processor and its (or their) accompanying software and/or firmware.
  • circuitry also covers, for example and if applicable to the particular claim element, a baseband integrated circuit or processor integrated circuit for a mobile device or a similar integrated circuit in server, a cellular network device, or other computing or network device.
  • the system 1000 may comprise memory 1020.
  • Memory 1020 may comprise random-access memory and/or permanent memory.
  • Memory 1020 may comprise at least one RAM chip.
  • Memory 1020 may comprise solid-state, magnetic, optical and/or holographic memory, for example.
  • Memory 1020 may be at least in part accessible to processor 1010.
  • Memory 1020 may be at least in part comprised in processor 1010.
  • Memory 1020 may be means for storing information.
  • Memory 1020 may comprise computer instructions that processor 1010 is configured to execute. When computer instructions configured to cause processor 1010 to perform certain actions are stored in memory 1020, and system 1000 overall is configured to run under the direction of processor 1010 using computer instructions from memory 1020, processor 1010 and/or its at least one processing core may be considered to be configured to perform said certain actions.
  • Memory 1020 may be at least in part comprised in processor 1010.
  • Memory 1020 may be at least in part external to system 1000 but accessible to system 1000.
  • the system may comprise a machine learning component 1080.
  • the machine learning component may be computer instructions stored on a memory.
  • the memory may comprise random-access memory and/or permanent memory.
  • Memory may comprise at least one RAM chip.
  • Memory may comprise solid-state, magnetic, optical and/or holographic memory, for example.
  • Memory may be at least in part accessible to processor 1010.
  • Memory may be at least in part comprised in processor 1010.
  • Memory may be means for storing information.
  • the machine learning component and the memory 1020 may be a single entity.
  • the machine learning component may be hosted by the system and connected to another system 1000 over a data communications connection, where the machine learning component may be utilized over the data communication connection.
  • data received over the data communications connection from one or more user devices may be input to the machine learning component and the machine learning component may generate data that is transmitted over the data communication connection to the user devices. It should be noted that the received data may be processed before fed as input to the machine learning component and/or the data output by the machine learning component may be processed before transmitted over the data communication connection to the user devices.
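The pre- and post-processing around the machine learning component described in the bullets above can be sketched as follows; the JSON message shape and the `serve_quiz_request` name are illustrative assumptions:

```python
import json

def serve_quiz_request(raw_request: bytes, model) -> bytes:
    """Sketch of the data path: bytes received over the data
    communications connection are pre-processed (decoded and parsed),
    fed to the machine learning component, and the component's output
    is post-processed (serialized) before being transmitted back to
    the user device. `model` is any callable standing in for the ML
    component."""
    request = json.loads(raw_request.decode("utf-8"))    # pre-processing
    model_output = model(request["performance_data"])    # ML component
    response = {"quiz": model_output}                    # post-processing
    return json.dumps(response).encode("utf-8")
```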
  • System 1000 may comprise a transmitter 1030.
  • System 1000 may comprise a receiver 1040.
  • Transmitter 1030 and receiver 1040 may be configured to transmit and receive, respectively, information in accordance with at least one cellular or non-cellular standard.
  • Transmitter 1030 may comprise more than one transmitter.
  • Receiver 1040 may comprise more than one receiver.
  • Transmitter 1030 and/or receiver 1040 may be configured to operate in accordance with suitable communication protocols, such as those used in a radio-access and core network of a cellular communication network.
  • System 1000 may comprise user interface, UI, 1060.
  • UI 1060 may comprise at least one of a display, a keyboard, a touchscreen, a vibrator arranged to signal to a user by causing system 1000 to vibrate, a speaker and a microphone.
  • a user may be able to operate system 1000 via UI 1060, for example to take a quiz and/or generate user-specific quizzes.
  • Processor 1010 may be furnished with a transmitter arranged to output information from processor 1010, via electrical leads internal to system 1000, to other devices comprised in system 1000.
  • Such a transmitter may comprise a serial bus transmitter arranged to, for example, output information via at least one electrical lead to memory 1020 for storage therein.
  • the transmitter may comprise a parallel bus transmitter.
  • processor 1010 may comprise a receiver arranged to receive information in processor 1010, via electrical leads internal to system 1000, from other devices comprised in system 1000.
  • a receiver may comprise a serial bus receiver arranged to, for example, receive information via at least one electrical lead from receiver 1040 for processing in processor 1010.
  • the receiver may comprise a parallel bus receiver.
  • System 1000 may comprise further systems not illustrated in Fig. 10.
  • System 1000 may comprise one or more of video and/or still cameras, and/or microphones, and/or audio recording means, and/or a fingerprint sensor arranged to authenticate, at least in part, a user of system 1000.
  • system 1000 lacks at least one device described above.
  • Processor 1010, memory 1020, transmitter 1030, receiver 1040, machine learning component 1080, and/or UI 1060 may be interconnected by electrical leads internal to system 1000 in a multitude of different ways.
  • each of the aforementioned devices may be separately connected to a master bus internal to system 1000, to allow for the devices to exchange information.
  • this is only one example and depending on the embodiment various ways of interconnecting at least two of the aforementioned devices may be selected without departing from the scope of the present invention.
  • System 1000 may be connected by transmitter 1030 and receiver 1040 to one or more other systems or parts thereof.
  • Examples of the other systems comprise at least machine learning components and a backend system.
  • Connection of the system 1000 with the one or more other systems or parts thereof may be data communications connections, for example data network connections over Internet Protocol (IP) connections.
  • the various embodiments of the invention may be implemented in hardware or special purpose circuits or any combination thereof. While various aspects of the invention may be illustrated and described as block diagrams or using some other pictorial representation, it is well understood that these blocks, apparatus, systems, techniques or methods described herein may be implemented in, as non-limiting examples, hardware, software, firmware, special purpose circuits or logic, general purpose hardware or controller or other computing devices, or some combination thereof. [0063] It is to be understood that the embodiments of the invention disclosed are not limited to the particular structures, process steps, or materials disclosed herein, but are extended to equivalents thereof as would be recognized by those ordinarily skilled in the relevant arts.


Abstract

There is provided a method comprising: generating user-specific quizzes comprising questions and associated weighted answers selectable via user interfaces of one or more user devices; controlling each of the one or more user devices to provide, on a user interface of each user device, a user-specific quiz of the generated user-specific quizzes corresponding to a user of the user device; receiving from each of the one or more user devices user-specific performance data of the user of the user device answering at least one question of the user-specific quiz corresponding to the user, wherein the user-specific performance data comprises at least one of: a time duration for the user answering the at least one question, at least one weighted answer selected by the user and monitoring data of the user answering the at least one question; and adapting, based on the received user-specific performance data, distributions of weighted answers of the user-specific quizzes corresponding to each user.

Description

USER-SPECIFIC QUIZZES BASED ON DIGITAL LEARNING MATERIAL
[0001] The present invention relates to user-specific quizzes based on digital learning material.
[0002] Educators globally need to reserve a significant amount of time and effort for assessment of the learning progress of their students. If the number of students is high, there may not be enough time to make personalized assessments for the students. Therefore, standardized tests are a common way of tracking how the learning of the students is progressing. However, the standardized tests do not consider individual differences of the students, for example differences in learning a particular subject and differences in personal abilities. Moreover, creating quality questions and answers for a test for average students is a task which is often too time consuming for individual educators to regularly conduct. Consequently, students and educators mostly engage with the testing of content towards the end of a course in a high-stakes exam. Moreover, regularly maintaining and updating assessments are tasks educators rarely have the time for, which creates a legacy of disorganized and poorly maintained resources, making it difficult to quickly locate and assign appropriate tests for students.
[0003] The scope of protection sought for various embodiments of the invention is set out by the independent claims. The embodiments, examples and features, if any, described in this specification that do not fall under the scope of the independent claims are to be interpreted as examples useful for understanding various embodiments of the invention.
[0004] According to a first aspect there is provided a method according to claim 1.
[0005] According to a second aspect there is provided a system according to claim 10.
[0006] According to a third aspect there is provided a computer program comprising instructions which, when the program is executed by a processor, cause the processor to perform at least the method according to the first aspect.
[0007] According to a fourth aspect there is provided a non-transitory computer readable medium comprising program instructions for causing an apparatus, arrangement or a smart learning system, to perform at least the method according to the first aspect.
[0008] Some further aspects are defined in the dependent claims. The embodiments that do not fall under the scope of the claims are to be interpreted as examples useful for understanding the disclosure.
Brief Description of the Drawings
[0009] In the following, various embodiments will be described in more detail with reference to the appended drawings, in which
Fig. 1, Fig. 2 and Fig. 3 illustrate examples of methods according to at least some embodiments;
Fig. 4 illustrates an example of labeled quiz-data in accordance with at least some examples of the present invention;
Fig. 5 illustrates an example of user interface for generating a user-specific quiz based on key concepts in accordance with at least some examples of the present invention;
Fig. 6 illustrates an example of user interface for generating weighted answers to questions for a user-specific quiz in accordance with at least some examples of the present invention;
Fig. 7 illustrates an example of user interface for setting quiz conditions for a user-specific quiz in accordance with at least some examples of the present invention;
Fig. 8 illustrates an example of user interface for setting recommendations based on results of user-specific quizzes in accordance with at least some examples of the present invention;
Fig. 9 illustrates an example of user interface for a user for taking a user-specific quiz in accordance with at least some examples of the present invention; and
Fig. 10 illustrates an example of a system in accordance with at least some examples of the present invention.
Description of Example Embodiments
[0010] The following description and drawings are illustrative and are not to be construed as unnecessarily limiting. The specific details are provided for a thorough understanding of the disclosure. However, in certain instances, well-known or conventional details are not described in order to avoid obscuring the description. References to one or an embodiment in the present disclosure can be, but not necessarily are, reference to the same embodiment and such references mean at least one of the embodiments.
[0011] Reference in this specification to “one embodiment” or “an embodiment” means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the disclosure.
[0012] Use of ordinal terms such as “first,” “second,” “third,” etc., in the claims and description to modify a described feature does not by itself connote any priority, precedence, or order of one described feature over another or the temporal order in which acts of a method are performed, but are used merely as labels to distinguish one described feature having a certain name from another described feature having a same name (but for use of the ordinal term) to distinguish the described feature.
[0013] The verbs “to comprise” and “to include” are used in this document as open limitations that neither exclude nor require the existence of un-recited features. The features recited in dependent claims are mutually freely combinable unless otherwise explicitly stated. Furthermore, it is to be understood that the use of “a” or “an”, i.e. a singular form, throughout this document does not exclude a plurality.
[0014] A quiz, or a user-specific quiz, referred to herein may be a computerized quiz comprising questions and associated weighted answers. The user-specific quiz may be provided by computer program code that is executed by at least one processor. The execution of the computer program code causes providing the user-specific quiz on a user interface of a user device. In an example, the user interface of the user device is controlled to provide the quiz. In an example the computer program code may be executed in a cloud environment that is connected over a data communications connection with the user device, whereby the user interface may be provided based on controlling the user interface of the user device over a control connection over the data communications connection. The cloud environment may host a machine learning component. The computer program code is configured to input data and/or process data output by the machine learning component for providing the user-specific quiz. The user-specific quiz may be provided based on a skill level of the user of the user device. The skill level of the user may initially be set to a default value for a given topic and after the user has taken at least one user-specific quiz for that topic, the skill level may be evaluated and the skill level of the user may be improved. The machine learning component may check user inputs of the user to the user-specific quiz, adapt, based on the received user-specific performance data, a distribution of weighted answers of the user-specific quiz corresponding to the user, and determine, based on the adapted distribution of weighted answers, one or more subsequent questions and associated weighted answers of the user-specific quiz to the user.
Accordingly, the input data to the machine-learning component may comprise user inputs of the user to the user-specific quiz, user-specific performance data and a distribution of weighted answers of the user-specific quiz corresponding to the user. Accordingly, the output data of the machine-learning component may comprise an adapted distribution of weighted answers and one or more subsequent questions and associated weighted answers of the user-specific quiz to the user. The computer program code and machine learning component may serve a plurality of users and provide the user-specific quizzes on user devices of the users.
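The input/output relationship just described can be sketched as a single adaptation step; the `adapt`/`generate` method names are illustrative assumptions about the machine learning component's interface:

```python
def next_question(user_inputs, performance_data, distribution, model):
    """One adaptation step: the machine learning component takes the
    user's inputs, the user-specific performance data and the current
    distribution of weighted answers, and returns an adapted
    distribution plus the next question with its weighted answers.
    `model` is a stand-in object for the machine learning component."""
    adapted_distribution = model.adapt(performance_data, distribution)
    question, answers = model.generate(user_inputs, adapted_distribution)
    return adapted_distribution, question, answers
```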
[0015] Fig. 1 illustrates an example of a method according to at least some embodiments. Phase 102 comprises generating, by at least one processor connected to a memory storing computer program code and operatively connected to one or more user devices, based on at least one digital learning material, user-specific quizzes comprising questions and associated weighted answers selectable via user interfaces of the one or more user devices. Phase 102 comprises controlling, by the at least one processor, each of the one or more user devices to provide, on a user interface of each user device, a user-specific quiz of the generated user-specific quizzes corresponding to a user of the user device. Phase 104 comprises receiving, by the at least one processor, from each of the one or more user devices user-specific performance data of the user of the user device answering at least one question of the user-specific quiz corresponding to the user, wherein the user-specific performance data comprises at least one of: a time duration for the user answering the at least one question, at least one weighted answer selected by the user and monitoring data of the user answering the at least one question. Phase 106 comprises adapting, by the at least one processor, based on the received user-specific performance data, distributions of weighted answers of the user-specific quizzes corresponding to each user. Phase 108 comprises controlling, by the at least one processor, each of the one or more user devices to provide, on the user interface of the user device, at least one subsequent question and associated weighted answers of the user-specific quiz corresponding to the user based on the adapted distribution of weighted answers of the user-specific quiz corresponding to the user. [0016] In an example, in accordance with at least some embodiments, the at least one digital learning material comprises textual digital learning material.
The textual digital learning material may include human-readable text in a machine-readable file format. Examples of file formats that support structured text comprise XML (Extensible Markup Language), HTML (Hypertext Markup Language), PDF (Portable Document Format) and DOCX. Non-structured text, e.g. ASCII text, may also be used for the digital learning material.
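As an illustrative sketch of pulling human-readable text out of such file formats using only the Python standard library (PDF and DOCX are omitted here because they would need third-party libraries such as pypdf or python-docx):

```python
from html.parser import HTMLParser
from xml.etree import ElementTree

class _TextExtractor(HTMLParser):
    """Collects the text nodes of an HTML document."""
    def __init__(self):
        super().__init__()
        self.parts = []

    def handle_data(self, data):
        self.parts.append(data)

def extract_text(content: str, fmt: str) -> str:
    """Extract human-readable text from digital learning material.
    `fmt` is one of "html", "xml" or anything else for plain text."""
    if fmt == "html":
        parser = _TextExtractor()
        parser.feed(content)
        return " ".join(s.strip() for s in parser.parts if s.strip())
    if fmt == "xml":
        root = ElementTree.fromstring(content)
        return " ".join(t.strip() for t in root.itertext() if t.strip())
    return content  # non-structured text, e.g. ASCII
```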
[0017] In an example the method may be performed by a system described in an embodiment. The system may be connected to one or more user devices for providing the user-specific quizzes on user interfaces of the user devices.
[0018] In an example according to at least some embodiments phase 102 comprises generating, by the at least one processor, the user-specific quizzes based on a machine learning component trained based on labeled questions and answers associated with at least one topic of the at least one digital learning material. In an example, the machine learning component may be connected to a system described in an embodiment, whereby the user-specific quizzes may be provided.
[0019] In an example according to at least some embodiments phase 102 comprises generating, by the at least one processor, the at least one subsequent question and associated weighted answers of the user-specific quiz corresponding to the user by feeding the user-specific performance data to the machine learning component trained based on labeled questions and weighted answers associated with at least one topic of the at least one digital learning material.
[0020] In an example according to at least some embodiments, in phase 102, difficulty levels of the questions, distributions of weighted answers and numbers of weighted answers of the user-specific quizzes are determined by the machine learning component. In an example, the distributions of weighted answers and numbers of weighted answers may be different for each skill level. For example, if a question is presented to a novice, the distribution of weighted answers for the question and a number of weighted answers may be different than if the question is presented to an expert.
[0021] In an example according to at least some embodiments, in phase 102, the labeling for each question and associated weighted answers comprises at least part of information indicating performance of a user on answering the question, a grade of the question, a difficulty level of the question and a distribution of weighted answers. The labeling may be performed by a user on a user interface provided on a user device.
[0022] In an example according to at least some embodiments phase 102 comprises deriving, by the at least one processor, the at least one topic from the at least one digital learning material, and generating, by the at least one processor, the questions and associated weighted answers based on the derived at least one topic.
[0023] In an example according to at least some embodiments phase 108 comprises generating, by the at least one processor, a grade, or skill level, for the user, after a completion of the user-specific quiz corresponding to the user on the user device. The generated skill level may be used by the system for updating a backend system. The backend system may maintain information of skill levels of users and the backend system may provide an initial skill level for generating a user-specific quiz for a given user in phase 102.
[0024] In an example according to at least some embodiments, in phase 102, the weighted answers comprise one or more answers in each of two or more categories for possible answers, wherein the two or more categories comprise a correct answer and at least one of a nearly correct answer, a somewhat incorrect answer, a mostly incorrect answer and an extremely incorrect answer.
[0025] In an example according to at least some embodiments, in phase 102, each of the distributions of weighted answers of the user-specific quizzes corresponding to each user comprises a configuration of the two or more categories and a number of answers in each of the two or more categories.
[0026] Fig. 2 illustrates an example of a method in accordance with at least some embodiments. The method may be performed in connection with the method of Fig. 1, for example in phase 108. The method provides that a skill level of the user, for example an improvement of the skill level of the user in a topic, can be updated to a backend system. In this way the backend system may maintain information of skill levels of users and associated topics. Phase 202 comprises determining, by the at least one processor, a skill level of the user based on the user-specific performance data of the user of the user device answering at least one question of the user-specific quiz corresponding to the user. Phase 204 comprises determining whether the determined skill level meets a threshold for teaching the user the at least one topic. If it is determined in phase 204 that the determined skill level meets the threshold, the method proceeds to phase 206 comprising updating, by the at least one processor, the skill level and the at least one topic to a backend system for tracking learning results of the user. If it is determined in phase 204 that the determined skill level does not meet the threshold, the method proceeds to phase 208 comprising determining, by the at least one processor, a further user-specific quiz for the user, and controlling, by the at least one processor, the user device to provide the further user-specific quiz on the user interface of the user device of the user. In an example, the phase 204 may comprise comparing the skill level of the user to a target skill level that is, e.g., novice, apprentice, proficient or expert.
[0027] In an example, phase 204 comprises that the threshold may be a percentage value formed based on a proportion of a number of correct answers of the user to a total number of questions to the user-specific quiz.
[0028] In an example, phase 204 comprises that the threshold may be associated with a target skill level of the user. If the determined skill level of the user meets the threshold, e.g. is equal to the threshold or exceeds the threshold, the user may be determined to have reached the target skill level and the target skill level may be set as the skill level of the user.
[0029] In an example, phase 204 comprises that the determined skill level is compared with more than one threshold, i.e. a plurality of thresholds or ranges of threshold values, for teaching the user the at least one topic. Each of the thresholds or ranges of threshold values may be associated with a target skill level of the user. If the determined skill level of the user meets one of the thresholds, e.g. is equal to the threshold or exceeds the threshold, or falls within one of the ranges of threshold values, the user may be determined to have reached the target skill level associated with the met threshold or the range of threshold values, and the reached target skill level may be set as the skill level of the user. In an example, the plurality of thresholds, or ranges of threshold values, may comprise two or more thresholds, or two or more ranges of threshold values. The thresholds, or ranges of threshold values, may be associated with target skill levels, examples of which comprise Expert, Proficient, Apprentice and Novice. The target skill levels may be associated with the thresholds, or ranges of threshold values, that may be represented in percentage values formed based on a proportion of a number of correct answers of the user to a total number of questions of the user-specific quiz. In an example, the target skill levels and associated ranges of threshold values
Figure imgf000008_0001
in percentages may be as follows: Expert Xexpert = {90 % => x =< 100 %}; Proficient Xproficient = {60 % => x < 80 %}; Apprentice Xapprentice = {60 % => x < 80 %}; and Novice Xnovice = {x < 60 %}.
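The percentage-based mapping from quiz score to target skill level described above can be sketched as follows; the threshold values and level names follow the example in the text and may differ in other embodiments:

```python
def score_percentage(num_correct, num_questions):
    """Percentage score: proportion of correct answers to total questions."""
    return 100.0 * num_correct / num_questions

# Example lower bounds per target skill level; the level reached is the
# first one whose lower bound the score meets, with Novice as fallback.
SKILL_THRESHOLDS = [("Expert", 90.0), ("Proficient", 80.0), ("Apprentice", 60.0)]

def target_skill_level(percentage):
    for level, lower_bound in SKILL_THRESHOLDS:
        if percentage >= lower_bound:
            return level
    return "Novice"
```

For example, 9 correct answers out of 10 give 90 % and map to Expert, while 5 out of 10 give 50 % and map to Novice.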
[0030] Fig. 3 illustrates an example of a method for a system in accordance with at least some embodiments. The method describes operation of the system for providing a user-specific quiz for a single student. However, it should be noted that the operation of the system may be applied for each user, or student, in a group of users, or students, in a similar way.
[0031] Phase 302 comprises receiving labeled quiz-data. In an example the labeled quiz-data comprises labeled questions and answers associated with at least one topic of the at least one digital learning material. The topic may be determined based on the digital learning material using natural language processing. The answers may be weighted answers. The quiz-data may be generated based on one or more digital learning materials. In an example, the digital learning material may be an online resource. Accordingly, the one or more digital learning materials may be available over a data network connection. An application programming interface may be used for accessing the one or more digital learning materials. An example of the digital learning material is the Encyclopedia Britannica available at https://www.britannica.com/. In an example, the labeling for each question and associated weighted answers comprises at least part of information indicating performance of a user in answering the question, a grade of the question (difficulty level of the question), or a skill level of the user, and a distribution of weighted answers.
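As a minimal sketch, one labeled quiz-data record of the kind described above could be represented as follows; the field names and example values are illustrative, not taken from the source:

```python
# Hypothetical shape of one labeled quiz-data record: a question with
# weighted answers grouped by category, a grade and a targeted skill level.
labeled_record = {
    "topic": "Photosynthesis",
    "question": "What do plants primarily absorb to perform photosynthesis?",
    "weighted_answers": {
        "correct": ["Carbon dioxide, water and light"],
        "nearly_correct": ["Carbon dioxide and water"],
        "extremely_incorrect": ["Table salt"],
    },
    "grade": "easy",          # difficulty level of the question
    "skill_level": "novice",  # skill level the question is targeted at
}
```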
[0032] Phase 304 comprises training a machine learning component based on the labeled quiz-data received in phase 302. In an example, the training comprises training the Generative Pre-trained Transformer 3 (GPT-3) model, for example the Davinci model described in “https://beta.openai.com/docs/engines/davinci”. The GPT-3 model and its applications are described in “https://openai.com/blog/gpt-3-apps/”. In general, the GPT-3 model may be trained based on giving text prompts, i.e. prompts, to the GPT-3 model. Examples of the text prompts comprise phrases and/or sentences. In response to each prompt, the GPT-3 model returns a text completion in natural language. Accordingly, in phase 304 the training is performed based on generating one or more prompts for the GPT-3 model based on the labeled quiz-data received in phase 302, whereby the labeled quiz-data is fed to the GPT-3 model and the GPT-3 model may output quiz-data comprising one or more user-specific quizzes. The quiz-data may comprise one or more questions and, for each question, associated weighted answers. The weighted answers for each question comprise one or more answers in each of two or more categories for possible answers, wherein the two or more categories comprise a correct answer and at least one of a nearly correct answer, a somewhat incorrect answer, a mostly incorrect answer and an extremely incorrect answer. The GPT-3 model may be trained for different skill levels of the users. For example, the GPT-3 model may output quiz-data comprising 7 weighted answers and the user-specific quiz, according to the skill level of the user, may be determined from the output of the GPT-3 model based on a distribution of weighted answers that is specific to the skill level of the user. Each of the distributions of weighted answers of the user-specific quizzes corresponding to the skill level of the user may comprise a configuration of the two or more categories and a number of answers in each of the two or more categories.
Examples of the two or more categories may comprise a correct answer and at least one of a nearly correct answer, a somewhat incorrect answer, a mostly incorrect answer and an extremely incorrect answer. Examples of the skill levels comprise novice, apprentice, proficient and expert. In this way, categories of the weighted answers and the number of weighted answers in each category may be determined specific to different skill levels of users. The quiz-data output by the GPT-3 model may be evaluated, e.g. by a user such as a teacher, to determine whether the GPT-3 model has been trained sufficiently. Sufficient training may be determined based on a proportion of acceptable quiz-data from the total quiz-data output by the GPT-3 model. A suitable proportion of acceptable quiz-data may be e.g. 80 %. In an example, the quiz-data, e.g. questions and answers, may be displayed to the teacher using the user interface described with Fig. 6. In this way the teacher may assist and/or train the machine learning component. Additionally, the proportion of acceptable quiz-data may be evaluated and updated continuously based on user input of the teacher on the user interface. In this way the proportion of acceptable quiz-data may be used in addition to the user input of the teacher to validate the user-specific quizzes.
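The sufficiency check described above, based on the proportion of teacher-accepted quiz-data, can be sketched as a simple helper; the 80 % value is the example proportion from the text:

```python
def training_sufficient(num_accepted, num_total, min_proportion=0.8):
    """True when the share of quiz-data accepted by the teacher reaches
    the required proportion (80 % in the example)."""
    if num_total == 0:
        return False
    return num_accepted / num_total >= min_proportion
```

In practice, the counters would be updated continuously as the teacher validates or rejects generated questions via the user interface.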
[0033] Phase 306 comprises the machine learning component outputting a user-specific quiz for a user, or student. In an example, the user-specific quiz comprises questions that are associated with multiple answers of different categories. Accordingly, the user-specific quiz comprises weighted answers of at least two different categories for each question. Examples of the categories of weighted answers comprise a correct answer and at least one of a nearly correct answer, a somewhat incorrect answer, a mostly incorrect answer and an extremely incorrect answer.
[0034] In an example, phase 306 comprises the machine learning component outputting questions that have more than one grade. Accordingly, each question may have a grade, or a difficulty level, and the question may be associated with weighted answers. In an example, the difficulty level of the question may be easy, medium or hard. In an example, the grade of the question may be determined based on the Flesch-Kincaid readability tests described in https://en.wikipedia.org/wiki/Flesch%E2%80%93Kincaid_readability_tests. The Flesch-Kincaid readability tests comprise the Flesch reading ease test and the Flesch-Kincaid grade level. Both may be used for determining a grade of a question, or only one of them may be used. The Flesch reading ease test is defined based on the following formula that gives the Flesch reading ease score:
206.835 − 1.015 × (total words / total sentences) − 84.6 × (total syllables / total words)
The Flesch-Kincaid grade level is defined based on the following formula that gives the Flesch-Kincaid grade level:
0.39 × (total words / total sentences) + 11.8 × (total syllables / total words) − 15.59
Values obtained by the Flesch reading ease score and the Flesch-Kincaid grade level may be compared with value ranges that correspond with the difficulty levels of the questions, e.g. easy, medium or hard. The different difficulty levels may be determined based on value ranges, whereby the values obtained by the Flesch reading ease score and/or the Flesch-Kincaid grade level falling within a given value range may be used to determine the difficulty level of the question. It should be noted that the Flesch-Kincaid readability tests may be accessed by the system based on an Application Programming Interface (API) provided at “https://github.com/words/flesch-kincaid”.
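The two readability formulas can be written directly in code; the mapping of scores to the easy, medium and hard levels below is a hypothetical example, as the text leaves the concrete value ranges open:

```python
def flesch_reading_ease(total_words, total_sentences, total_syllables):
    """Flesch reading ease score; higher values indicate easier text."""
    return (206.835
            - 1.015 * (total_words / total_sentences)
            - 84.6 * (total_syllables / total_words))

def flesch_kincaid_grade(total_words, total_sentences, total_syllables):
    """Flesch-Kincaid grade level; corresponds to a US school grade."""
    return (0.39 * (total_words / total_sentences)
            + 11.8 * (total_syllables / total_words)
            - 15.59)

def difficulty_level(reading_ease):
    """Hypothetical value ranges mapping reading ease to a question grade."""
    if reading_ease >= 70:
        return "easy"
    if reading_ease >= 50:
        return "medium"
    return "hard"
```

A question text with 10 words, 1 sentence and 15 syllables, for instance, scores a reading ease of about 69.8 and would fall in the "medium" range of this example mapping.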
[0035] Phase 310 comprises providing the generated user-specific quiz to the user. In this way the user may take the user-specific quiz and the user may be evaluated based on the user-specific quiz. The generated user-specific quiz may be displayed to the user on a user interface of the user device of the user. User input of the user on the user interface may be used to determine answers to the questions of the user-specific quiz. [0036] Phases 312, 314 and 316 comprise receiving user-specific performance data of the user answering at least one question of the user-specific quiz corresponding to the user, wherein the user-specific performance data comprises at least one of: a time duration 314 for the user answering the at least one question, at least one weighted answer 312 selected by the user and monitoring data 316 of the user answering the at least one question. Accordingly, user-specific performance data of the user is received when the user is taking the quiz provided to the user. In an example, the monitoring data may comprise sensor data received from one or more sensors that have been configured for monitoring the user. The one or more sensors may comprise video and/or still cameras, and/or microphones, and/or audio recording means.
[0037] Phase 318 comprises adapting, based on the received user-specific performance data, distributions of weighted answers of the user-specific quiz of the user, and controlling the user device of the user to provide, on the user interface of the user device, at least one subsequent question and associated weighted answers of the user-specific quiz corresponding to the user based on the adapted distribution of weighted answers of the user-specific quiz corresponding to the user. In an example the machine learning component may check user inputs of the user to the user-specific quiz, adapt, based on the received user-specific performance data, a distribution of weighted answers of the user-specific quiz corresponding to the user, and determine, based on the adapted distribution of weighted answers, one or more subsequent questions and associated weighted answers of the user-specific quiz to the user.
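A very simplified, illustrative adaptation of a distribution of weighted answers based on whether the previous answer was correct could look as follows; in the embodiments the adaptation is performed by the machine learning component, so this is only a sketch of the idea, and the category names are shorthand for the categories in the description:

```python
def adapt_distribution(distribution, answered_correctly):
    """Shift one distractor between categories depending on performance:
    after a correct answer, replace a clearly wrong distractor with a
    nearly correct one (making the next question harder to distinguish);
    after an incorrect answer, do the reverse."""
    adapted = dict(distribution)
    if answered_correctly and adapted.get("mostly_incorrect", 0) > 0:
        adapted["mostly_incorrect"] -= 1
        adapted["nearly_correct"] = adapted.get("nearly_correct", 0) + 1
    elif not answered_correctly and adapted.get("nearly_correct", 0) > 0:
        adapted["nearly_correct"] -= 1
        adapted["mostly_incorrect"] = adapted.get("mostly_incorrect", 0) + 1
    return adapted
```

The subsequent questions would then be selected so that their weighted answers match the adapted distribution.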
[0038] Phase 320 comprises displaying the user-specific quiz with the one or more subsequent questions to the user. In this way the user may provide his/her answers to the one or more subsequent questions that have been adapted based on the user-specific performance data. If the quiz is completed after the user provides his/her answers to the one or more subsequent questions, the method proceeds to phase 322. Otherwise, the method may return to receive further user-specific performance data of the user answering the subsequent questions.
[0039] Phase 322 comprises generating, by the at least one processor, a grade, or skill level, for the user after a completion of the user-specific quiz corresponding to the user on the user device. The grade, or skill level, of the user indicates a skill level of the user in the at least one topic. [0040] In an example in accordance with at least some embodiments, at least one of phases 320 and 322 may provide that a skill level of the user, for example an improvement of the skill level of the user in a topic, can be updated to a backend system. The skill level of the user may be updated in accordance with the method described in Fig. 2.
[0041] Fig. 4 illustrates an example of labeled quiz-data in accordance with at least some examples of the present invention. The labeled quiz-data may be used for training a machine learning component, e.g. in phase 304 of Fig. 3. The labeled quiz-data may be generated by a human, e.g. a teacher.
[0042] The labeled quiz-data is illustrated in a table to facilitate understanding of the relationships between different parts of the quiz-data. However, it should be noted that also other presentations are viable. The quiz-data comprises, for each key concept 402, one or more questions 404. Each question is presented in a row that defines further quiz-data associated with the question. Accordingly, each question is associated with a set of weighted answers 406, 408, 410, 412, 414, 416, 418. The question and associated weighted answers have a grade, or difficulty level, 420 of the question. The question and associated weighted answers are associated with a skill level 422 that indicates a skill level of the user at which the questions and associated weighted answers are targeted.
[0043] In an example, in accordance with at least some embodiments, weighted answers 406, 408, 410, 412, 414, 416, 418 may comprise one or more answers in two or more categories comprising a correct answer and at least one of a nearly correct answer, a somewhat incorrect answer, a mostly incorrect answer and an extremely incorrect answer. In this way a distribution of categories may be adapted for user-specific quizzes. In an example, the distribution of categories of weighted answers may be associated with a skill level of a user. The skill levels may be for example novice, apprentice, proficient and expert. The distribution of categories for one skill level may have a different portion of weighted answers in a specific category than a distribution of categories for another skill level. In a more detailed example, a portion of correct answers of the weighted answers may be the highest for a novice skill level and the lowest for an expert skill level. A question for the novice skill level may be associated with altogether 3 answers comprising the following distribution of categories: 1 correct answer, 1 nearly correct answer and 1 extremely incorrect answer. A question for the apprentice skill level may be associated with altogether 4 answers comprising the following distribution of categories: 1 correct answer, 1 nearly correct answer, 1 mostly incorrect answer and 1 extremely incorrect answer. A question for the proficient skill level may be associated with altogether 6 answers comprising the following distribution of categories: 1 correct answer, 2 nearly correct answers, 2 mostly incorrect answers and 1 extremely incorrect answer. A question for the expert skill level may be associated with altogether 8 answers comprising the following distribution of categories: 1 correct answer, 4 nearly correct answers, 2 mostly incorrect answers and 1 extremely incorrect answer.
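The example per-level distributions above can be captured in a small lookup table; the category keys are shorthand for the answer categories in the description:

```python
# Distributions of weighted answers per skill level, following the example
# counts given for novice, apprentice, proficient and expert questions.
DISTRIBUTIONS = {
    "novice":     {"correct": 1, "nearly_correct": 1, "extremely_incorrect": 1},
    "apprentice": {"correct": 1, "nearly_correct": 1, "mostly_incorrect": 1,
                   "extremely_incorrect": 1},
    "proficient": {"correct": 1, "nearly_correct": 2, "mostly_incorrect": 2,
                   "extremely_incorrect": 1},
    "expert":     {"correct": 1, "nearly_correct": 4, "mostly_incorrect": 2,
                   "extremely_incorrect": 1},
}

def total_answers(skill_level):
    """Total number of weighted answers shown for a question at this level."""
    return sum(DISTRIBUTIONS[skill_level].values())
```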
[0044] Fig. 5 illustrates an example of a user interface for generating a user-specific quiz based on one or more key concepts, or topics, in accordance with at least some examples of the present invention. The user interface may be displayed on a user device of a teacher, or a teacher user device. The user interface enables the teacher to assist and/or train a machine learning component in generating user-specific quizzes, for example in phase 304 of Fig. 3. The user interface comprises one or more views 500, 501 that each may comprise user interface elements. At least part of the user interface elements may be selectable by the user based on the user entering user input that is interpreted by the user device as a selection of a user interface element displayed on the user interface. The first view comprises a user interface element 502 that causes deriving one or more topics from the one or more digital learning materials. After the user selects the user interface element 502, the one or more topics may be derived and displayed in the second view 501. The second view may display the topics to the user. The second view may enable the user to select one or more of the topics. Also editing of the topics may be permitted. The second view 501 illustrates topics 504 selected by the user and topics 505 not selected by the user. The second view may comprise a user interface element 506 that when selected causes the system to generate questions and associated weighted answers for each of the selected topics. The views 500, 501 may be used for example in connection with phase 304 and/or 306 of Fig. 3 for assisting the machine learning component to generate user-specific quizzes. The user-specific quizzes may be generated based on the topics selected by the user.
[0045] Fig. 6 illustrates an example of a user interface for generating weighted answers to questions for a user-specific quiz in accordance with at least some examples of the present invention. The user interface may be displayed after the machine learning component has generated one or more user-specific quizzes in accordance with phase 304 and/or 306 in Fig. 3. The user interface 600 may display answers 604, 606, 608 associated with each question 602 of the generated quiz. The answers may comprise one or more answers in two or more categories comprising a correct answer 604 and at least one of a nearly correct answer 606, a somewhat incorrect answer, a mostly incorrect answer and an extremely incorrect answer 608. The answers may be weighted, whereby a number of answers in each category may be controlled by the user via the user interface, for example by way of one or more user interface elements for setting a number of answers in each category. The weighted answers may be specific to a skill level of the quiz. The skill level of the quiz may be e.g. novice, apprentice, proficient or expert. The user interface may further comprise a user interface element 610 for validating the question and/or one or more further user interface elements for setting conditions of the quiz. The conditions of the quiz enable controlling the quiz.
[0046] In an example, the user interface described with Fig. 6 may be used for assisting training of the machine learning component in phase 304 of Fig. 3. The question 602 and associated answers 604, 606, 608 may be obtained from the output of the machine learning component, e.g. during training of the machine learning component, e.g. the GPT-3 model. The question 602 and associated answers 604, 606, 608 of the quiz-data may be displayed to the teacher on the user interface. In this way, the teacher may review all questions and answers of user-specific quizzes of different skill levels via the user interface. Accordingly, the teacher can review via the user interface the content of the questions and answers, the distribution of weighted answers, the numbers of weighted answers for questions and the difficulty of the questions. The training of the machine learning component may be determined to be completed based on user input received from the teacher to the questions and associated answers for different skill levels displayed on the user interface. If the teacher enters user input to the user interface for validating the user-specific quizzes of each skill level, the machine learning component may be determined to be sufficiently trained. In an example, user input of the teacher to the user interface element 610 may serve for validating the user-specific quizzes of different skill levels, or a further user interface element may be provided on the user interface for validating one or more of the user-specific quizzes. Consequently, the user interface to the teacher assists in training of the machine learning component at least by allowing the teacher to validate questions and associated weighted answers directed to the different skill levels of users. Once the quiz-data, i.e. questions and associated weighted answers directed to the different skill levels of users, has been validated, the machine learning component has been configured to generate user-specific quizzes based on more than one skill level of users.
[0047] Fig. 7 illustrates an example of a user interface for setting quiz conditions for a user-specific quiz in accordance with at least some examples of the present invention. The user interface 700 may form a part of the user interface described with Fig. 6, for example. The quiz conditions may comprise time parameters 704, skill level 708 and users assigned to take the quiz. The time parameters may comprise a start time of the quiz, a due date of the quiz and a time limit to answer each question. Examples of further parameters comprise a setting that determines whether the quiz is a closed book exam. If the quiz is set to be a closed book exam, monitoring data of the users taking the quiz may be activated during the quiz and used to determine that the users do not use extra materials during the quiz.
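The quiz conditions of this example could be grouped into a simple configuration object; all field names below are illustrative, not taken from the source:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class QuizConditions:
    """Sketch of quiz conditions: time parameters, skill level, assigned
    users and the closed-book setting that activates monitoring."""
    start_time: str
    due_date: str
    time_limit_per_question_s: int
    skill_level: str
    assigned_users: List[str] = field(default_factory=list)
    closed_book: bool = False  # if True, monitoring is active during the quiz
```

A quiz could then be configured, for example, as `QuizConditions("2022-12-01T09:00", "2022-12-08", 60, "novice", ["student1"])`.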
[0048] Fig. 8 illustrates an example of a user interface for setting recommendations based on results of user-specific quizzes in accordance with at least some examples of the present invention. The user interface 802 may be displayed on a teacher user device in connection with phase 322. The user interface enables a teacher to determine recommendations to users based on the grades, or skill levels, of the users after taking the user-specific quizzes. The user interface comprises skill-level-based recommendations 804, 806, 808, 810. The skill-level-based recommendations may be determined based on a machine learning component and/or the teacher. The user interface may comprise user interface elements corresponding to the skill-level-based recommendations, whereby the teacher may edit them. This may be particularly useful if the machine learning component has determined the skill-level-based recommendations, whereby the teacher may review the recommendations on the user interface and edit them. Examples of the recommendations comprise a recommendation to review a specific material and a recommendation to have a session with the teacher. The session with the teacher may be an online meeting or an in-person meeting in a classroom, for example. The user interface may comprise a user interface element 814 for validating the recommendations, whereby the recommendations may be sent to the users according to their skill levels. Accordingly, users that have a specific skill level are sent the recommendations for the specific skill level.
[0049] Fig. 9 illustrates an example of a user interface for a user for taking a user-specific quiz in accordance with at least some examples of the present invention. The user interface may be displayed on a user device for example in phase 310 of Fig. 3. The user interface 900 may comprise one or more questions 902 of a user-specific quiz and weighted answers 904 associated with the one or more questions. The weighted answers may be selectable by the user either directly, or separate user interface elements 906 may be provided for selecting the weighted answers.
[0050] Figs 5 to 9 describe features of user interfaces. It should be noted that, although not explicitly mentioned, the features may be implemented by user interface elements that may be graphical objects displayed on a user interface of a user device, whereby the user of the user device may be provided information. Functionalities of the features implemented by user interface elements may be provided at least in response to a user input of the user. When the user input is directed or used to select a user interface element on the user interface, a command may be generated which causes a system according to an embodiment to perform the functionality associated with the user interface element. The user interface may be implemented using a touch screen, a display device and/or one or more computer peripheral devices.
[0051] Fig. 10 illustrates an example of a system in accordance with at least some embodiments of the present invention. The system 1000 may be a user device, such as a student user device or a teacher user device. Examples of the user devices comprise a smart phone, a tablet device, a laptop computer, a personal computer or any other similar device, or a part or module thereof. Comprised in apparatus 1000 is processor 1010, which may comprise, for example, a single- or multi-core processor, wherein a single-core processor comprises one processing core and a multi-core processor comprises more than one processing core. Processor 1010 may comprise, in general, a control device. Processor 1010 may comprise more than one processor. A processing core may comprise, for example, a Cortex-A8 processing core manufactured by ARM Holdings or a Zen processing core designed by Advanced Micro Devices Corporation. Processor 1010 may comprise at least one Qualcomm Snapdragon and/or Intel Atom processor. Processor 1010 may comprise at least one application-specific integrated circuit, ASIC. Processor 1010 may comprise at least one field-programmable gate array, FPGA. Processor 1010 may be means for performing method steps in system 1000, such as generating, controlling, receiving, adapting, deriving and determining. Processor 1010 may be configured, at least in part by computer instructions, to perform actions. [0052] A processor may comprise circuitry, or be constituted as circuitry or circuitries, the circuitry or circuitries being configured to perform phases of methods in accordance with embodiments described herein.
As used in this application, the term “circuitry” may refer to one or more or all of the following: (a) hardware-only circuit implementations, such as implementations in only analogue and/or digital circuitry, (b) combinations of hardware circuits and software, such as, as applicable: (i) a combination of analogue and/or digital hardware circuit(s) with software/firmware and (ii) any portions of hardware processor(s) with software (including digital signal processor(s)), software, and memory(ies) that work together to cause a system, such as a server, to perform various functions, and (c) hardware circuit(s) and/or processor(s), such as a microprocessor(s) or a portion of a microprocessor(s), that requires software (e.g., firmware) for operation, but the software may not be present when it is not needed for operation.
[0053] This definition of circuitry applies to all uses of this term in this application, including in any claims. As a further example, as used in this application, the term circuitry also covers an implementation of merely a hardware circuit or processor (or multiple processors) or portion of a hardware circuit or processor and its (or their) accompanying software and/or firmware. The term circuitry also covers, for example and if applicable to the particular claim element, a baseband integrated circuit or processor integrated circuit for a mobile device or a similar integrated circuit in a server, a cellular network device, or other computing or network device.
[0054] The system 1000 may comprise memory 1020. Memory 1020 may comprise random-access memory and/or permanent memory. Memory 1020 may comprise at least one RAM chip. Memory 1020 may comprise solid-state, magnetic, optical and/or holographic memory, for example. Memory 1020 may be at least in part accessible to processor 1010. Memory 1020 may be at least in part comprised in processor 1010. Memory 1020 may be means for storing information. Memory 1020 may comprise computer instructions that processor 1010 is configured to execute. When computer instructions configured to cause processor 1010 to perform certain actions are stored in memory 1020, and system 1000 overall is configured to run under the direction of processor 1010 using computer instructions from memory 1020, processor 1010 and/or its at least one processing core may be considered to be configured to perform said certain actions. Memory 1020 may be at least in part external to system 1000 but accessible to system 1000.
[0055] The system may comprise a machine learning component 1080. The machine learning component may be computer instructions stored on a memory. The memory may comprise random-access memory and/or permanent memory. The memory may comprise at least one RAM chip. The memory may comprise solid-state, magnetic, optical and/or holographic memory, for example. The memory may be at least in part accessible to processor 1010. The memory may be at least in part comprised in processor 1010. The memory may be means for storing information. Although illustrated as separate entities, the machine learning component and the memory 1020 may be a single entity. Although illustrated as included in the system, the machine learning component may alternatively be hosted by another system and connected to the system 1000 over a data communications connection, whereby the machine learning component may be utilized over the data communications connection. In an example, data received over the data communications connection from one or more user devices may be input to the machine learning component and the machine learning component may generate data that is transmitted over the data communications connection to the user devices. It should be noted that the received data may be processed before being fed as input to the machine learning component and/or the data output by the machine learning component may be processed before being transmitted over the data communications connection to the user devices.
[0056] System 1000 may comprise a transmitter 1030. System 1000 may comprise a receiver 1040. Transmitter 1030 and receiver 1040 may be configured to transmit and receive, respectively, information in accordance with at least one cellular or non-cellular standard. Transmitter 1030 may comprise more than one transmitter. Receiver 1040 may comprise more than one receiver. Transmitter 1030 and/or receiver 1040 may be configured to operate in accordance with suitable communication protocols, such as those used in a radio-access and core network of a cellular communication network.
[0057] System 1000 may comprise a user interface, UI, 1060. UI 1060 may comprise at least one of a display, a keyboard, a touchscreen, a vibrator arranged to signal to a user by causing system 1000 to vibrate, a speaker and a microphone. A user may be able to operate system 1000 via UI 1060, for example to take a quiz and/or generate user-specific quizzes. [0058] Processor 1010 may be furnished with a transmitter arranged to output information from processor 1010, via electrical leads internal to system 1000, to other devices comprised in system 1000. Such a transmitter may comprise a serial bus transmitter arranged to, for example, output information via at least one electrical lead to memory 1020 for storage therein. Alternatively to a serial bus, the transmitter may comprise a parallel bus transmitter. Likewise, processor 1010 may comprise a receiver arranged to receive information in processor 1010, via electrical leads internal to system 1000, from other devices comprised in system 1000. Such a receiver may comprise a serial bus receiver arranged to, for example, receive information via at least one electrical lead from receiver 1040 for processing in processor 1010. Alternatively to a serial bus, the receiver may comprise a parallel bus receiver.
[0059] System 1000 may comprise further systems not illustrated in Fig. 10. System 1000 may comprise one or more of video and/or still cameras, and/or microphones, and/or audio recording means, and/or a fingerprint sensor arranged to authenticate, at least in part, a user of system 1000. In some embodiments, system 1000 lacks at least one device described above.
[0060] Processor 1010, memory 1020, transmitter 1030, receiver 1040, machine learning component 1080, and/or UI 1060 may be interconnected by electrical leads internal to system 1000 in a multitude of different ways. For example, each of the aforementioned devices may be separately connected to a master bus internal to system 1000, to allow for the devices to exchange information. However, as the skilled person will appreciate, this is only one example and depending on the embodiment various ways of interconnecting at least two of the aforementioned devices may be selected without departing from the scope of the present invention.
[0061] System 1000 may be connected by transmitter 1030 and receiver 1040 to one or more other systems or parts thereof. Examples of the other systems comprise at least machine learning components and a backend system. Connections of the system 1000 with the one or more other systems or parts thereof may be data communications connections, for example data network connections over Internet Protocol (IP).
[0062] In general, the various embodiments of the invention may be implemented in hardware or special purpose circuits or any combination thereof. While various aspects of the invention may be illustrated and described as block diagrams or using some other pictorial representation, it is well understood that these blocks, apparatus, systems, techniques or methods described herein may be implemented in, as non-limiting examples, hardware, software, firmware, special purpose circuits or logic, general purpose hardware or controller or other computing devices, or some combination thereof. [0063] It is to be understood that the embodiments of the invention disclosed are not limited to the particular structures, process steps, or materials disclosed herein, but are extended to equivalents thereof as would be recognized by those ordinarily skilled in the relevant arts. It should also be understood that terminology employed herein is used for the purpose of describing particular embodiments only and is not intended to be limiting. [0064] Reference throughout this specification to one embodiment or an embodiment means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the present invention. Thus, appearances of the phrases “in one embodiment” or “in an embodiment” in various places throughout this specification are not necessarily all referring to the same embodiment. Where reference is made to a numerical value using a term such as, for example, about or substantially, the exact numerical value is also disclosed.
[0065] As used herein, a plurality of items, structural elements, compositional elements, and/or materials may be presented in a common list for convenience. However, these lists should be construed as though each member of the list is individually identified as a separate and unique member. Thus, no individual member of such list should be construed as a de facto equivalent of any other member of the same list solely based on their presentation in a common group without indications to the contrary. In addition, various embodiments and examples of the present invention may be referred to herein along with alternatives for the various components thereof. It is understood that such embodiments, examples, and alternatives are not to be construed as de facto equivalents of one another, but are to be considered as separate and autonomous representations of the present invention.
[0066] Furthermore, the described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. In the preceding description, numerous specific details are provided, such as examples of lengths, widths, shapes, etc., to provide a thorough understanding of embodiments of the invention. One skilled in the relevant art will recognize, however, that the invention can be practiced without one or more of the specific details, or with other methods, components, materials, etc. In other instances, well-known structures, materials, or operations are not shown or described in detail to avoid obscuring aspects of the invention.
[0067] While the foregoing examples are illustrative of the principles of the present invention in one or more particular applications, it will be apparent to those of ordinary skill in the art that numerous modifications in form, usage and details of implementation can be made without the exercise of inventive faculty, and without departing from the principles and concepts of the invention. Accordingly, it is not intended that the invention be limited, except as by the claims set forth below.
[0068] The verbs “to comprise” and “to include” are used in this document as open limitations that neither exclude nor require the existence of also un-recited features. The features recited in depending claims are mutually freely combinable unless otherwise explicitly stated.

Claims

1. A method comprising:
• generating, by at least one processor connected to a memory storing computer program code and operatively connected to one or more user devices, based on at least one digital learning material, user-specific quizzes comprising questions and associated weighted answers selectable via user interfaces of the one or more user devices;
• controlling, by the at least one processor, each of the one or more user devices to provide, on a user interface of each user device, a user-specific quiz of the generated user-specific quizzes corresponding to a user of the user device;
• receiving, by the at least one processor, from each of the one or more user devices, user-specific performance data of the user of the user device answering at least one question of the user-specific quiz corresponding to the user, wherein the user-specific performance data comprises at least one of: a time duration for the user answering the at least one question, at least one weighted answer selected by the user, and monitoring data of the user answering the at least one question;
• adapting, by the at least one processor, based on the received user-specific performance data, distributions of weighted answers of the user-specific quizzes corresponding to each user;
• controlling, by the at least one processor, each of the one or more user devices to provide, on the user interface of the user device, at least one subsequent question and associated weighted answers of the user-specific quiz corresponding to the user based on the adapted distribution of weighted answers of the user-specific quiz corresponding to the user.
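Outside the claim language, the adaptive flow of claim 1 can be sketched as follows. This is a minimal illustration only; all names (`QUESTION_BANK`, `adapt_distribution`, `run_quiz`) and the concrete adaptation rule are hypothetical assumptions, not part of the claimed system.

```python
# Sketch of the claim 1 loop: present a question with weighted answers,
# collect user-specific performance data (selected weight, answer time),
# and adapt the distribution of weighted answers for the next question.

# Answer weights: 1.0 = correct, lower values = increasingly incorrect.
QUESTION_BANK = [
    {"question": "2 + 2 = ?", "answers": {"4": 1.0, "5": 0.75, "3": 0.5, "22": 0.0}},
    {"question": "3 * 3 = ?", "answers": {"9": 1.0, "6": 0.75, "12": 0.5, "33": 0.0}},
]

def adapt_distribution(answers, last_weight, last_duration_s):
    """Return the weighted answers to present next.

    Illustrative rule: after a fast, correct answer, narrow the
    distribution to near-correct distractors (harder discrimination);
    otherwise keep the full spread.
    """
    if last_weight >= 1.0 and last_duration_s < 10:
        return {a: w for a, w in answers.items() if w >= 0.5}
    return dict(answers)

def run_quiz(select_answer):
    """Drive a user-specific quiz.

    `select_answer(question, answers)` stands in for the user device's
    interface and returns (chosen_answer, duration_in_seconds).
    """
    performance = []
    last = None  # performance data from the previous question
    for item in QUESTION_BANK:
        answers = dict(item["answers"])
        if last is not None:
            answers = adapt_distribution(answers, last["weight"], last["duration_s"])
        choice, duration_s = select_answer(item["question"], answers)
        last = {"weight": answers[choice], "duration_s": duration_s}
        performance.append(last)
    return performance
```

For example, a simulated user who always picks the highest-weighted answer quickly sees the narrowed answer set on the second question, while a slow user keeps the full four-option spread.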
2. The method according to claim 1, comprising:
• generating, by the at least one processor, the user-specific quizzes based on a machine learning component trained based on labeled questions and answers associated with at least one topic of the at least one digital learning material.
3. The method according to claim 2, comprising:
• generating, by the at least one processor, the at least one subsequent question and associated weighted answers of the user-specific quiz corresponding to the user by feeding the user-specific performance data to the machine learning component trained based on labeled questions and weighted answers associated with at least one topic of the at least one digital learning material.
4. The method according to claim 2 or 3, wherein difficulty levels of the questions, distributions of weighted answers and numbers of weighted answers of the user-specific quizzes are determined by the machine learning component.
5. The method according to any of claims 2 to 4, wherein the labeling for each question and associated weighted answers comprises at least part of information indicating performance of a user on answering the question, a grade of the question, a difficulty level of the question and a distribution of weighted answers.
6. The method according to any of claims 1 to 5, comprising:
• deriving, by the at least one processor, the at least one topic, from the at least one digital learning material; and
• generating, by the at least one processor, the questions and associated weighted answers based on the derived at least one topic.
7. The method according to any of claims 2 to 6, comprising:
• determining, by the at least one processor, a skill level of the user based on the user-specific performance data of the user of the user device answering at least one question of the user-specific quiz corresponding to the user;
• determining whether the determined skill level meets a threshold for teaching the user the at least one topic; and o if it does, providing, by the at least one processor, the updated skill level and the at least one topic to a backend system for tracking learning results of the user; o if it does not, determining, by the at least one processor, a further user-specific quiz for the user; and controlling, by the at least one processor, to provide the further user-specific quiz on the user interface of the user device of the user.
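The branch in claim 7 can be sketched outside the claim language as follows. The threshold value, the skill-level formula, and all names (`SKILL_THRESHOLD`, `skill_level`, `next_step`) are illustrative assumptions; the claims do not fix any of them.

```python
SKILL_THRESHOLD = 0.8  # assumed passing level; not specified in the claims

def skill_level(performance):
    """Mean of the selected-answer weights across answered questions.

    A simplified stand-in for the claimed skill-level determination
    from user-specific performance data.
    """
    if not performance:
        return 0.0
    return sum(p["weight"] for p in performance) / len(performance)

def next_step(performance):
    """Claim 7 branch: report the result to a backend system if the
    threshold is met, otherwise issue a further user-specific quiz."""
    level = skill_level(performance)
    if level >= SKILL_THRESHOLD:
        return {"action": "report_to_backend", "skill_level": level}
    return {"action": "further_quiz", "skill_level": level}
```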
8. The method according to any of claims 1 to 7, wherein the weighted answers comprise one or more answers in each of two or more categories for possible answers, wherein the two or more categories comprise a correct answer and at least one of a nearly correct answer, a somewhat incorrect answer, a mostly incorrect answer and an extremely incorrect answer.
9. The method according to claim 8, wherein each of the distributions of weighted answers of the user-specific quizzes corresponding to each user comprises a configuration of the two or more categories and a number of answers in each of the two or more categories.
10. The method according to any of claims 1 to 8, wherein the at least one digital learning material comprises textual digital learning material.
11. A system comprising at least one processor connected to a memory storing computer program code that when executed by the at least one processor is configured to perform a method according to any of the claims 1 to 10.
12. A computer program comprising instructions for causing a method according to any of the claims 1 to 10 to be performed, when executed by at least one processor.
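The answer categories and distribution configuration of claims 8 and 9 can be encoded as sketched below. The category names follow the claim wording; the validation rules and the helper name `make_distribution` are illustrative assumptions only.

```python
# Categories of weighted answers per claims 8-9, from most to least correct.
CATEGORIES = ["correct", "nearly correct", "somewhat incorrect",
              "mostly incorrect", "extremely incorrect"]

def make_distribution(counts):
    """Build a distribution of weighted answers.

    `counts` maps a category name to the number of answers shown in that
    category. Per the claims, a quiz question has a correct answer and at
    least one further category; here we assume exactly one correct answer.
    """
    if counts.get("correct", 0) != 1:
        raise ValueError("exactly one correct answer required")
    if not any(counts.get(c, 0) > 0 for c in CATEGORIES[1:]):
        raise ValueError("at least one incorrect category required")
    # Keep only the categories that actually appear, in canonical order.
    return {c: counts[c] for c in CATEGORIES if counts.get(c, 0) > 0}
```

Adapting the distribution for a stronger user could then mean, for instance, shifting counts from "extremely incorrect" toward "nearly correct" so the distractors become harder to rule out.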
PCT/FI2022/050869 2021-12-23 2022-12-22 User-specific quizzes based on digital learning material WO2023118669A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
FI20216343 2021-12-23
FI20216343 2021-12-23

Publications (1)

Publication Number Publication Date
WO2023118669A1 (en) 2023-06-29

Family

ID=84688530

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/FI2022/050869 WO2023118669A1 (en) 2021-12-23 2022-12-22 User-specific quizzes based on digital learning material

Country Status (1)

Country Link
WO (1) WO2023118669A1 (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080286737A1 (en) * 2003-04-02 2008-11-20 Planetii Usa Inc. Adaptive Engine Logic Used in Training Academic Proficiency
WO2017106832A1 (en) * 2015-12-18 2017-06-22 Swank Eugene David Method and apparatus for adaptive learning
WO2018227251A1 (en) * 2017-06-16 2018-12-20 App Ip Trap Ed Pty Ltd Multiuser knowledge evaluation system or device
US20210217323A1 (en) * 2019-07-03 2021-07-15 Obrizum Group Ltd. Cheating detection in remote assessment environments
US20210383711A1 (en) * 2018-06-07 2021-12-09 Thinkster Learning Inc. Intelligent and Contextual System for Test Management



Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22830899

Country of ref document: EP

Kind code of ref document: A1