WO2018227251A1 - Multiuser knowledge evaluation system or device - Google Patents


Info

Publication number
WO2018227251A1
WO2018227251A1 (PCT/AU2018/050594)
Authority
WO
WIPO (PCT)
Prior art keywords
user
multiuser
questions
interface
knowledge evaluation
Prior art date
Application number
PCT/AU2018/050594
Other languages
French (fr)
Inventor
Steven Sarkis
Original Assignee
App Ip Trap Ed Pty Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from AU2017902292A external-priority patent/AU2017902292A0/en
Application filed by App Ip Trap Ed Pty Ltd filed Critical App Ip Trap Ed Pty Ltd
Publication of WO2018227251A1 publication Critical patent/WO2018227251A1/en


Classifications

    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B7/00 Electrically-operated teaching apparatus or devices working with questions and answers
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00 Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q50/10 Services
    • G06Q50/20 Education

Definitions

  • Structure for enabling students to proceed in lockstep or at their own pace through exercises and quizzes, responding electronically to questions asked, the teacher being able to receive the responses, and to interpret a readout, in histogram or other graphic display form, of student responses.
  • a central computer using an IBM AT™-compatible system is employed, together with a plurality of student computers which range from simple devices to fully fledged personal computers.
  • Optical peripheral hardware, such as VCRs or other recording/reproducing devices, may be used to provide lessons to students in association with the computer network.
  • this system cannot provide an efficient way of pairing up students or of selecting an appropriate set of questions which suits the competency of the students.
  • US Patent Application No. 2011/0257961 discloses a system and method for automatically generating various question types, including automatic selection of multiple choice answers for display.
  • the present invention further relates to a system and method for selecting presentable multiple choice answers based on use of a word in a sentence, quality of a sentence, and frequency of use of the word in other sentences.
  • the present invention further relates to an adaptive learning system which aids a user in word comprehension by asking questions in a series of rounds and then tracking the progress of the user based on the categorization of each question.
  • this system and method fails to take the competency of the students into account.
  • US Patent Application No. 2014/0024009 discloses a method for providing educational content to a computing device associated with a user.
  • the method comprises the steps of receiving a request to provide educational content associated with a topic,
  • the request is a member selected from a group consisting of: (i) a search request from the computing device associated with the user; (ii) a request from a computing device of a parent of the user to provide educational content associated with a topic to the computing device associated with the user, and (iii) a request from a computing device of a teacher of the user to provide educational content associated with a topic to the computing device associated with the user.
  • the method further comprises the step of identifying at least one of: (i) user profile information associated with the user and/or (ii) learning preference information associated with the user.
  • the learning preference information is based at least in part on information collected from a survey completed by the user.
  • the method also comprises the step of identifying educational content responsive to the request from a plurality of content sources, wherein the educational content is identified based on at least one of (i) the identified user profile information and/or (ii) the identified learning preference information; and providing, for display on the computing device associated with the user, the identified educational content.
  • This method is also not capable of providing an efficient way of pairing up students or of selecting an appropriate set of questions which suits the competency of the students.
  • a multiuser knowledge evaluation device comprising: a user login interface adapted to cause a user to provide a user identifier and a password for retrieving a profile which comprises a competency level attribute; a subject selection user interface adapted to cause the user to select one or more subjects; a communication interface for receiving one or more questions from a server, wherein the questions are selected or generated by the server in accordance with the competency level attribute; a question interface for presenting questions to a user and receiving answers for the questions; wherein the answers are sent to the server via the communication interface, such that a processor in the server evaluates a user result and updates the competency level of the user in accordance with the user result and the results of other users.
  • the server is associated with a question database for storing one or more questions and corresponding answers.
  • the question database comprises questions selected in accordance with an Australian State or Territory high school certificate syllabus.
  • the questions are multiple choice format questions.
  • the questions are selected from the database with a random selection algorithm.
  • the random selection algorithm comprises the steps of: generating a set of unique random values using a software pseudorandom number generator or a hardware random number generator, and retrieving the question in the question database corresponding to each value in the random set.
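The selection steps above can be sketched as follows. The dictionary-based question store, the function name, and the optional seed parameter are illustrative assumptions rather than details from the specification:

```python
import random

def select_random_questions(question_db, count, seed=None):
    """Draw a set of unique random question IDs and retrieve the
    corresponding questions. `question_db` maps question IDs to
    question records (a simplified stand-in for the question database)."""
    rng = random.Random(seed)  # software pseudorandom number generator
    # random.sample returns `count` distinct IDs, so no question repeats
    chosen_ids = rng.sample(sorted(question_db), k=count)
    return [question_db[qid] for qid in chosen_ids]
```

A hardware random number generator could be substituted by seeding from `os.urandom` or by replacing `rng` with a device-backed source.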
  • the server is adapted to derive an association relationship of the question database and generate a new question in accordance with the association relationship and a reference source.
  • the reference source comprises one or more of: the questions in the question database and an external data source.
  • the multiuser knowledge evaluation device further comprises an input device for receiving data from a user.
  • the multiuser knowledge evaluation device further comprises a location module for determining current geolocation information, wherein the current geolocation information is forwarded to the server to be taken into account for pairing.
  • a multiuser knowledge evaluation method comprising the steps of: retrieving a profile of a user, wherein the profile comprises a difficulty level attribute; receiving selected subject set data, which comprises one or more selected subjects; selecting or generating a set of questions from a question database, wherein the set of questions is selected or generated in accordance with the difficulty level attribute; receiving answers for the questions; evaluating a user result based on the answers; and updating the difficulty level attribute of the user in accordance with the user results and the results of other users.
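The final step, updating a user's level from the user's result and other users' results, is not given a formula in the specification. One plausible scheme for a head-to-head quiz is an Elo-style rating update; the function below is a sketch under that assumption, with an assumed K-factor of 32:

```python
def update_level(user_level, opponent_level, user_won, k=32.0):
    """Elo-style update: the user's new level depends on the result and
    on the opponent's level (i.e. on other users' results over time).
    The Elo formula and K-factor are illustrative assumptions."""
    expected = 1.0 / (1.0 + 10 ** ((opponent_level - user_level) / 400.0))
    score = 1.0 if user_won else 0.0
    return user_level + k * (score - expected)
```

Beating a higher-rated opponent yields a larger gain than beating an equal one, which matches the goal of tracking competency rather than raw win counts.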
  • Figure 2 shows a multiuser knowledge evaluation system or device in accordance with an embodiment of the present invention
  • Figure 3 shows an exemplary model, view and controller modules for alternatively interactive multiuser knowledge evaluation in accordance with an embodiment
  • Figure 4 shows a supervised machine learning for query generation, answer evaluation, subject categorisation competency determination and query difficulty level determination in accordance with embodiments
  • Figure 7 shows a login interface of the multiuser knowledge evaluation system or device of Figure 2;
  • Figure 8 shows a home interface of the multiuser knowledge evaluation system or device of Figure 2;
  • Figure 9 shows a select game mode interface of the multiuser knowledge evaluation system or device of Figure 2;
  • Figure 10 shows a game set up interface of the multiuser knowledge evaluation system or device of Figure 2;
  • Figure 11 shows a query or question interface of the multiuser knowledge evaluation system or device of Figure 2;
  • Figure 13 shows a second statistics interface of the multiuser knowledge evaluation system or device of Figure 2;
  • Figure 14 shows a first leader board of the multiuser knowledge evaluation system or device of Figure 2;
  • Figure 15 shows a second leader board interface of the multiuser knowledge evaluation system or device of Figure 2;
  • Figure 16 shows a third leader board interface of the multiuser knowledge evaluation system or device of Figure 2;
  • Figure 17 shows a setting interface of the multiuser knowledge evaluation system or device of Figure 2;
  • Figure 18 shows a profile interface of the multiuser knowledge evaluation system or device of Figure 2;
  • Figure 19 shows another login interface of the multiuser knowledge evaluation system or device of Figure 2;
  • Figure 20 shows a dashboard interface of the multiuser knowledge evaluation system or device of Figure 2;
  • Figure 21 shows a subject selection interface of the multiuser knowledge evaluation system or device of Figure 2;
  • Figure 22 shows a student list interface of the multiuser knowledge evaluation system or device of Figure 2;
  • Figure 23 shows a progress by subject interface of the multiuser knowledge evaluation system or device of Figure 2;
  • Figure 24 shows a progress by year interface of the multiuser knowledge evaluation system or device of Figure 2.
  • Figure 1 shows a first preferred embodiment, which includes an electronic device 100 that may take the form of the server 201 or one of the electronic devices 202 substantially as provided in Figure 2, described in further detail below.
  • the electronic device 100 comprises a microprocessor 101 for processing digital data.
  • in operable communication with the processor 101 across a system bus is a memory device 108.
  • the memory device 108 is configured for storing digital data including computer program code instructions.
  • the computer code instructions may be designed with the Model-View-Controller software pattern. Hence, part of the instruction code can be divided into data model 109, controller 110, and view 113 modules.
  • the data model 109 represents the data structure and underlying data 111 stored within the memory device 108. In one embodiment, the data model 109 is the central component of the pattern.
  • the data model 109 expresses the software application's behaviour in terms of the problem domain, independent of the user interface or view 113. It directly manages the data, logic and rules of the software application or method of the electronic device 100.
  • the view 113 defines the output representation of information of an embodiment of the present invention.
  • the view 113 may provide different look and feel for different electronic device 100 and the operating system on the electronic device, but generally provide the same functionalities for the user.
  • the view 113 implements various computer processing functionality, including that which is described herein in further detail below.
  • the view 113 is implemented on a web browser.
  • the view 113 is implemented in an augmented reality environment.
  • the controller 110 interfaces the data model 109 and view 113 by accepting input data and converting the input data into commands for the model 109 or view 113.
  • the model 109 is responsible for managing the data of the application. It receives user input from the controller 110.
  • the view 113 defines presentation of the model 109 in a particular format.
  • the controller 110 is responsible for handling the user input and performing interactions on the data model objects.
  • the view 113 receives the input, validates the input data, and passes the data to the model 109 through function calls in the controller 110.
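The model-view-controller split described above can be sketched minimally as follows. The class names and the quiz-flavoured example are assumptions; for brevity, this sketch routes user input through the controller, whereas the embodiment above has the view validate input before passing it to the model via the controller:

```python
class QAModel:
    """Data model (cf. 109): manages answers and evaluation results."""
    def __init__(self, answers):
        self.answers = answers    # question id -> correct answer
        self.results = []         # evaluation history
    def evaluate(self, qid, answer):
        correct = self.answers.get(qid) == answer
        self.results.append((qid, correct))
        return correct

class QAView:
    """View (cf. 113): output representation only, no business logic."""
    def render(self, correct):
        return "Correct!" if correct else "Try again"

class QAController:
    """Controller (cf. 110): converts raw input into model/view commands."""
    def __init__(self, model, view):
        self.model, self.view = model, view
    def submit(self, qid, raw_answer):
        answer = raw_answer.strip().lower()   # validate/normalise input
        return self.view.render(self.model.evaluate(qid, answer))
```

Because the view only renders, the same model and controller can back a web-browser view or an augmented-reality view, as the embodiments above require.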
  • the electronic device 100 comprises a display 103 in communication with the processor 101 for the presentation of digital data thereon.
  • the display 103 is a touch screen where a touch controller 104 is adapted to receive the input from the touch screen.
  • the display 103 is associated with a haptic user interface that simulates the touch and feel of a virtual object, or to provide forces, vibrations, or motions to the user.
  • the computer 100 further comprises a wireless processor 105 for sending and receiving data across a wireless data network, such as a Wi-Fi™ or Bluetooth™ network or the like.
  • where the electronic device 100 takes the form of a server 201, the electronic device 100 comprises a wired network interface, such as Ethernet.
  • the electronic device 100 further comprises a power controller 106 for managing various aspects of the power consumption of the electronic device 100.
  • the computer 100 may comprise a GPS processor or location module 106 for ascertaining the current geolocation information of the computer 100.
  • the electronic device 100 is adapted to look up the current geolocation information of an Internet Protocol (IP) address to determine the current geolocation information.
  • the electronic device 100 is adapted to obtain an approximate location by communicating with nearby network devices.
  • an interactive multiuser knowledge evaluation system 200 of an embodiment of the present invention comprising a server 201 in operable communication with a plurality of electronic devices 202 across a data network 203.
  • two electronic devices 202, comprising first and second electronic devices, are shown so as to especially illustrate the aspects of the server 201 implementing alternate user interface activation between the electronic devices 202.
  • the server 201 is a standalone server 201 accessible across the Internet, such as a physical rack mounted server or alternatively a virtualised server instance, such as that which may be implemented utilising Amazon Web Services (AWS), GoogleTM Cloud, or the like.
  • the electronic devices 202 may take the form of mobile communication devices, such as smartphones, tablets, laptops, or the like.
  • Figure 3 shows exemplary model-view-controller (MVC) modules 300 illustrating various aspects of the data models 109, controllers 110 and views 113.
  • the data model may comprise Q&A data object 301 which, as will be described in further detail below, is alternately presented via corresponding query interface 321, and answer interface object 322 of the electronic devices 202.
  • the look and feel of the view will differ from operating system to operating system, and from device to device.
  • the electronic device 202 may be an augmented reality (AR) device which provides an AR look and feel for the view.
  • the view 113 is adapted to allow the user to manipulate AR objects for input and output.
  • Metadata 302 comprises the attributes in the Q&A data object 301.
  • Metadata 302, which in embodiments represents query difficulty or other aspects, may be stored in relation to the Q&A data object 301.
  • the Q&A data object 301 may have a subject category data attribute 303. In this manner, questions may be selected in accordance with a particular subject categorisation.
  • the data model 109 may comprise user data object 304 representing various user data of the users of the system 200.
  • the user data object 304 is a superclass for the subclass Student data object, Teacher data object, and Administrator data object.
  • the controller modules 110 comprise a pairing controller 310.
  • the pairing controller 310 is configured for pairing electronic devices 202 and specifically users of the electronic devices 202 in accordance with various parameters.
  • the pairing controller 310 will look into the various attributes of all instances of the Student object to pair up students of similar ability.
  • the attributes taken into consideration are: historical winning percentage, subjects which the student previously chose, the difficulty of the questions which the student successfully answered, etc.
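A simple sketch of ability-based pairing over those attributes is shown below. How the attributes are aggregated into a single ability score, and the adjacent-rank pairing strategy itself, are illustrative assumptions not fixed by the specification:

```python
def pair_students(students):
    """Greedy pairing of students of similar ability. Each student is a
    (name, ability) pair, where ability is assumed to aggregate
    historical winning percentage, previously chosen subjects, and the
    difficulty of questions successfully answered."""
    ranked = sorted(students, key=lambda s: s[1])
    # pair neighbours in the ranking so opponents are closely matched
    return [(ranked[i][0], ranked[i + 1][0])
            for i in range(0, len(ranked) - 1, 2)]
```

In a deployment, the score could additionally be filtered by locality and subject overlap before ranking, as the pairing controller 310 is described as doing.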
  • the controllers 110 may comprise a user interface activation controller 311 which is configured for alternately activating associated user interfaces of the electronic devices 202 in turn or in a mutually exclusive manner. In this configuration, the students may compete in a turn-based competition.
  • the user interface activation controller 311 may have a timer for recording the time each student uses during the competition.
  • alternatively, the user interface activation controller 311 may be configured to activate the associated user interfaces of the electronic devices 202 in real time.
  • the controllers 110 may further comprise a randomiser controller 311 which acts in unison with a randomiser interface 320 utilised for the selection of a random subject categorisation.
  • the randomiser interface 320 may take the form of a spinning wheel graphical representation which may be virtually spun so as to select a pseudorandom subject categorisation.
  • the controllers 110 may further comprise a query controller 312 configured for selecting a random query from the query data 301 in accordance with the pseudorandomly or randomly selected subject categorisation determined by the randomiser controller 311 and interface 320. Such query data may be presented via the query interface 321, allowing the user, utilising the answer interface 322, to input associated answer data.
  • the controllers 110 may further comprise an answer controller 313 configured for analysing/evaluating the input answer data so as to, in general terms, consider whether or not the provided input answer data is correct given the query data selected.
  • the controllers 110 may further comprise a difficulty determinator controller 314 which may analyse the query data 301 so as to determine associated difficulty levels. In embodiments, queries may be selected from the query data 301 in accordance with a difficulty level, in particular a difficulty level determined in accordance with user competence. In embodiments, the controllers 110 may further automate the determination of a user's competence, including by analysing the answer data provided by the user.
  • the controllers 110 may comprise a peer-to-peer communication controller 316 which may be utilised for broadcast or multicast communication from an electronic device 202 to multiple electronic devices via a message board, an electronic forum, newsgroup, Internet Relay Chat, Facebook™, Twitter™, Twitch™, YouTube™, etc.
  • the user may control the electronic device 202 to forward a posed query to a message board, an electronic forum, newsgroup, Internet Relay Chat, Facebook™, Twitter™, Twitch™, YouTube™, etc. for public participation in the answering of the query.
  • the controllers 110 may further comprise a statistics generator controller 317 configured for generating various user statistics including those determined in accordance with the answers provided by the user via the answer interface 322.
  • the statistics generator controller 317 may carry out data analysis using a plurality of statistical functions: hypothesis testing, regression, classification, clustering, association, etc.
  • the controllers 110 may further comprise a resource determinator controller 318 configured for identifying electronic study material resources for provision to users in accordance with a determined user competence or learning weakness, which may be intelligently determined by the server 201 by analysing the answer data provided by users.
  • various aspects of the computer processing of the system 200 may be performed utilising aspects of artificial intelligence.
  • the system 200 may automate the generation of queries wherein the queries are generated utilising supervised and/or unsupervised machine learning.
  • the system 200 may be configured for the automated marking and evaluation of provided answer data, especially wherein such answer data is provided in a free-form format; such an evaluation may utilise supervised and/or unsupervised machine learning to evaluate the semantics of the answer data.
  • FIG. 4 shows an artificial intelligence/supervised machine learning system or method 400 for a preferred embodiment of the present invention.
  • the artificial intelligence/supervised machine learning system or method 400 is adapted to determine the query difficulty, such as the difficulty level of the questions for a student.
  • the functionality of the supervised machine learning system or method 400 as is substantially provided in Figure 4 may be equally applicable mutatis mutandis for the other automation aspects, including the automation of the generation of query data and the evaluation of answer data.
  • the supervised machine learning 400 comprises a machine learning module 403 for training a neural network with the training data set 405.
  • the electronic device 100 will be able to generate a set of optimising parameters 402 for each neuron in the artificial neural network, such as weights, the number of layers, the topology of the neuron connections, etc. With these optimising parameters 402, the electronic device 100 is adapted to configure and optimise a neural network to determine the query difficulty.
  • the training data 405 of an embodiment of the present invention comprises training query data and data indicative of the difficulty thereof.
  • the difficulty data includes one or more of: average user response times, time taken to answer, and answer evaluation data representing whether or not the queries of the query data were answered correctly.
  • the machine learning module 403 is trained using such data 405 to generate the optimising parameters 402, such that the trained artificial neural network 401 may similarly take as input query data 404 and output difficulty ratings 406.
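As a heavily simplified stand-in for the trained network 401, a single logistic neuron can map query features drawn from the difficulty indicators above, such as average response time and error rate, to a difficulty rating in [0, 1]. The model size, feature scaling, training loop, and all names are assumptions; a production system would use a full neural-network library:

```python
import math

def train_difficulty_model(samples, epochs=2000, lr=0.5):
    """Train one logistic neuron on (features, difficulty) pairs, where
    features = (avg response time, error rate), each scaled to [0, 1].
    Returns a rating function analogous to network 401: query features
    (cf. 404) in, difficulty rating (cf. 406) out."""
    w1 = w2 = b = 0.0
    for _ in range(epochs):
        for (x1, x2), y in samples:
            p = 1.0 / (1.0 + math.exp(-(w1 * x1 + w2 * x2 + b)))
            err = p - y            # gradient of the log-loss
            w1 -= lr * err * x1
            w2 -= lr * err * x2
            b -= lr * err
    def rate(x1, x2):
        return 1.0 / (1.0 + math.exp(-(w1 * x1 + w2 * x2 + b)))
    return rate
```

Queries that took users longer and were answered incorrectly more often train toward higher difficulty ratings, which the query controller can then match against user competency.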
  • the difficulty level rating 406 may be utilised for the purposes of determining appropriate queries to pose to users in accordance with user competencies.
  • user competencies may be determined in an automated manner by the system including utilising supervised machine learning system or method 400.
  • the system 200 may determine user competencies by subject categorisation so as to be able to pose queries of matching difficulty accordingly.
  • the system 200 may automate the determination that a user has strong competency for maths, average competency for geography and low competency for history.
  • the query selected by the query controller 312 may be selected in accordance with such competencies wherein, for example, difficult queries are selected for the maths subject categorisation and easy queries are selected for history.
  • Figure 5 shows an exemplary user interaction process (also known as the system process flow chart) 500, primarily illustrating the alternate user interface activation process performed by the server.
  • the user interaction process 500 generally illustrates the functions performed by the electronic devices 202 and the server 201.
  • Figure 5 shows a pair of electronic devices comprising first and second electronic devices 202 as an example. It is envisaged that the server may control many more devices at the same time.
  • the user interaction process 500 initiates with step 501 of pairing wherein the electronic devices 202 are paired.
  • the pairing is performed by the pairing controller 310 so as to select appropriate users for the subsequent user interaction.
  • the pairing controller 310 may analyse user data 304 for pairing, such as by analysing user location (which may be input by the user, such as a school name, or determined by the GPS processor 106 by locality) and user subject data, such as wherein the user has input the user's school subjects.
  • the pairing controller 310 may aim to match users in accordance with locality (i.e. at the same school or learning institution) and generally by course subjects similarity.
  • having paired the electronic devices 202 in this manner, the server 201 then initiates the alternate-turn user interaction between the electronic devices 202 wherein, for each turn, queries on the selected subjects are posed to each user and answer data is received accordingly for evaluation.
  • the user interaction process 500 proceeds to step 502 for obtaining a set of queries with the randomiser interface 320.
  • the randomiser interface 320 is associated with the randomiser controller 311 which randomly or pseudorandomly selects queries within the selected subjects.
  • the randomiser interface 320 may generate a spinning wheel randomiser interface, rolling dice interface, or other kind of randomising interfaces.
  • the user of the first electronic device 202 may be invited to spin the wheel in a virtual manner.
  • the spinning wheel may then land on a particular subject categorisation.
  • the system server 201 will retrieve a query on that particular subject and forward it to the electronic device 202.
  • the spinning wheel representation shown may comprise a plurality of divisions wherein each division is configured in accordance with the user's subjects provided during the registration stage.
  • the first user subject categorisation may comprise subjects comprising maths, history, geography and social sciences and therefore the spinning wheel virtualisation comprises divisions accordingly.
  • the spinning wheel graphical representation may further comprise a wildcard-type division, which may be represented by a crown icon.
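The wheel's behaviour, minus the graphics, reduces to a pseudorandom draw over the user's registered subjects plus the wildcard division. The sketch below assumes equal-width divisions and the name `wildcard`; neither is specified in the embodiment:

```python
import random

def spin_wheel(subjects, rng=None, wildcard="crown"):
    """Pseudorandom stand-in for the spinning wheel randomiser
    interface: divisions are the user's registered subjects plus one
    wildcard (crown) division, drawn with equal probability."""
    rng = rng or random.Random()
    divisions = list(subjects) + [wildcard]
    return rng.choice(divisions)
```

Per-division weights could be added (e.g. a narrower crown division) by switching to `rng.choices` with a `weights` argument.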
  • the outcome of the utilisation of the randomiser controller 311 and the randomiser interface 320 is a randomly or pseudorandomly selected category selection, as in step 504.
  • the query data may be selected from the data defined by the data model 109.
  • the user interaction process 500 proceeds to step 505 for determining the level of the queries.
  • the level determination step 505 takes into account the competency levels of the user, which may be determined by an electronic device 202.
  • the electronic device 202 is adapted to determine that a user has a certain level of competency or difficulty in the selected subject.
  • the selection of the query from the data defined by the data model 109 may comprise a selection of a query of a particular difficulty.
  • in step 507, a set of queries is selected from the database or a set of new queries is generated.
  • the user interaction process 500 may comprise the selection or generation step 507 of associated query data 301.
  • the query controller 312 may be configured for query generation. For example, especially for the mathematics subject categorisation, the query controller 312 may generate random mathematical questions or queries in accordance with mathematical equation templates.
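Template-based generation for the mathematics categorisation can be sketched as below. The specific templates, operand ranges, and question phrasing are illustrative assumptions; the embodiment only requires generation "in accordance with mathematical equation templates":

```python
import random

def generate_math_query(rng=None):
    """Generate a random arithmetic question and its answer from a
    small set of equation templates (assumed: a op b, integer operands)."""
    rng = rng or random.Random()
    a, b = rng.randint(1, 12), rng.randint(1, 12)
    op, fn = rng.choice([("+", lambda x, y: x + y),
                         ("-", lambda x, y: x - y),
                         ("*", lambda x, y: x * y)])
    return f"What is {a} {op} {b}?", fn(a, b)
```

Difficulty could be tied to the operand range or the operator set, allowing generated queries to respect the difficulty level determined in step 505.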
  • the query controller 312 may intelligently generate queries from subject matter data sets. For example, for history, the query controller 312 may retrieve an associated history subject matter resource, such as from an external website or from within the database.
  • the query controller 312 may browse to an external website, such as Wikipedia and select a pseudorandom history topic or a history topic within a particular field, such as relating to World War II.
  • the query controller 312 may select a Wikipedia page relating to Mussolini.
  • the query controller 312 may browse external study materials, such as the contents of Design Patterns: Elements of Reusable Object-Oriented Software by Gamma et al., and select a pseudorandom topic on software design patterns.
  • the query controller 312 may select a page in the book relating to data models.
  • the user may input an associated answer in step 506 utilising the answer interface 322. Answers may be provided in various manners, such as by multiple-choice or free-form text input.
  • the electronic device 202 may request a user to perform a task on the touch controller 104, such as a gesture.
  • the answer interface 322 may present a number of virtual objects on the display for a user to manipulate in a virtual environment or an augmented reality environment. The user may select or pick a virtual object as an answer, or may assemble certain items or objects as an answer.
  • the server 201 carries out the analyses / evaluation step 508 on the answer data so as to determine whether the provided answer data is correct in accordance with the query data posed.
  • the user interaction process 500 of the server 201 may be configured such that, in the event that a user has successfully answered a certain number of queries in a particular subject in succession, the associated category may be allocated 513 to the user, indicating that the user has "won" or "completed" the subject.
  • the server 201 may allocate a "won" or "completed" status to the user on the mathematics subject categorisation to indicate that the user has "won" or "completed" the mathematics subject.
  • the user interaction 500 may halt in the end step 514.
  • one of the halting conditions in step 514 is when a user has "won" or "completed" all subjects allocated to the user.
  • the user interaction process 500 may end in step 514.
  • the interaction process 500 may end in step 514 after a certain number of queries have been posed and one user has successfully answered the greatest number of queries.
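The two halting conditions above can be checked with a small helper like the one below. The parameter names and data shapes (sets of completed subjects, a per-user correct-answer count) are illustrative assumptions:

```python
def check_halt(completed, all_subjects, correct_counts, posed, max_posed):
    """Halting check for step 514. Returns the winning user, or None if
    the game continues. `completed` maps user -> set of subjects that
    user has "won"; `correct_counts` maps user -> correct answers."""
    for user, subjects in completed.items():
        if subjects >= all_subjects:      # user has won every subject
            return user
    if posed >= max_posed:                # query budget exhausted:
        return max(correct_counts, key=correct_counts.get)
    return None
```

The user interface activation controller would call such a check after each evaluated answer, before transferring control to the next device.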
  • a wildcard demarcation on the spinning wheel randomiser interface 320 may be utilised as an alternative to having to answer three queries correctly in succession. As such, if the spinning wheel randomiser interface 320 lands on a crown icon, the user is allowed to attempt to answer a series of queries in relation to a particular subject.
  • step 510 is carried out by the difficulty determinator controller 314.
  • the difficulty determinator controller 314 is adapted to automate the determination of a difficulty level for the posed question, and this determination process may take into consideration the time taken to answer the query, whether or not the answer data was correct, and other factors.
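One hedged illustration of such an automated difficulty determination (the thresholds, level bounds, and function name are assumptions for the sketch, not details from the specification):

```python
def next_difficulty(level, correct, seconds_taken, time_limit=60,
                    min_level=1, max_level=10):
    """Raise the difficulty after a fast correct answer, lower it after
    an incorrect answer; a slow correct answer keeps the level unchanged.
    All thresholds here are illustrative assumptions."""
    if correct and seconds_taken < time_limit / 2:
        level += 1
    elif not correct:
        level -= 1
    # clamp to the assumed valid range of difficulty levels
    return max(min_level, min(max_level, level))
```

A production determinator could weight many more signals (historical accuracy, peer performance), but the clamp-and-step shape above captures the basic feedback loop.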
  • the user interface activation controller 311 may then implement a changeover control in step 509.
  • the user interface availability/activation is transferred to a second electronic device.
  • the user interactive process 500 may proceed to the notify step 511 of sending an electronic notification to the second electronic device 202.
  • the user interaction process then loops back to the randomiser step 502. However, in this round, the randomiser interface step 503 and category selection step 504 are carried out on the second electronic device 202 instead of the first electronic device. Also, the server 201 now expects to receive the answer input in step 506 from the second electronic device rather than the first electronic device.
  • the user interaction process 500 ends when a particular user has successfully obtained all subject categorisations. The user interaction process 500 may also end after a certain number of queries have been posed and the user having answered the greatest number of queries successfully has been determined.
  • the user interaction process 500 is implemented on a turn-by-turn basis wherein the user interface activation controller 311 alternately transfers control of the user interface between the electronic devices 202 depending on the manner of the answers provided by the respective users.
  • the user interface activation controller 311 may be configured to transfer control of the user interface to the second user after a certain number of subjects have been allocated, such as half of the number of subjects. This prevents all of the subject categorisations from being allocated to a user on the first turn.
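The changeover control described above might be sketched as follows (the class, the half-of-subjects transfer rule, and the device labels are illustrative assumptions):

```python
def should_transfer(subjects_allocated, total_subjects):
    """Assumed rule: transfer control once half the subjects are allocated."""
    return subjects_allocated >= total_subjects / 2

class TurnController:
    """Alternately activates the user interface on each paired device."""
    def __init__(self, devices):
        self.devices = devices
        self.active = 0  # index of the device whose interface is active
    def changeover(self):
        """Transfer interface activation to the next device in turn."""
        self.active = (self.active + 1) % len(self.devices)
        return self.devices[self.active]

tc = TurnController(["device1", "device2"])
print(tc.changeover())  # device2
```

The modulo step generalises to the case, mentioned later in the description, where the server connects more than two electronic devices.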
  • a user may seek to challenge a subject allocation of an opposing user.
  • a user may challenge the opposing user in mathematics.
  • the server 201 is configured to present a series of questions to each respective user of each electronic device 202.
  • the user having answered the greatest number of queries correctly is recorded as the winner of the challenge.
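A minimal sketch of this winner determination (the tuple-based result format and function name are assumptions; ties here return all tied users, a detail the specification does not address):

```python
from collections import Counter

def challenge_winner(results):
    """results: list of (user, correct) tuples from a challenge round.
    Returns the user(s) with the most correct answers."""
    tally = Counter(user for user, correct in results if correct)
    if not tally:
        return []
    best = max(tally.values())
    return [user for user, count in tally.items() if count == best]
```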
  • a user may utilise the electronic device 202 to send a query to a publicly accessible message board, electronic discussion forum, Internet Relay Chat channel, Bulletin Board System, Twitter™, Twitch™, YouTube™, etc. utilising the peer-to-peer communication controller 316.
  • the user may control the electronic device 202 to forward the query to an electronic forum.
  • Other participating users, utilising associated electronic devices may receive a notification of the query.
  • Any other users on the electronic forum may help to provide answers in a collaborative manner with another electronic device 202 connected to the server 201.
  • the answer controller 303 may analyse/evaluate the provided answers to validate them.
  • the first user to provide the answer may be allowed to pose a next query.
  • FIG. 6 shows a use case 10 of the multiuser knowledge evaluation system or device of an embodiment of the present invention.
  • the multiuser knowledge evaluation system or device also comprises a special type of user, the administrator.
  • the administrator will be responsible for system administration, such as backing up the system, generating usage reports, populating queries or questions into the system, maintaining and setting up the parameters for the artificial intelligence algorithm for determining the difficulty levels, etc.
  • the data model 109 of the user contains a number of attributes such as user identifier, email, first name, last name, school, etc.
  • the user object may also comprise an avatar representing the user in a virtual reality or augmented reality environment.
  • the user may select or change the attributes and appearance of the avatar.
  • the avatar projects a residual self- image, which is typically an idealized subjective perception of the user's appearance.
  • the data models 109 for the student 11 and teacher 12 are subtypes that inherit from the user.
  • the multiuser knowledge evaluation system or device may implement these data models 109 as instances of objects in the system. These instances can be stored in an object-oriented database or a relational database. In one embodiment, the instances of the objects are manipulated, transferred, or stored in JavaScript Object Notation (JSON) format or Extensible Markup Language (XML) format.
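The inheritance relationship and the JSON transfer format described above can be sketched as follows (the attribute names beyond those listed in the description, and the dataclass-based modelling, are assumptions for illustration):

```python
import json
from dataclasses import dataclass, field, asdict

@dataclass
class User:
    user_id: str
    email: str
    first_name: str
    last_name: str
    school: str

@dataclass
class Student(User):  # Student is a subtype inheriting from User
    grade: int = 0
    completed_subjects: list = field(default_factory=list)

s = Student("u1", "ada@example.com", "Ada", "Lovelace", "Example High",
            grade=10, completed_subjects=["mathematics"])
payload = json.dumps(asdict(s))          # serialise for transfer/storage
restored = Student(**json.loads(payload))  # reconstruct the instance
```

The same round trip works for a Teacher subtype, or for XML with a different serialiser.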
  • the student 11 object will also have the additional attribute of grade indicating the actual grade of the student.
  • the student object 11 may also have a status indicating he or she has "won" or "completed” a subject in the game.
  • the student object 11 may also comprise an attribute indicating the difficulty level which the student has attained.
  • the database system can be part of the server 201.
  • the database system is a standalone system which the server 201 can access over the Internet.
  • the database system can be a standalone server or a distributed database system.
  • the knowledge evaluation system or device is implemented with the system described in Figure 1 to Figure 5.
  • the student will access the knowledge evaluation system or device 200 through the electronic device 202 by connecting to the server 201.
  • the server 201 will provide the login function 22 to the student 11 through the login UX as shown in Figure 7.
  • the student 11 must login to the electronic device 202 before he or she can access the proper functions of the knowledge evaluation system or device 200.
  • a demonstration version with fewer functions may be provided to any user.
  • On the login UX, the student is required to enter the email and the password.
  • the email is one of the user identifiers of the student 11.
  • the server 201 directs the student to a home UX as shown in Figure 8. From the home UX, the student 11 may access the profile administration function 24, the game function 30, and the statistics report function 36. [0131] On the home UX, the student 11 may select a new game icon to access the game function 30. When a student 11 selects the new game icon, the server 201 and electronic device 202 will direct the student to the game selection UX as shown in Figure 9. On the game selection UX, the student 11 may select to play Challenge a Friend mode or Challenge Me mode. In the Challenge a Friend mode, the multiuser knowledge evaluation system or device will execute the user interaction process 500, and the student 11 will pair up with other users to play the game. In the Challenge Me mode, the multiuser knowledge evaluation system will allow the student 11 to play the game in solo mode.
  • the server 201 and the electronic device 202 will direct the student to a game setup UX as shown in Figure 10.
  • the game setup UX allows a student to access the game setup function 32.
  • the student may: select the number of questions or queries to answer, select a time limit per question or query, and select one or more subjects of questions or queries in the game.
  • the server 201 will then carry out step 505 to determine the difficulty level of the questions, and select/generate questions as in step 507.
  • the server 201 will then forward the questions to the electronic device 202 for the play function 34.
  • the electronic device will carry out the play function 34 by displaying the questions to the student 11 and receiving answers from the student via the play UX, query interface, or query UX as shown in Figure 11.
  • the play UX, query interface, or query UX is provided in a virtual reality or augmented reality environment, where the student 11 is represented by a three-dimensional avatar when selecting an object or carrying out a task for generating an answer.
  • the student 11 may access the statistics function 36.
  • the multiuser knowledge evaluation system or device will record all sorts of performance data of the students, such as the number of questions the student chooses, the time taken to answer each question, etc.
  • the multiuser knowledge evaluation system or device 200 will collect the big data and carry out a plurality of statistical analyses using traditional statistical tools as well as artificial intelligence algorithms. Then, the multiuser knowledge evaluation system or device will generate statistics reports as shown in Figure 12 and Figure 13.
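A minimal sketch of the kind of descriptive statistics such a report could aggregate (the record format and the specific metrics are assumptions; a full implementation would add the AI-based analyses the description mentions):

```python
from statistics import mean

def performance_summary(records):
    """records: list of dicts with 'correct' (bool) and 'seconds' (float),
    one per answered question. Returns simple report statistics."""
    return {
        "questions_answered": len(records),
        "accuracy": mean(1 if r["correct"] else 0 for r in records),
        "avg_seconds": mean(r["seconds"] for r in records),
    }

summary = performance_summary([
    {"correct": True, "seconds": 10},
    {"correct": False, "seconds": 30},
])
```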
  • the statistics report may also include a leader board showing the top performers of the game as shown in Figure 14 to Figure 16.
  • the multiuser knowledge evaluation system or device 200 also allows a student to access his or her profile through the profile function 24.
  • the multiuser knowledge evaluation system or device 200 will direct the student to a profile setting UX as shown in Figure 17.
  • the student 11 may select to change the password, edit the profile, or log out from the profile setting UX.
  • Figure 18 shows the edit profile UX, which provides the student with the view profile function 26 and edit profile function 28.
  • the student 11 may view or change: the user name, name, email, country, state, city, school, year, ethnicity, and language background.
  • the teacher or administrator may select to disable or disallow the student 11 from making changes to one or more of these attributes.
  • the teacher 12 will access the knowledge evaluation system or device 200 through the electronic device 202 by connecting to the server 201.
  • the server 201 will provide the login function 22 to the teacher 12 through the login UX as shown in Figure 19.
  • the user knowledge evaluation system or device 200 will direct the teacher to a dashboard UX as shown in Figure 20.
  • the teacher 12 may access the profile function 24, the statistics report function 36, and the reporting function 38.
  • the user knowledge evaluation system or device 200 may also provide the assignment function for the teacher 12 to assign a task or game to a particular student or a group of students.
  • the user knowledge evaluation system or device 200 may also provide the function for the teacher 12 to select a set of questions or queries for a particular student or a group of students to work on.
  • On the dashboard UX, the teacher 12 may access the report function 38, which allows the teacher to choose to view a list of students from the student list function 40 or the progress report function 42.
  • the user knowledge evaluation system or device 200 will direct the teacher to the existing subject UX as shown in Figure 21, where the teacher may select to list the students working on a particular subject. Once the teacher 12 has selected the subject, the user knowledge evaluation system or device 200 will direct the teacher to the student list UX as shown in Figure 22.
  • the teacher 12 may select to view the progress report by year or the progress report by subject. This will invoke the progress report function 42, which will generate the progress report by subject as shown in Figure 23 or the progress report by year as shown in Figure 24.
  • the teacher 12 may access the statistics function 36.
  • the multiuser knowledge evaluation system or device will generate statistics reports as shown in Figure 12 and Figure 13.
  • the statistics report may also include a leader board showing the top performers of the game as shown in Figure 14 to Figure 16.
  • the multiuser knowledge evaluation system or device 200 also allows a teacher 12 to access his or her profile through the profile function 24.
  • the multiuser knowledge evaluation system or device 200 will direct the teacher to a profile setting UX as shown in Figure 17.
  • the teacher 12 may select to change the password, edit the profile, or log out from the profile setting UX.
  • a multiuser knowledge evaluation device 202 comprising a user login interface for implementing the login function 22.
  • the user login interface is adapted to cause a user to provide a user identifier and a password for retrieving a profile which comprises a competency level attribute.
  • the multiuser knowledge evaluation device 202 also has a subject selection user interface adapted to cause the user to select one or more subjects as in step 504 of the user interaction process 500.
  • the multiuser knowledge evaluation device 202 comprises a communication interface 105 for receiving one or more questions from a server 201, wherein the questions are selected or generated by a server in accordance with the competency level attribute as in step 507 of the user interaction process 500.
  • the multiuser knowledge evaluation device 202 comprises a question interface for presenting questions to a user and receiving answers for the questions as in step 506 of the user interaction process 500.
  • the multiuser knowledge evaluation device 202 is adapted such that the answers are sent to the server 201 via the communication interface 105, whereby a processor 101 in the server 201 evaluates a user result and updates the competency level of the user in accordance with the user result and the results of other users, as in step 510 of the user interaction process 500.
  • the server 201 is associated with a question database for storing one or more questions and corresponding answers.
  • the database is a relational database containing questions and answers in different subjects.
  • the database can be held in a standalone server or a distributed database system.
  • the question database comprises questions selected in accordance with an Australian State or Territory high school certificate syllabus, such as the Higher School Certificate (HSC) of New South Wales or the Victorian Certificate of Education (VCE) of Victoria. However, the questions may also be drawn from the General Certificate of Education of the United Kingdom, the Suite of Assessments, etc.
  • the questions are multiple-choice format questions. However, other formats of questions, such as fill-in-the-blank questions, may also be used.
  • the questions are selected from the database with a random selection algorithm by the randomiser controller 311.
  • the random selection algorithm comprises the steps of: generating a random set of unique random values using a software pseudorandom number generator or a hardware number generator, and retrieving a question in the question database corresponding to each value in the random set.
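The two-step selection described above can be sketched as follows (a hedged illustration: the in-memory list standing in for the question database and the function name are assumptions):

```python
import random

def select_questions(question_db, n, rng=random):
    """Generate n unique random indices via the supplied (pseudo)random
    number generator, then retrieve the corresponding questions."""
    indices = rng.sample(range(len(question_db)), n)  # unique values
    return [question_db[i] for i in indices]

question_db = [f"question {i}" for i in range(100)]
chosen = select_questions(question_db, 5)
```

For a cryptographically stronger source, `random.SystemRandom()` (which draws on the operating system's entropy) could be passed as `rng` in place of the default generator.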
  • the software pseudorandom number generator may utilise a cryptographic method to generate a pseudorandom number.
  • the cryptographic methods may include ISAAC (indirection, shift, accumulate, add, and count), RC4 PRGA, xoroshiro128+, etc.
  • the hardware random number generator, typically known as a true random number generator, will rely mainly on functions provided by the processor 101.
  • Intel Core i7™ and AMD Ryzen™ are some processors that support hardware random number generation.
  • the random selection algorithm comprises copying the whole table of the question database into a temporary table, adding a new column with a random value, and finally sorting the data by that column. This method is slow and inefficient compared to the above method.
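The slower table-copying approach can be illustrated with an in-memory SQLite database (the schema, table names, and row contents are hypothetical; the cost comes from copying and sorting the entire table just to draw a few rows):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE questions (id INTEGER, text TEXT)")
conn.executemany("INSERT INTO questions VALUES (?, ?)",
                 [(i, f"question {i}") for i in range(10)])

# Copy the whole table into a temporary table with a random-value column,
# then sort by that column and take the first few rows.
conn.execute("""CREATE TEMP TABLE shuffled AS
                SELECT *, abs(random()) AS r FROM questions""")
rows = conn.execute(
    "SELECT id, text FROM shuffled ORDER BY r LIMIT 3").fetchall()
```

Sampling a handful of indices first, as in the preferred method, avoids both the full copy and the full sort.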
  • the server 201 is adapted to derive an association relationship of the question database and generate a new question in accordance with the association relationship and a reference source.
  • the reference source comprises one or more of: the questions in the question database and external data sources, such as wiki pages, digital textbooks, discussion forums, etc.
  • the association relationship is derived by one or more algorithms such as clustering, decision tree, linear regression, association, genetic algorithm, neural network.
  • the question interface comprises a timer for recording a time for a user to answer each question.
  • the processor in the server 201 derives a new competency level in accordance with the user results and the times taken by the user and other users to answer the questions.
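One hedged way to sketch a competency update relative to other users (the rating-style update rule, the score scale, and the gain factor are assumptions, not details from the specification):

```python
def update_competency(level, score, avg_peer_score, k=0.5):
    """Move a user's competency level toward or away from the peer
    average: a score above the peers' average raises the level, a
    score below it lowers the level. k is an assumed gain factor."""
    return level + k * (score - avg_peer_score)
```

Here `score` could be a time-weighted accuracy in [0, 1], so that faster correct answers produce a larger differential against the peer average.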
  • the multiuser knowledge evaluation device comprises an output device 103 for displaying the user login interface, the subject selection user interface, and the question interface.
  • the output device 103 is a virtual reality or augmented reality device, such as AR glasses, which is adapted to present the questions to a user in a virtual reality or augmented reality environment.
  • the output device is adapted to present the questions in a web-based environment.
  • the multiuser knowledge evaluation device further comprises an input device for receiving data from a user, and the input device can be a touch screen.
  • the server 201 is adapted to select another multiuser knowledge evaluation device for pairing in accordance with the competency level attribute. This may allow the pair to play a competitive game or, alternatively, a turn-based game.
  • the server 201 may enquire about the current geolocation information of the devices and take the current geolocation information into account for pairing.
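A minimal sketch of competency- and location-aware pairing (the candidate record format, the level-gap rule, and the nearest-first preference are assumptions for illustration):

```python
import math

def distance_km(a, b):
    """Great-circle (haversine) distance between two (lat, lon) points."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
    h = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2)
         * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371 * math.asin(math.sqrt(h))

def pick_partner(me, candidates, max_level_gap=1):
    """Choose the nearest candidate whose competency level is close
    to the requesting user's level; None if no candidate qualifies."""
    eligible = [c for c in candidates
                if abs(c["level"] - me["level"]) <= max_level_gap]
    return min(eligible, key=lambda c: distance_km(me["loc"], c["loc"]),
               default=None)
```

The pairing remains loose in the sense described below: the server matches the devices, but each device still communicates only with the server.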
  • the pairing is a loosely associated relationship, as both electronic devices are connected to the server 201 rather than directly connected to each other.
  • the server 201 may connect more than two electronic devices for gaming.
  • the present invention and the described embodiments specifically include the best method known to the applicant of performing the invention.
  • the present invention and the described preferred embodiments specifically include at least one feature that is industrially applicable.

Abstract

The present invention provides a multiuser knowledge evaluation device comprising a user login interface, a subject selection user interface, a communication interface, and a question interface. The user login interface causes a user to provide a user identifier and a password for retrieving a profile which comprises a competency level attribute. The subject selection user interface causes the user to select one or more subjects. The communication interface receives questions from a server, where the questions are selected by the server in accordance with the competency level attribute. The question interface presents questions to the user and receives answers for the questions. A processor in the server evaluates a user result and updates the competency level of the user in accordance with the user result and the results of other users.

Description

MULTIUSER KNOWLEDGE EVALUATION SYSTEM OR DEVICE
[01] The present invention relates to a device and system for multiuser knowledge evaluation, and in particular, a device and system for providing interactive knowledge evaluation in a multiple user environment.
BACKGROUND
[02] Classroom learning and assessment are changing. Many textbooks advocate a varied approach to classroom learning and assessment systems that recognises the need to teach and assess knowledge, skills, and abilities. Electronic response systems are used in different subjects and courses, allowing students to provide immediate feedback to questions and informing teachers of student understanding.
[03] Recent studies show a significant increase in students' conceptual gains in physics when electronic response systems are used to facilitate feedback in a constructivist-oriented learning environment. Students have consistently favoured the use of electronic response systems and attribute factors such as attentiveness and personal understanding to using them. In particular, students are more enthusiastic about participating in a multiuser and competitive environment.
[04] Studies show that the pedagogical practices of the teacher, along with the incorporation of the technology, are important factors in student comprehension. Hence, it is beneficial to provide a multiuser knowledge evaluation system or device to assist students to learn in an enjoyable environment and at the same time assist teachers in understanding the progress of the students. However, none of the previous systems or devices is able to provide an efficient way of pairing up or matching students or selecting an appropriate set of questions which suits the competency of the students. [05] US Patent No. 5,618,182 discloses an interactive electronic classroom system for enabling teachers to teach students concepts and to receive immediate feedback regarding how well the students have learned the concepts. Structure is provided for enabling students to proceed in lockstep or at their own pace through exercises and quizzes, responding electronically to questions asked, the teacher being able to receive the responses and to interpret a readout, in histogram or other graphic display form, of student responses. In a preferred embodiment, a central computer using an IBM AT™ compatible system is employed, together with a plurality of student computers which range from simple devices to fully fledged personal computers. Optional peripheral hardware, such as VCRs or other recording/reproducing devices, may be used to provide lessons to students in association with the computer network. However, this system cannot provide an efficient way of pairing up students or selecting an appropriate set of questions which suits the competency of the students.
[06] US Patent Application No. 2011/0257961 discloses a system and method for automatically generating various question types, including automatic selection of multiple choice answers for display. The present invention further relates to a system and method for selecting presentable multiple choice answers based on use of a word in a sentence, quality of a sentence, and frequency of use of the word in other sentences. The present invention further relates to an adaptive learning system which aids a user in word comprehension by asking questions in a series of rounds and then tracking the progress of the user based on the categorization of each question. However, this system and method fails to take the competency of the students into account.
[07] US Patent Application No. 2014/0024009 discloses a method for providing educational content to a computing device associated with a user. The method comprises the steps of receiving a request to provide educational content associated with a topic. The request is a member selected from a group consisting of: (i) a search request from the computing device associated with the user; (ii) a request from a computing device of a parent of the user to provide educational content associated with a topic to the computing device associated with the user; and (iii) a request from a computing device of a teacher of the user to provide educational content associated with a topic to the computing device associated with the user. The method further comprises the step of identifying at least one of: (i) user profile information associated with the user and/or (ii) learning preference information associated with the user. The learning preference information is based at least in part on information collected from a survey completed by the user. The method also comprises the step of identifying educational content responsive to the request from a plurality of content sources, wherein the educational content is identified based on at least one of (i) the identified user profile information and/or (ii) the identified learning preference information; and providing, for display on the computing device associated with the user, the identified educational content. This method is also not capable of providing an efficient way of pairing up students or selecting an appropriate set of questions which suits the competency of the students.
[08] Therefore, there currently lacks a method or device that can efficiently pair up or match students and select an appropriate set of questions which suits the competency of the students.
SUMMARY
[09] It is an object of the present invention to provide a knowledge evaluation system and device for multiple users.
[010] It is another object of the present invention to provide an improved multiuser knowledge evaluation system and device that takes into account the competency of the students in selecting or generating questions for the students to practice.
[011] It is, therefore, an object of the present invention to provide a new and novel multiuser knowledge evaluation system or device. [012] Other objectives and advantages will become apparent when taken into consideration with the following specification and drawings.
[013] It is also an object of the present invention to overcome or ameliorate at least one of the disadvantages of the prior art, or to provide a useful alternative.
[014] It is, therefore, an object of the present invention to provide a multiuser knowledge evaluation device comprising: a user login interface adapted to cause a user to provide a user identifier and a password for retrieving a profile which comprises a competency level attribute; a subject selection user interface adapted to cause the user to select one or more subjects; a communication interface for receiving one or more questions from a server, wherein the questions are selected or generated by the server in accordance with the competency level attribute; and a question interface for presenting questions to a user and receiving answers for the questions; wherein the answers are sent to the server via the communication interface, such that a processor in the server evaluates a user result and updates the competency level of the user in accordance with the user result and the results of other users.
[015] Preferably, the server is associated with a question database for storing one or more questions and corresponding answers.
[016] Preferably, the question database comprises questions selected in accordance with an Australian State or Territory high school certificate syllabus. [017] Preferably, the questions are multiple choice format questions.
[018] Preferably, the questions are selected from the database with a random selection algorithm.
[019] Preferably, the random selection algorithm comprises the steps of: generating a random set of unique random values using a software pseudorandom number generator or a hardware number generator, and retrieving a question in the question database corresponding to each value in the random set.
[020] Preferably, the server is adapted to derive an association relationship of the question database and generate a new question in accordance with the association relationship and a reference source.
[021] Preferably, the reference source comprises one or more of: the questions in the question database and an external data source.
[022] Preferably, the association relationship is derived by one or more algorithms of: clustering, decision tree, linear regression, association, genetic algorithm, neural network.
[023] Preferably, the question interface comprises a timer for recording a time for a user to answer each question.
[024] Preferably, the processor in the server derives a new competency level in accordance with the user results and the times taken by the user and other users to answer the questions. [025] Preferably, the multiuser knowledge evaluation device further comprises an output device for displaying the user login interface, the subject selection user interface, and the question interface.
[026] Preferably, the output device is adapted to present the questions to a user in a virtual reality or augmented reality environment.
[027] Preferably, the output device is adapted to present the questions in a web-based environment.
[028] Preferably, the multiuser knowledge evaluation device further comprises an input device for receiving data from a user.
[029] Preferably, the input device is a touch screen.
[030] Preferably, the server is adapted to select another multiuser knowledge evaluation device for pairing in accordance with the competency level attribute.
[031] Preferably, the multiuser knowledge evaluation device further comprises a location module for determining current geolocation information, wherein the current geolocation information is forwarded to the server to be taken into account for pairing.
[032] In a second aspect of an embodiment of the present invention, there is provided a multiuser knowledge evaluation method comprising the steps of: retrieving a profile of a user, wherein the profile comprises a difficulty level attribute; receiving selected subject set data, which comprises one or more selected subjects; selecting or generating a set of questions from a question database, wherein the set of questions is selected or generated in accordance with the difficulty level attribute; receiving answers for the questions; evaluating a user result based on the answers; and updating the difficulty level attribute of the user in accordance with the user result and the results of other users.
[033] In a third aspect of the present invention, there is provided an interactive multiuser knowledge evaluation system comprising a server; two electronic devices operable to communicate with the server across a data network; a user interface activation controller configured for alternately activating a respective user interface on each of the electronic devices in turn; a randomiser controller controlling a randomiser interface for random subject categorisation selection; a query controller configured for selecting or generating query data according to the subject categorisation selection and presentation of the query data via a query interface; and an answer controller configured for receiving answer data from an answer interface and evaluating the answer data with reference to the query data, and wherein the user interface activation controller is configured to transfer activation of the user interface alternately in turn between the two electronic devices in accordance with the evaluation.
BRIEF DESCRIPTION OF THE FIGURES
[034] Features and advantages of the present invention will become apparent from the following description of embodiments thereof, by way of example only, with reference to the accompanying drawings, in which:
[035] Figure 1 shows an electronic device in accordance with an embodiment;
[036] Figure 2 shows a multiuser knowledge evaluation system or device in accordance with an embodiment of the present invention;
[037] Figure 3 shows exemplary model, view and controller modules for alternately interactive multiuser knowledge evaluation in accordance with an embodiment;
[038] Figure 4 shows a supervised machine learning for query generation, answer evaluation, subject categorisation competency determination and query difficulty level determination in accordance with embodiments;
[039] Figure 5 shows an exemplary user interaction with the system illustrating the alternate activation of respective user interfaces of electronic devices in accordance with an embodiment;
[040] Figure 6 shows a use case diagram of a multiuser knowledge evaluation system or device of Figure 2;
[041] Figure 7 shows a login interface of the multiuser knowledge evaluation system or device of Figure 2;
[042] Figure 8 shows a home interface of the multiuser knowledge evaluation system or device of Figure 2;

[043] Figure 9 shows a select game mode interface of the multiuser knowledge evaluation system or device of Figure 2;
[044] Figure 10 shows a game set up interface of the multiuser knowledge evaluation system or device of Figure 2;
[045] Figure 11 shows a query or question interface of the multiuser knowledge evaluation system or device of Figure 2;
[046] Figure 12 shows a first statistics interface of the multiuser knowledge evaluation system or device of Figure 2;

[047] Figure 13 shows a second statistics interface of the multiuser knowledge evaluation system or device of Figure 2;

[048] Figure 14 shows a first leader board interface of the multiuser knowledge evaluation system or device of Figure 2;

[049] Figure 15 shows a second leader board interface of the multiuser knowledge evaluation system or device of Figure 2;

[050] Figure 16 shows a third leader board interface of the multiuser knowledge evaluation system or device of Figure 2;
[051] Figure 17 shows a setting interface of the multiuser knowledge evaluation system or device of Figure 2;
[052] Figure 18 shows a profile interface of the multiuser knowledge evaluation system or device of Figure 2;
[053] Figure 19 shows another login interface of the multiuser knowledge evaluation system or device of Figure 2;

[054] Figure 20 shows a dashboard interface of the multiuser knowledge evaluation system or device of Figure 2;
[055] Figure 21 shows a subject selection interface of the multiuser knowledge evaluation system or device of Figure 2;
[056] Figure 22 shows a student list interface of the multiuser knowledge evaluation system or device of Figure 2;
[057] Figure 23 shows a progress by subject interface of the multiuser knowledge evaluation system or device of Figure 2; and
[058] Figure 24 shows a progress by year interface of the multiuser knowledge evaluation system or device of Figure 2.
DESCRIPTION OF THE INVENTION
[059] Referring to Figure 1, there is shown a first preferred embodiment which includes an electronic device 100. The electronic device 100 may take the form of the server 201 or one of the electronic devices 202 substantially as provided in Figure 2, described in further detail below.
[060] The electronic device 100 comprises a microprocessor 101 for processing digital data. In operable communication with the processor 101 across a system bus is a memory device 108. The memory device 108 is configured for storing digital data including computer program code instructions.
[061] The computer code instructions may be designed with the Model-View-Controller software pattern. Hence, the instruction code can be divided into data model 109, controller 110, and view 113 modules.

[062] In general terms, the data model 109 represents the data structure and underlying data 111 stored within the memory device 108. In one embodiment, the data model 109 is the central component of the pattern. The data model 109 expresses the software application's behaviour in terms of the problem domain, independent of the user interface or view 113. It directly manages the data, logic and rules of the software application or method of the electronic device 100.
[063] The view 113 defines the output representation of information of an embodiment of the present invention. The view 113 may provide a different look and feel for different electronic devices 100 and the operating systems thereon, but generally provides the same functionality for the user. In one embodiment, the view 113 implements various computer processing functionality, including that which is described herein in further detail below. In one embodiment of the present invention, the view 113 is implemented on a web browser. In another embodiment, the view 113 is implemented in an augmented reality environment.
[064] The controller 110 interfaces the data model 109 and view 113 by accepting input data and converting the input data into commands for the model 109 or view 113.
[065] The model 109 is responsible for managing the data of the application. It receives user input from the controller 110. The view 113 defines the presentation of the model 109 in a particular format. The controller 110 is responsible for handling the user input and performing interactions on the data model objects. In a preferred embodiment, the view 113 receives the input, validates the input data, and passes the data to the model 109 through function calls in the controller 110.
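The MVC division described above can be sketched minimally as follows; the class and method names are hypothetical illustrations and are not taken from the specification:

```python
# Minimal MVC sketch: the model holds the data and rules (data model 109),
# the view formats output (view 113), and the controller converts input
# into model operations (controller 110).
class QAModel:
    def __init__(self):
        self.questions = {}                 # question id -> (text, answer)

    def add_question(self, qid, text, answer):
        self.questions[qid] = (text, answer)

    def check_answer(self, qid, given):
        return self.questions[qid][1] == given

class QAView:
    @staticmethod
    def render(text):
        return f"Q: {text}"                 # presentation only, no logic

class QAController:
    def __init__(self, model, view):
        self.model, self.view = model, view

    def ask(self, qid):
        return self.view.render(self.model.questions[qid][0])

    def answer(self, qid, given):
        return self.model.check_answer(qid, given)

model = QAModel()
model.add_question(1, "2 + 2 = ?", "4")
controller = QAController(model, QAView())
print(controller.ask(1))          # Q: 2 + 2 = ?
print(controller.answer(1, "4"))  # True
```

Note that the view here never touches the stored data directly; all access flows through the controller, matching the division of responsibilities described above.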
[066] The electronic device 100 comprises a display 103 in communication with the processor 101 for the presentation of digital data thereon. In embodiments, the display 103 is a touch screen wherein a touch controller 104 is adapted to receive the input from the touch screen. In another embodiment, the display 103 is associated with a haptic user interface that simulates the touch and feel of a virtual object, or provides forces, vibrations, or motions to the user.
[067] The computer 100 further comprises a wireless processor 105 for sending and receiving data across a wireless data network, such as a Wi-Fi™ or Bluetooth™ network or the like. In other embodiments, where the electronic device 100 takes the form of the server 201, the electronic device 100 comprises a wired network interface, such as Ethernet.
[068] The electronic device 100 further comprises a power controller 106 for managing various aspects of the power consumption of the electronic device 100. Furthermore, in embodiments, and wherein the computer 100 takes the form of an electronic device 202 as is substantially shown in Figure 2, the computer 100 may comprise a GPS processor or location module 106 for ascertaining the current geolocation information of the computer 100. In another embodiment, the electronic device 100 is adapted to look up the current geolocation information of an Internet Protocol (IP) address to determine the current geolocation information. In yet another embodiment of the present invention, the electronic device 100 is adapted to obtain an approximate location by communicating with nearby network devices.
[069] Referring to Figure 2, there is provided an interactive multiuser knowledge evaluation system 200 of an embodiment of the present invention comprising a server 201 in operable communication with a plurality of electronic devices 202 across a data network 203. Primarily for illustrative convenience, there will be described herein the interaction of two electronic devices 202 comprising first and second electronic devices 100 so as to especially illustrate the aspects of the server 201 implementing alternate user interface activation between the electronic devices 202.
[070] In embodiments, the server 201 is a standalone server 201 accessible across the Internet, such as a physical rack mounted server or alternatively a virtualised server instance, such as that which may be implemented utilising Amazon Web Services (AWS), Google™ Cloud, or the like. Furthermore, in an embodiment, the electronic devices 202 may take the form of mobile communication devices, such as smart phones, tablets, laptops, or the like.
[071] Reference is now made to Figure 3. There is shown exemplary model view controller (MVC) modules 300 illustrating various aspects of the data models 109, controllers 110 and views 113.
[072] Considering initially the data model 109, as is represented, the data model may comprise a Q&A data object 301 which, as will be described in further detail below, is alternately presented via a corresponding query interface 321 and answer interface 322 of the electronic devices 202. Depending on the actual electronic devices 202, the look and feel of the view will differ from operating system to operating system, and from device to device. In one embodiment, the electronic device 202 is an augmented reality (AR) device which provides an AR look and feel for the view. The view 113 is adapted to allow the user to manipulate AR objects for input and output.
[073] In one embodiment, metadata 302 comprises the attributes of the Q&A data object 301. In one example, an instance of the metadata 302, which in embodiments represents query difficulty or other aspects, may be stored in relation to the Q&A data object 301.
[074] Furthermore, the Q&A data object 301 may have a subject category data 303 attribute. In this manner, questions may be selected in accordance with the particular subject categorisation.
[075] Furthermore, the data model 109 may comprise a user data object 304 representing various user data of the users of the system 200. The user data object 304 is a superclass for the Student data object, Teacher data object, and Administrator data object subclasses.

[076] Reference is now made to the controller modules 110 in Figure 3. The controller modules 110 comprise a pairing controller 310. In general terms, the pairing controller 310 is configured for pairing electronic devices 202, and specifically users of the electronic devices 202, in accordance with various parameters. For example, the pairing controller 310 will look into the various attributes of all instances of the Student object to pair up students of similar ability. The attributes taken into consideration include: historical winning percentage, subjects which the student has previously chosen, the difficulty of the questions which the student has successfully answered, etc.
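A hypothetical sketch of such ability-based pairing, with assumed attribute names, might greedily match each student to the candidate at the smallest attribute distance (smaller distance meaning more similar ability):

```python
# Sketch: pair each student with the remaining candidate whose attributes
# are closest. The attribute names (win_pct, avg_difficulty) are assumptions
# illustrating the kinds of attributes listed above.
def distance(a, b):
    return abs(a["win_pct"] - b["win_pct"]) + abs(a["avg_difficulty"] - b["avg_difficulty"])

def pair_students(students):
    unpaired = list(students)
    pairs = []
    while len(unpaired) > 1:
        first = unpaired.pop(0)
        best = min(unpaired, key=lambda s: distance(first, s))
        unpaired.remove(best)
        pairs.append((first["name"], best["name"]))
    return pairs

students = [
    {"name": "Amy", "win_pct": 0.70, "avg_difficulty": 3.0},
    {"name": "Ben", "win_pct": 0.30, "avg_difficulty": 1.5},
    {"name": "Cat", "win_pct": 0.68, "avg_difficulty": 2.8},
    {"name": "Dan", "win_pct": 0.35, "avg_difficulty": 1.2},
]
print(pair_students(students))  # [('Amy', 'Cat'), ('Ben', 'Dan')]
```

A production pairing controller would likely also weight locality and shared subjects, as the process description below notes, but the distance-based matching idea is the same.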
[077] The controllers 110 may comprise a user interface activation controller 311 which is configured for alternately activating associated user interfaces of the electronic devices 202 in turn, or in a mutual exclusion manner. In this configuration, the students may compete in a turn-based competition. The user interface activation controller 311 may have a timer for recording the time each student uses during the competition. In another embodiment, the user interface activation controller 311 is configured for activating the associated user interfaces of the electronic devices 202 in real time.
[078] The controllers 110 may further comprise a randomiser controller 311 which acts in unison with a randomiser interface 320 which is utilised for the selection of a random subject categorisation. In embodiments, the randomiser interface 320 may take the form of a spinning wheel graphical representation which may be virtually spun so as to select a pseudorandom subject categorisation.
[079] The controllers 110 may further comprise a query controller 312 configured for selecting a random query from the query data 301 in accordance with the pseudorandomly or randomly selected subject categorisation determined by the randomiser controller 311 and interface 320. Such query data may be presented via the query interface 321, allowing the user, utilising the answer interface 322, to input associated answer data. In this regard, the controllers 110 may further comprise an answer controller 313 configured for analysing/evaluating the input answer data so as to, in general terms, consider whether or not the provided input answer data is correct given the query data selected.
[080] The controllers 110 may further comprise a difficulty determinator controller 314 which may analyse the query data 301 so as to determine associated difficulty levels. In embodiments, queries may be selected from the query data 301 in accordance with a difficulty level, and in particular, a difficulty level determined in accordance with user competence. In embodiments, the controllers 110 may further automate the determination of a user's competence, including by analysing the answer data provided by the user.
[081] Furthermore, the controllers 110 may comprise a peer-to-peer communication controller 316 which may be utilised for broadcast or multicast communication from an electronic device 202 to multiple electronic devices via a message board, an electronic forum, a newsgroup, Internet Relay Chat, Facebook™, Twitter™, Twitch™, YouTube™, etc. In embodiments, the user may control the electronic device 202 to forward a posed query to any such channel for public participation in the answering of the query.
[082] The controllers 110 may further comprise a statistics generator controller 317 configured for generating various user statistics, including those determined in accordance with the answers provided by the user via the answer interface 322. The statistics generator controller 317 may carry out data analysis using a plurality of statistical functions, hypothesis testing, regression, classification, clustering, association, etc.
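As one illustrative example of such a statistic (the record structure is an assumption, not drawn from the specification), a per-subject correct-answer rate could be computed directly from recorded answer evaluations:

```python
# Sketch: compute per-subject correct-answer rates from answer records,
# the simplest of the statistics a statistics generator controller might
# produce. Each record is an assumed (subject, was_correct) tuple.
from collections import defaultdict

def subject_stats(answers):
    totals, correct = defaultdict(int), defaultdict(int)
    for subject, ok in answers:
        totals[subject] += 1
        if ok:
            correct[subject] += 1
    return {s: correct[s] / totals[s] for s in totals}

answers = [("maths", True), ("maths", True), ("maths", False),
           ("history", False), ("history", True)]
print(subject_stats(answers))
```

The same aggregation pattern extends naturally to response times or difficulty levels by accumulating those fields instead of a boolean.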
[083] In embodiments, the controllers 110 may further comprise a resource determinator controller 318 configured for identifying electronic study material resources for provision to users in accordance with a determined user competence or learning weakness which may be intelligently determined by the server 201 by analysing the answer data provided by users.
[084] In embodiments, various aspects of the computer processing of the system 200 may be performed utilising aspects of artificial intelligence. For example, in embodiments, the system 200 may automate the generation of queries wherein the queries are generated utilising supervised and/or unsupervised machine learning. Similarly, the system 200 may be configured for the automated marking and evaluation of provided answer data, especially wherein such answer data is provided in a free-form format, wherein such an evaluation may utilise supervised and/or unsupervised machine learning for evaluating the semantics of the answer data.
[085] Figure 4 shows an artificial intelligence/supervised machine learning system or method 400 of a preferred embodiment of the present invention. The artificial intelligence/supervised machine learning system or method 400 is adapted to determine the query difficulty, such as the difficulty level of the questions for a student. However, it should be appreciated that the functionality of the supervised machine learning system or method 400 substantially as provided in Figure 4 may be equally applicable mutatis mutandis for the other automation aspects, including the automation of the generation of query data and the evaluation of answer data.
[086] As shown in Figure 4, the supervised machine learning 400 comprises a machine learning module 403 for training a neural network with the training data set 405. As a result of the training, the electronic device 100 will be able to generate a set of optimising parameters 402 for each neuron in the artificial neural network, such as weights, number of layers, topology of the neuron connections, etc. With these optimising parameters 402, the electronic device 100 is adapted to configure and optimise a neural network to determine the query difficulty.

[087] The training data 405 of an embodiment of the present invention comprises training query data and data indicative of the difficulty thereof. The difficulty data includes one or more of: average user response times, time taken for answering, and answer evaluation data representing whether or not the queries of the query data were answered correctly. As such, the machine learning module 403 is trained using such data 405 to generate the optimising parameters 402 such that the trained artificial neural network 401 may similarly take as input query data 404 and output difficulty ratings 406.
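The training loop can be sketched with a single linear "neuron" fitted by gradient descent; a real embodiment would use a full neural network, and the feature choices below (average response time, failure rate) are illustrative assumptions standing in for the difficulty data described above:

```python
# Sketch: train a single linear unit by gradient descent to map assumed
# query features (average response time, failure rate) to a difficulty
# score. The learned (w, b) play the role of the optimising parameters 402.
def train(data, lr=0.05, epochs=2000):
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for features, target in data:
            pred = sum(wi * xi for wi, xi in zip(w, features)) + b
            err = pred - target
            w = [wi - lr * err * xi for wi, xi in zip(w, features)]
            b -= lr * err
    return w, b

# training set 405: ((avg response time in minutes, failure rate), difficulty)
training = [((0.5, 0.1), 1.0), ((1.0, 0.3), 2.0),
            ((2.0, 0.6), 4.0), ((3.0, 0.9), 5.0)]
w, b = train(training)
predict = lambda f: sum(wi * xi for wi, xi in zip(w, f)) + b
print(round(predict((1.5, 0.4)), 2))  # roughly mid-range difficulty
```

Queries that take longer and are failed more often come out with higher predicted difficulty, which is the relationship the trained network 401 is intended to capture.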
[088] In one embodiment, the difficulty level rating 406 may be utilised for the purposes of determining appropriate queries to pose to users in accordance with user competencies. Such user competencies may be determined in an automated manner by the system including utilising supervised machine learning system or method 400.
[089] For example, the system 200 may determine user competencies by subject categorisation so as to be able to pose queries of matching difficulty accordingly. In another example, for a user having subject categorisations comprising maths, geography and history, the system 200 may automate the determination that the user has strong competency for maths, average competency for geography and low competency for history. As such, the query selected by the query controller 312 may be selected in accordance with such competencies wherein, for example, difficult queries are selected for the maths subject categorisation and easy queries are selected for history.
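A minimal sketch of the mapping from competency to target difficulty might be as follows; the competency scale and the five difficulty bands are assumptions for illustration:

```python
# Sketch: map an assumed per-subject competency score in [0, 1] to a
# target query difficulty level from 1 (easy) to 5 (hard), so strong
# subjects receive harder queries and weak subjects easier ones.
def target_difficulty(competency):
    return max(1, min(5, 1 + round(competency * 4)))

competencies = {"maths": 0.9, "geography": 0.5, "history": 0.2}
targets = {s: target_difficulty(c) for s, c in competencies.items()}
print(targets)  # {'maths': 5, 'geography': 3, 'history': 2}
```

This reproduces the example above: difficult queries for the strong maths categorisation, easy queries for the weak history categorisation.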
[090] Reference is now made to Figure 5. There is shown an exemplary user interaction (also known as the system process flow chart) 500 primarily illustrating the alternate user interface activation process performed by the server.
[091] In one embodiment, the user interaction process 500 generally illustrates the functions performed by the electronic devices 202 and the server 201. Figure 5 shows a pair of electronic devices comprising first and second electronic devices 202 as an example. It is envisaged that the server may control many more devices at the same time.
[092] The user interaction process 500 initiates with step 501 of pairing wherein the electronic devices 202 are paired. The pairing is performed by the pairing controller 310 so as to select appropriate users for the subsequent user interaction. In one embodiment, the pairing controller 310 may analyse user data 304 for pairing, such as by analysing user location (which may be input by the user, such as a school name or the like, or determined by the GPS processor 106 by locality) and user subject data, such as wherein the user has input the user's school subjects. For example, the pairing controller 310 may aim to match users in accordance with locality (i.e. at the same school or learning institution) and generally by course subject similarity.
[093] Having paired the electronic devices 202 in this manner, the server 201 then initiates the alternate turn user interaction between the electronic devices 202 wherein, for each turn, queries on the selected subjects are posed to each user and answer data are received accordingly for evaluation.
[094] As is shown in Figure 5, the user interaction process 500 proceeds to step 502 for obtaining a set of queries with the randomiser interface 320. The randomiser interface 320 is associated with the randomiser controller 311 which randomly or pseudorandomly selects queries within the selected subjects.
[095] In a preferred embodiment, the randomiser interface 320 may generate a spinning wheel randomiser interface, a rolling dice interface, or another kind of randomising interface. As such, for a first turn, the user of the first electronic device 202 may be invited to spin the wheel in a virtual manner. The spinning wheel may then land on a particular subject categorisation. The server 201 will retrieve a query on that particular subject and forward it to the electronic device 202. In this regard, the spinning wheel representation shown may comprise a plurality of divisions wherein each division is configured in accordance with the user's subjects provided during the registration stage. For example, the first user's subject categorisation may comprise subjects comprising maths, history, geography and social sciences, and therefore the spinning wheel virtualisation comprises divisions accordingly.
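The spinning wheel selection can be sketched as follows, with the subject names taken from the example above and a seeded pseudorandom generator standing in for the virtual spin:

```python
# Sketch: a spinning-wheel randomiser whose equal divisions are the user's
# registered subjects; the virtual spin is a pseudorandom angle mapped to
# the division it lands on.
import random

def spin_wheel(subjects, rng=random):
    angle = rng.random() * 360          # spin result in [0, 360)
    division_size = 360 / len(subjects)
    return subjects[int(angle // division_size)]

subjects = ["maths", "history", "geography", "social sciences"]
rng = random.Random(42)                 # seeded here only for repeatability
print(spin_wheel(subjects, rng))        # lands on one of the four subjects
```

Unequal divisions (for example, a narrower wildcard segment as described below) could be modelled by weighting the division sizes rather than splitting the wheel equally.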
[096] In one embodiment, the spinning wheel graphical representation may further comprise a wildcard-type division, which may be represented by a crown icon with associated processing. As such, the outcome of the utilisation of the randomiser controller 311 and the randomiser interface 320 is a randomly or pseudorandomly selected category selection as in step 504.
[097] The query data may be selected from the data defined by the data model 109.
[098] After a set of categories is selected in step 504, the user interaction process 500 proceeds to step 505 for determining the level of the queries. The level determination step 505 takes into account the competency levels of the user, which may be determined by an electronic device 202. The electronic device 202 is adapted to determine that a user has a certain level of competency or difficulty in the selected subject. The selection of the query from the data defined by the data model 109 may comprise a selection of a query of a given difficulty.
[099] Once the subjects and the difficulties are determined in step 504 and step 505, the user interaction process 500 proceeds to step 507 for selecting a set of queries from the database or generating a set of new queries. As such, following the category selection step 504 of, for example, mathematics, the user interaction process 500 may comprise the selection or generation step 507 of associated query data 301.
[0100] In one embodiment, as opposed to selecting query data from the database, the query controller 312 may be configured for query generation. For example, especially for the mathematics subject categorisation, the query controller 312 may generate random mathematical questions or queries in accordance with mathematical equation templates.
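Such template-based generation can be sketched as follows; the template format and operand ranges are illustrative assumptions:

```python
# Sketch: generate random arithmetic queries from a simple equation
# template, as a query controller might for the mathematics subject.
import random

def generate_maths_query(rng=random):
    a, b = rng.randint(1, 12), rng.randint(1, 12)
    op = rng.choice(["+", "-", "*"])
    question = f"What is {a} {op} {b}?"
    answer = {"+": a + b, "-": a - b, "*": a * b}[op]
    return question, answer

rng = random.Random(7)                  # seeded here only for repeatability
q, a = generate_maths_query(rng)
print(q, "->", a)
```

Richer templates (fractions, algebraic equations, word problems) follow the same pattern: randomise the template parameters, then compute the ground-truth answer alongside the question text.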
[0101] Alternatively, the query controller 312 may intelligently generate queries from subject matter data sets. For example, for history, the query controller 312 may retrieve an associated history subject matter resource, such as from an external website or from within the database.
[0102] For example, for the history subject, the query controller 312 may browse to an external website, such as Wikipedia and select a pseudorandom history topic or a history topic within a particular field, such as relating to World War II. The query controller 312 may select a Wikipedia page relating to Mussolini.
[0103] In another example, for the Science, Engineering, Technology and Mathematics subject, the query controller 312 may browse to external study materials, such as the contents of Design Patterns: Elements of Reusable Object-Oriented Software by Gamma et al, and select a pseudorandom software design pattern. The query controller 312 may select a page in the book relating to the data model.
[0104] Having posed the query data via the query interface 321, the user may input an associated answer in step 506 utilising the answer interface 322. Answers may be provided in various manners, such as by multiple-choice or free-form text input. Depending on the answer interface 322, the electronic device 202 may request a user to perform a task on the touch controller 104, such as a gesture. In another embodiment, the answer interface 322 may present a number of virtual objects on the display for a user to manipulate in a virtual environment or an augmented reality environment. The user may select or pick a virtual object as an answer, or may assemble certain items or objects as an answer.

[0105] Having received the answer input data, the server 201 carries out the analysis/evaluation step 508 on the answer data so as to determine whether the provided answer data is correct in accordance with the query data posed.
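The evaluation step 508 can be sketched for the two answer formats mentioned above; the field names are assumptions, and the normalised text comparison is a deliberately crude stand-in for the machine-learning semantic evaluation described earlier:

```python
# Sketch: evaluate answer data against query data (step 508).
# Multiple-choice answers compare by option key; free-form text is
# normalised (case/whitespace) before an exact match. A real embodiment
# would evaluate free-form semantics with machine learning.
def evaluate(query, answer):
    if query["type"] == "multiple_choice":
        return answer == query["correct_option"]
    return answer.strip().lower() == query["correct_text"].strip().lower()

mc = {"type": "multiple_choice", "correct_option": "b"}
ff = {"type": "free_form", "correct_text": "Mussolini"}
print(evaluate(mc, "b"))            # True
print(evaluate(ff, " mussolini "))  # True
```

The boolean result of this step is exactly what drives the changeover decision described below: a correct answer continues the turn, an incorrect one transfers control.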
[0106] In an embodiment as shown in Figure 5, in the condition that correct answer data is provided, the user will be allowed to attempt a further question in relation to the same subject.
[0107] In other embodiments, the user interaction process 500 of the server 201 may be configured such that, in the event that a user has successfully answered a certain number of queries in a particular subject in succession, the associated category may be allocated 513 to the user, indicating that the user has "won" or "completed" the subject.
[0108] For example, if the user has successfully answered three mathematical questions in series, the server 201 may allocate a "won" or "completed" status to the user on the mathematics subject categorisation to indicate that the user has "won" or "completed" the mathematics subject.
[0109] In one embodiment, the user interaction 500 may halt in the end step 514. One of the halting conditions in step 514 is when a user has "won" or "completed" all subjects allocated to the user. In embodiments, if at least one of the users has not "won" or "completed" all of the subjects after a certain number of queries have been posed, the user interaction process 500 may end in step 514. In another embodiment, the interaction process 500 may end in step 514 after the certain number of queries is posed and one user has successfully answered the greatest number of queries.
[0110] In yet another embodiment, a wildcard demarcation on the spinning wheel randomiser interface 320 may be utilised as an alternative to having to answer three queries correctly in succession. As such, if the spinning wheel randomiser interface 320 lands on the crown icon, the user is allowed to attempt to answer a series of queries in relation to a particular subject.
[0111] If the queries are answered correctly, the user is deemed to have "won" or "completed" that subject. The subject category status of the user will be changed to "won" or "completed", and the user data will be updated in the database.
[0112] As shown in Figure 5, after the user answers the queries in step 506, the user interaction process 500 proceeds to step 510 for determining the difficulty level which the user should attempt next. This step 510 is carried out by the difficulty determinator controller 314. The difficulty determinator controller 314 is adapted to automate the determination process of a difficulty for the posed question, and this determination process may take into consideration the time taken to answer the query, whether or not the answer data was correct, and other factors.
[0113] As shown in Figure 5, if a first user fails to answer a query correctly, the user interface activation controller 311 may then implement a changeover control in step 509. In step 509, the user interface availability/activation is transferred to a second electronic device.
[0114] After the change of control step 509, the user interactive process 500 may proceed to the notify step 511 of sending an electronic notification to the second electronic device 202.
[0116] The user interaction process then loops back to the randomiser step 502. However, in this round, the randomiser interface step 503 and category selection step 504 are carried out on the second electronic device 202 instead of the first electronic device. Also, the server 201 now expects to receive the answer input in step 506 from the second electronic device rather than the first electronic device.

[0117] The user interaction process 500 ends when a particular user has successfully obtained all subject categorisations. The user interaction process 500 may also end after a certain number of queries have been posed and the user having answered the greatest number of queries successfully is determined.
[0118] The user interaction process 500 is implemented on a turn by turn basis wherein the user interface activation controller 311 alternately transfers control of the user interface between the electronic devices 202 depending on the manner of the answers provided by the respective users.
[0119] In one embodiment, the user interface activation controller 311 may be configured to transfer control of the user interface to the second user after a certain number of subjects have been allocated, such as half of the number of subjects. This prevents all of the subject categorisations being allocated to a user on a first turn.
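The turn changeover with an allocation cap can be sketched as follows, assuming a cap of half the subjects as in the example above:

```python
# Sketch: one player's turn ends either on a wrong answer (changeover,
# step 509) or once the per-turn allocation cap (here, half the subjects)
# is reached, preventing a first-turn sweep of every subject.
def play_turn(answer_results, total_subjects):
    cap = total_subjects // 2
    allocated = 0
    for correct in answer_results:
        if not correct:
            break                 # wrong answer: control transfers
        allocated += 1
        if allocated >= cap:
            break                 # cap reached: control transfers anyway
    return allocated

print(play_turn([True, True, True, True], total_subjects=4))  # 2 (cap)
print(play_turn([True, False, True], total_subjects=4))       # 1 (miss)
```

The server-side loop would alternate calls like this between the two devices, accumulating each player's allocations until one of the end conditions in step 514 is met.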
[0120] It should be noted that, in embodiments, a user may seek to challenge a subject allocation of an opposing user. For example, a user may challenge the opposing user in mathematics. During the challenge, the server 201 is configured to present a series of questions to each respective user of each electronic device 202. At the end of the challenge, the user having answered the most number of queries correctly is recorded as the winner of the challenge.
[0121] In another embodiment, a user may utilise the electronic device 202 to send a query to a publicly accessible message board, electronic discussion forum, Internet Relay Chat channel, Bulletin Board System, Twitter™, Twitch™, YouTube™, etc. utilising the peer-to-peer communication controller 316. For example, for a query posed for the history subject, should the user not know the answer, the user may control the electronic device 202 to forward the query to an electronic forum. Other participating users, utilising associated electronic devices, may receive a notification of the query. Any other users on the electronic forum may help to provide answers in a collaborative manner with another electronic device 202 connected to the server 201. In this regard, the answer controller 313 may analyse/evaluate the answers provided so as to validate the answer. In one embodiment, the first user to provide the answer may be allowed to pose the next query.
[0122] Figure 6 shows a use case 10 of the multiuser knowledge evaluation system or device of an embodiment of the present invention. In one embodiment, there are two types of users for the multiuser knowledge evaluation system or device of an embodiment of the present invention: the student 11 and the teacher 12.
[0123] In one embodiment of the present invention, the multiuser knowledge evaluation system or device also comprises a special type of user, the administrator. The administrator will be responsible for system administration, such as backing up the system, generating usage reports, populating queries or questions into the system, and maintaining and setting up the parameters for the artificial intelligence algorithm for determining the difficulties, etc.
[0124] The data model 109 of the user contains a number of attributes such as user identifier, email, first name, last name, school, etc. The user object may also comprise an avatar of the user representation in a virtual reality or augmented reality environment. In one embodiment, the user may select or change the attributes and appearance of the avatar. In another embodiment, the avatar projects a residual self-image, which is typically an idealised subjective perception of the user's appearance.
[0125] The data models 109 for the student 11 and teacher 12 are inheritances or subtypes of the user. The multiuser knowledge evaluation system or device may implement these data models 109 as instances of objects in the system. These instances can be stored in an object oriented database or relational database. In one embodiment, the instances of the objects are manipulated, transferred, or stored in JavaScript Object Notation (JSON) format or Extensible Markup Language (XML) format.

[0126] The student 11 object will also have the additional attribute of grade, indicating the actual grade of the student. The student object 11 may also have a status indicating that he or she has "won" or "completed" a subject in the game. The student object 11 may also comprise an attribute indicating the difficulty level which the student has attained.
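The inheritance of the student and teacher data models from the user object, and their JSON serialisation, can be sketched as follows; the exact attribute set is an assumption based on those listed above:

```python
# Sketch: user data model with Student/Teacher subtypes, serialised to
# JSON as described above. Attribute names mirror those listed in the
# text; the additional fields on each subtype are illustrative.
import json
from dataclasses import dataclass, asdict

@dataclass
class User:
    user_id: int
    email: str
    first_name: str
    last_name: str
    school: str

@dataclass
class Student(User):
    grade: int
    completed_subjects: tuple = ()      # subjects "won"/"completed"

@dataclass
class Teacher(User):
    subjects_taught: tuple = ()

s = Student(1, "amy@example.com", "Amy", "Lee", "Hill School",
            grade=7, completed_subjects=("maths",))
print(json.dumps(asdict(s)))
```

Because Student and Teacher inherit from User, any code that handles User instances (login, avatar selection, serialisation) works unchanged for both subtypes.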
[0127] The database system can be part of the server 201. In another embodiment, the database system is a standalone system which the server 201 can access over the Internet. The database system can be a standalone server or a distributed database system.
[0128] The use case 10 for the student 11 will be described with reference to Figure 8 to Figure 24, which show the user interfaces (UX) of an embodiment of the present invention.
[0129] In one embodiment of the present invention, the knowledge evaluation system or device is implemented with the system described in Figure 1 to Figure 5. The student will access the knowledge evaluation system or device 200 through the electronic device 202 by connecting to the server 201. The server 201 will provide the login function 22 to the student 11 through the login UX as shown in Figure 7. The student 11 must log in to the electronic device 202 before he or she can access the proper functions of the knowledge evaluation system or device 200. However, it is envisaged that a demonstration version with fewer functions may be provided to any user. On the login UX, the student is required to enter the email and the password. The email is one of the user identifiers of the student 11.
[0130] Once the student 11 is logged into the multiuser knowledge evaluation system or device, the server 201 directs the student to a home UX as shown in Figure 8. From the home UX, the student 11 may access the profile administration function 24, the game function 30, and the statistics report function 36.

[0131] On the home UX, the student 11 may select a new game icon to access the game function 30. When a student 11 selects the new game icon, the server 201 and electronic device 202 will direct the student to the game selection UX as shown in Figure 9. On the game selection UX, the student 11 may select to play Challenge a Friend mode or Challenge Me mode. In the Challenge a Friend mode, the multiuser knowledge evaluation system or device will execute the user interaction process 500, and the student 11 will pair up with other users to play the game. In the Challenge Me mode, the multiuser knowledge evaluation system will allow the student 11 to play the game in solo mode.
[0132] After the student 11 has selected the game mode, the server 201 and the electronic device 202 will direct the student to a game setup UX as shown in Figure 10. The game setup UX allows a student to access the game setup function 32. On the game setup UX, the student may: select the number of questions or queries to answer, select a time limit per question or query, and select one or more subjects of questions or queries in the game. When the student 11 selects the subjects on the electronic device 202 in step 504, the server 201 will then carry out step 505 to determine the difficulty level of the questions, and select or generate questions as in step 507. The server 201 will then forward the questions to the electronic device 202 for the play function 34. The electronic device will carry out the play function 34 by displaying the questions to the student 11 and receiving answers from the student with the play UX, query interface, or query UX as shown in Figure 11. In another embodiment, the play UX, query interface, or query UX is provided in a virtual reality or augmented reality environment, where the student 11 is represented by a three-dimensional avatar when selecting an object or carrying out a task to generate an answer.
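The setup-to-play flow of steps 504 to 507 can be sketched as follows. The function and field names are illustrative assumptions; the specification does not prescribe a data layout.

```python
def setup_game(profile, subjects, question_bank, num_questions):
    """Step 505: read the difficulty level from the user profile.
    Step 507: select questions matching the chosen subjects and that
    difficulty, to be forwarded to the device for the play function 34."""
    difficulty = profile["difficulty_level"]
    pool = [q for q in question_bank
            if q["subject"] in subjects and q["difficulty"] == difficulty]
    return pool[:num_questions]

bank = [{"subject": "Maths", "difficulty": 2, "body": "q1"},
        {"subject": "Maths", "difficulty": 1, "body": "q2"},
        {"subject": "English", "difficulty": 2, "body": "q3"}]
questions = setup_game({"difficulty_level": 2}, {"Maths"}, bank, 5)
```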
[0133] From the home UX, the student 11 may access the statistics function 36. When the student 11 participates in the game, the multiuser knowledge evaluation system or device will record various performance data of the student, such as the number of questions the student chooses, the time taken to answer each question, etc. The multiuser knowledge evaluation system or device 200 will collect this data and carry out a plurality of statistical analyses using traditional statistical tools as well as artificial intelligence algorithms. The multiuser knowledge evaluation system or device will then generate a statistics report as shown in Figure 12 and Figure 13. The statistics report may also include a leaderboard showing the top performers of the game as shown in Figure 14 to Figure 16.
[0134] The multiuser knowledge evaluation system or device 200 also allows a student to access his or her profile through the profile function 24. When the student 11 chooses to access the profile function 24, the multiuser knowledge evaluation system or device 200 will direct the student to a profile setting UX as shown in Figure 17. The student 11 may select to change the password, edit the profile, or log out from the profile setting UX. Figure 18 shows the edit profile UX, which provides the student with the view profile function 26 and the edit profile function 28. On the edit profile UX, the student 11 may view or change: the user name, name, email, country, state, city, school, year, ethnicity, and language background. The teacher or administrator may select to disable or disallow the student 11 from making changes to one or more of these attributes.
[0135] The teacher 12 will access the knowledge evaluation system or device 200 through the electronic device 202 by connecting to the server 201. The server 201 will provide the login function 22 to the teacher 12 through the login UX as shown in Figure 19. Once the teacher 12 is logged in, the user knowledge evaluation system or device 200 will direct the teacher to a dashboard UX as shown in Figure 20. From the dashboard UX, the teacher 12 may access the profile function 24, the statistics report function 36, and the reporting function 38. In another embodiment, the user knowledge evaluation system or device 200 may also provide an assignment function for the teacher 12 to assign a task or game to a particular student or a group of students. The user knowledge evaluation system or device 200 may also provide a function for the teacher 12 to select a set of questions or queries for a particular student or a group of students to work on.

[0136] On the dashboard UX, the teacher 12 may access the report function 38, which allows the teacher to choose to view a list of students from the student list function 40 or the progress report function 42.
[0137] When the teacher 12 selects the student list icon to access the student list function 40, the user knowledge evaluation system or device 200 will direct the teacher to the existing subject UX as shown in Figure 21, where the teacher may select to list the students working on a particular subject. Once the teacher 12 has selected the subject, the user knowledge evaluation system or device 200 will direct the teacher to the student list UX as shown in Figure 22.
[0138] On the dashboard UX, the teacher 12 may select to view the progress report by year or the progress report by subject. This will invoke the progress report function 42, which will generate the progress report by subject as shown in Figure 23 or the progress report by year as shown in Figure 24.
[0139] In another preferred embodiment, the teacher 12 may access the statistics function 36. The multiuser knowledge evaluation system or device will generate a statistics report as shown in Figure 12 and Figure 13. The statistics report may also include a leaderboard showing the top performers of the game as shown in Figure 14 to Figure 16.
[0140] The multiuser knowledge evaluation system or device 200 also allows a teacher 12 to access his or her profile through the profile function 24. When the teacher 12 chooses to access the profile function 24, the multiuser knowledge evaluation system or device 200 will direct the teacher to a profile setting UX as shown in Figure 17. The teacher 12 may select to change the password, edit the profile, or log out from the profile setting UX.

[0141] In one embodiment of the present invention, there is provided a multiuser knowledge evaluation device 202 comprising a user login interface for implementing the login function 22. The user login interface is adapted to cause a user to provide a user identifier and a password for retrieving a profile which comprises a competency level attribute.
[0142] The multiuser knowledge evaluation device 202 also has a subject selection user interface adapted to cause the user to select one or more subjects as in step 504 of the user interaction process 500.
[0143] The multiuser knowledge evaluation device 202 comprises a communication interface 105 for receiving one or more questions from a server 201, wherein the questions are selected or generated by the server in accordance with the competency level attribute as in step 507 of the user interaction process 500.
[0144] The multiuser knowledge evaluation device 202 comprises a question interface for presenting questions to a user and receiving answers for the questions as in step 506 of the user interaction process 500.
[0145] The multiuser knowledge evaluation device 202 is adapted to send the answers to the server 201 via the communication interface 105, such that a processor 101 in the server 201 evaluates a user result and updates the competency level of the user in accordance with the user result and the results of other users, as in step 510 of the user interaction process 500.
[0146] The server 201 is associated with a question database for storing one or more questions and corresponding answers. In one embodiment, the database is a relational database containing questions and answers in different subjects. The database can be held in a standalone server or a distributed database system.

[0147] The question database comprises questions selected in accordance with an Australian State or Territory high school certificate syllabus, such as the High School Certificate (HSC) of New South Wales or the HSC of Victoria. However, the questions may also be drawn from the General Certificate of Education of the United Kingdom, or a Suite of Assessments, etc. In one embodiment, the questions are multiple choice format questions. However, other formats of questions, such as fill in the blank questions, may also be used.
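A minimal relational schema for such a question database might look like the following sketch. The table and column names are illustrative assumptions; the specification does not fix a schema.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE question (
    id         INTEGER PRIMARY KEY,
    subject    TEXT NOT NULL,     -- e.g. 'Mathematics'
    difficulty INTEGER NOT NULL,  -- competency level the question targets
    body       TEXT NOT NULL,
    answer     TEXT NOT NULL      -- correct choice for multiple-choice format
);
""")
conn.executemany(
    "INSERT INTO question (subject, difficulty, body, answer) VALUES (?, ?, ?, ?)",
    [("Mathematics", 1, "2 + 2 = ?", "4"),
     ("Mathematics", 2, "Derivative of x^2?", "2x")],
)
# Fetch questions for a chosen subject at the user's competency level.
rows = conn.execute(
    "SELECT body, answer FROM question WHERE subject = ? AND difficulty = ?",
    ("Mathematics", 1),
).fetchall()
```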
[0148] In one embodiment, the questions are selected from the database with a random selection algorithm by the randomiser controller 311. The random selection algorithm comprises the steps of: generating a random set of unique random values using a software pseudorandom number generator or hardware number generator, retrieving a question in the question database correspondence to each value in the random set.
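The two steps of the random selection algorithm above can be sketched as follows. The helper and variable names are illustrative; a cryptographic or hardware generator could be substituted for the seeded generator shown here.

```python
import random

def select_random_questions(questions, count, rng=None):
    """questions: mapping of question id -> question record.

    Step 1: generate a set of unique random values (ids drawn without
    replacement from the key space).
    Step 2: retrieve the question corresponding to each value.
    """
    rng = rng or random.Random()
    ids = rng.sample(sorted(questions), count)  # unique random ids
    return [questions[i] for i in ids]

bank = {i: f"question-{i}" for i in range(100)}
picked = select_random_questions(bank, 5, random.Random(42))
```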
[0149] The software pseudorandom number generator may utilise cryptographic methods to generate a pseudorandom number. The cryptographic methods may include ISAAC (indirection, shift, accumulate, add, and count), RC4 PRGA, Xoroshiro128+, etc. The hardware random number generator, typically known as a true random number generator, will rely mainly on functions provided by the processor 101. Intel Core i7™ and AMD Ryzen™ are some processors that support a hardware random number generator.
[0150] In another embodiment, the random selection algorithm comprises copying the whole table of the question database into a temporary table, adding a new column with a random value, and finally sorting the data by that column. This method is slow and inefficient compared to the above method.
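In SQL terms, this slower variant is essentially an ORDER BY RANDOM() query over the whole table, sketched below against an assumed question table; it must assign and sort a random value for every row, which is why it scales worse than sampling ids directly.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE question (id INTEGER PRIMARY KEY, body TEXT)")
conn.executemany("INSERT INTO question (id, body) VALUES (?, ?)",
                 [(i, f"q{i}") for i in range(50)])

# Sort the whole table by a per-row random value, then take the first five.
# Simple, but O(n log n) over the full table on every draw.
rows = conn.execute(
    "SELECT id, body FROM question ORDER BY RANDOM() LIMIT 5"
).fetchall()
```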
[0151] The server 201 is adapted to derive an association relationship of the question database and generate a new question in accordance with the association relationship and a reference source. The reference source comprises one or more of: the questions in the question database and an external data source, such as a wiki page, a digital textbook, a discussion forum, etc. The association relationship is derived by one or more algorithms, such as clustering, decision trees, linear regression, association rules, genetic algorithms, or neural networks.
[0152] In a preferred embodiment, the question interface comprises a timer for recording the time taken by a user to answer each question. The processor in the server 201 derives a new competency level in accordance with the user results and the times taken by the user and other users to answer the questions.
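One plausible way to combine correctness and answer time into a competency update is a rating-style adjustment like the sketch below. This formula is entirely an illustrative assumption; the specification does not fix how results and times are weighted.

```python
def update_competency(level, correct, answer_time, time_limit, k=2.0):
    """Raise the level for fast correct answers, lower it otherwise.

    level: current competency level (float).
    correct: whether the answer was right.
    answer_time / time_limit: seconds used vs. seconds allowed.
    k: step size of the adjustment (assumed constant).
    """
    speed_bonus = max(0.0, 1.0 - answer_time / time_limit)  # 1.0 = instant
    score = (1.0 + speed_bonus) / 2.0 if correct else 0.0
    return level + k * (score - 0.5)

# A correct answer in 10 s of a 30 s limit nudges the level upward.
new_level = update_competency(level=5.0, correct=True,
                              answer_time=10.0, time_limit=30.0)
```

Comparing against other users' times, as the specification suggests, could be done by normalising `answer_time` against the cohort median instead of the fixed limit.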
[0153] The multiuser knowledge evaluation device comprises an output device 103 for displaying the user login interface, the subject selection user interface, and the question interface. In one embodiment, the output device 103 is a virtual reality or augmented reality device, such as AR glasses, which is adapted to present the questions to a user in a virtual reality or augmented reality environment. In another embodiment, the output device is adapted to present the questions in a web-based environment. The multiuser knowledge evaluation device further comprises an input device for receiving data from a user, and the input device can be a touch screen.
[0154] In one embodiment, the server 201 is adapted to select another multiuser knowledge evaluation device for pairing in accordance with the competency level attribute. This may allow the pair to play a competitive game or, alternatively, a turn-based game. In pairing up electronic devices 202, the server 201 may query the current geolocation information of the devices and take the current geolocation information into account for pairing. The pairing is a loose association, as both electronic devices are connected to the server 201 rather than directly connected to each other. In one embodiment, the server 201 may connect more than two electronic devices for gaming.
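A pairing heuristic that weighs both competency level and geolocation might look like the following sketch. The scoring function, field names, and example coordinates are assumptions for illustration; the specification only says both factors may be taken into account.

```python
import math

def pair_score(a, b, max_distance_km=50.0):
    """Lower is better: prefer close competency levels and nearby devices.

    a, b: dicts with 'level', 'lat', and 'lon' keys (illustrative shape).
    """
    level_gap = abs(a["level"] - b["level"])
    # Equirectangular approximation is adequate over short distances.
    km_per_deg = 111.0
    dx = (a["lon"] - b["lon"]) * km_per_deg * math.cos(math.radians(a["lat"]))
    dy = (a["lat"] - b["lat"]) * km_per_deg
    distance = math.hypot(dx, dy)
    return level_gap + distance / max_distance_km

def best_match(candidate, pool):
    """Pick the waiting device that minimises the pairing score."""
    return min(pool, key=lambda other: pair_score(candidate, other))

me = {"level": 5, "lat": -33.87, "lon": 151.21}   # example coordinates
pool = [{"level": 5, "lat": -33.86, "lon": 151.20},
        {"level": 9, "lat": -37.81, "lon": 144.96}]
partner = best_match(me, pool)
```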
[0155] Although the invention has been described with reference to specific examples, it will be appreciated by those skilled in the art that the invention may be embodied in many other forms, in keeping with the broad principles and the spirit of the invention described herein.
[0156] The present invention and the described embodiments specifically include the best method known to the applicant of performing the invention. The present invention and the described preferred embodiments specifically include at least one feature that is industrially applicable.

Claims

1. A multiuser knowledge evaluation device comprising: a user login interface adapted to cause a user to provide a user identifier and a password for retrieving a profile which comprises a competency level attribute; a subject selection user interface adapted to cause the user to select one or more subjects; a communication interface for receiving one or more questions from a server, wherein the questions are selected or generated by the server in accordance with the competency level attribute; a question interface for presenting questions to a user and receiving answers for the questions; wherein the answers are sent to the server via the communication interface, such that a processor in the server evaluates a user result and updates the competency level of the user in accordance with the user result and the results of other users.
2. A multiuser knowledge evaluation device of Claim 1, wherein the server is associated with a question database for storing one or more questions and corresponding answers.
3. A multiuser knowledge evaluation device of Claim 2, wherein the question database comprises questions selected in accordance with an Australian State or Territory high school certificate syllabus.
4. A multiuser knowledge evaluation device of Claim 3, wherein the questions are multiple choice format questions.
5. A multiuser knowledge evaluation device of Claim 2, wherein the questions are selected from the database with a random selection algorithm.
6. A multiuser knowledge evaluation device of Claim 5, wherein the random selection algorithm comprises the steps of: generating a random set of unique random values using a software pseudorandom number generator or a hardware number generator, and retrieving a question from the question database corresponding to each value in the random set.
7. A multiuser knowledge evaluation device of Claim 2, wherein the server is adapted to derive an association relationship of the question database and generate a new question in accordance with the association relationship and a reference source.
8. A multiuser knowledge evaluation device of Claim 7, wherein the reference source comprises one or more of: the questions in the question database and an external data source.
9. A multiuser knowledge evaluation device of Claim 7, wherein the association relationship is derived by one or more algorithms of: clustering, decision tree, linear regression, association, genetic algorithm, neural network.
10. A multiuser knowledge evaluation device of Claim 1, wherein the question interface comprises a timer for recording a time for a user to answer each question.
11. A multiuser knowledge evaluation device of Claim 10, wherein the processor in the server derives a new competency level in accordance with the user results and the times taken by the user and other users to answer the questions.
12. A multiuser knowledge evaluation device of Claim 1, further comprising an output device for displaying the user login interface, the subject selection user interface, and the question interface.
13. A multiuser knowledge evaluation device of Claim 12, wherein the output device is adapted to present the questions to a user in a virtual reality or augmented reality environment.
14. A multiuser knowledge evaluation device of Claim 12, wherein the output device is adapted to present the questions in a web-based environment.
15. A multiuser knowledge evaluation device of Claim 1, further comprising an input device for receiving data from a user.
16. A multiuser knowledge evaluation device of Claim 15, wherein the input device is a touch screen.
17. A multiuser knowledge evaluation device of Claim 1, wherein the server is adapted to select another multiuser knowledge evaluation device for pairing in accordance with the competency level attribute.
18. A multiuser knowledge evaluation device of Claim 17, further comprising a location module for determining current geolocation information, wherein the current geolocation information is forwarded to the server to be taken into account for pairing.
19. An interactive multiuser knowledge evaluation method comprising the steps of: retrieving a profile of a user, wherein the profile comprises a difficulty level attribute; receiving selected subject set data, which comprises one or more selected subjects; selecting or generating a set of questions from a question database, wherein the set of questions is selected or generated in accordance with the difficulty level attribute; receiving answers for the questions; and evaluating a user result based on the answers and updating the difficulty level attribute of the user in accordance with the user result and the results of other users.
20. An interactive multiuser knowledge evaluation system comprising: a server; two electronic devices operable to communicate with the server across a data network; a user interface activation controller configured for alternately activating a respective user interface on each of the electronic devices in turn; a randomiser controller controlling a randomiser interface for random subject categorisation selection; a query controller configured for selecting or generating query data according to the subject categorisation selection and presentation of the query data via a query interface; and an answer controller configured for receiving answer data from an answer interface and evaluating the answer data with reference to the query data, and wherein the user interface activation controller is configured to transfer activation of the user interface alternately in turn between the two electronic devices in accordance with the activation.
PCT/AU2018/050594 2017-06-16 2018-06-15 Multiuser knowledge evaluation system or device WO2018227251A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
AU2017902292A AU2017902292A0 (en) 2017-06-16 An alternately interactive multiuser knowledge evaluation system
AU2017902292 2017-06-16

Publications (1)

Publication Number Publication Date
WO2018227251A1 true WO2018227251A1 (en) 2018-12-20

Family

ID=64658807

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/AU2018/050594 WO2018227251A1 (en) 2017-06-16 2018-06-15 Multiuser knowledge evaluation system or device

Country Status (1)

Country Link
WO (1) WO2018227251A1 (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112214651A (en) * 2020-07-27 2021-01-12 上海乂学教育科技有限公司 Intelligent learning competition system and method
EP4099262A1 (en) * 2021-06-02 2022-12-07 The Wise Seeker, S.L. Method for assessing subject knowledge using collective intelligence in combination with machine learning algorithms
WO2023118669A1 (en) * 2021-12-23 2023-06-29 New Nordic School Oy User-specific quizzes based on digital learning material

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060046237A1 (en) * 2004-09-02 2006-03-02 Griffin Charles W Methods, systems and computer program products for creating and delivering prescriptive learning
US20100005413A1 (en) * 2008-07-07 2010-01-07 Changnian Liang User Interface for Individualized Education
US20120208166A1 (en) * 2011-02-16 2012-08-16 Steve Ernst System and Method for Adaptive Knowledge Assessment And Learning




Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18817642

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18817642

Country of ref document: EP

Kind code of ref document: A1