CA2574575A1 - Online qualification management system - Google Patents

Online qualification management system

Info

Publication number: CA2574575A1
Authority: CA (Canada)
Prior art keywords: candidate, participant, window, module, qualification management
Legal status: Abandoned
Application number: CA 2574575
Other languages: French (fr)
Inventors: Paul Blanchard, Luc Chevalier, Pierre Blanchard
Current Assignee: Individual
Original Assignee: HR ALLOY
Application filed by HR ALLOY
Publication of CA2574575A1

Abstract

An online qualification management system which can be used to perform qualification management, such as candidate assessment for recruitment purposes, job description simulations and learning management. Also described is an authoring tool to produce such an online qualification management system.

Description

TITLE: Online qualification management system

FIELD OF THE INVENTION
The present invention relates to web based tools to perform qualification management, such as candidate assessment for recruitment purposes, job description simulations and learning management. More specifically, the invention relates to authoring tools to produce a web-based qualification management system, methods to produce a web-based qualification management system, a web-based qualification management system and methods to perform qualification management via a web-based system.

DETAILED DESCRIPTION OF NON-LIMITING EMBODIMENTS
Figure 1 shows an example of an online qualification management system for performing qualification management, in particular assessing a person's suitability for a certain job, in accordance with a non-limiting embodiment of the invention. The qualification management system is implemented over a data network 14 and comprises a qualification management system server 12 that implements a data network site accessible via the data network 14. In this example, the data network 14 is the Internet and the data network site is a website.

A candidate for a job can use a client device 10 to interact with the qualification management system server 12 over the Internet 14. The client device 10 comprises a computing device, a display and a speaker (and possibly other output devices), and one or more input devices such as a keyboard, a mouse, a microphone, a stylus, and/or a touchscreen. In various embodiments, the client device 10 may be a personal computer (e.g., a desktop or laptop computer) or may be a networked personal digital assistant (PDA) or other wireless communication device having access to the data network 14.

Broadly stated, and as further detailed below, the candidate accesses the qualification management system website, logs on to the website by providing some personal identification, either a special code or personal information such as, for example, his/her name, date of birth, email address and a personal password. The candidate may also be expected to complete a consent document and answer specific questions about his/her education, prior work experience and work restrictions.
The job seeker may then be presented with one or more positions in an organization and is provided the option of viewing job descriptions of these one or more positions.
Each job description may be presented in the form of a text document and/or a 3D virtual simulation in order to give the candidate a more realistic idea of the corresponding job. The candidate may also be asked to complete an online qualification assessment task, which would help further assess his/her suitability for the chosen position.

Assuming the candidate decides to take the assessment task, he/she is presented with a series of text based tests and/or interactive scenarios. An interactive scenario is a virtual 3D simulation designed to present information to measure or assess certain predefined qualities of the candidate. With each scenario, the candidate is given the opportunity to choose an answer or a course of action in response to a question or situation that arises.

The qualification management system includes a scoring function to record the candidate's interaction and response to each text based test and/or interaction scenario and produce a report on the candidate's performance.

On completion of the assessment task, the qualification management system may send the candidate, or any other designated recipient, an email acknowledging the completion of the assessment and, at the option of the manager of the system, the result given by the scoring function or a report based on it.

Figure 2 is a general block diagram which provides additional details on the interaction between the client computer 10 and the qualification management system server 12. The client computer 10 includes an Internet browser 16 that includes integrally or as a plug-in a client application 18 for managing different modules that constitute the building blocks of the qualification management system. This client application can be the Macromedia Flash Player or any other suitable application. The qualification management system server 12 includes at least one qualification management project with which the client computer 10 interacts. Typically, a qualification management project is associated with a given job type.
Therefore, there will be as many qualification management projects as there are job types for which openings exist.
Examples of job types include: sales clerk, security guard, store manager, secretary, files clerk, among many others, all of which would be associated with different qualification management projects.

As it will be described in greater detail later, a given qualification management project includes a series of modules and also a series of constructs. Typically, the modules convey information, either graphical or textual, to the job seeker that interacts with the qualification management project. An example of a module is a text based assessment test that the job seeker is asked to complete. Another example of a module is a 3D
interactive simulation that provides a description of the job type or that interacts with the job seeker for assessment purposes. Many other different modules can be used. The constructs define the system logic, such as the order in which the job seeker will interact with the different modules. The constructs are designed to take into account the manner in which the job seeker interacts with the qualification management project. For instance, actions of the job seeker or answers the job seeker has provided to an assessment test will condition the subsequent interaction with the qualification management project.
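
To make the relationship between modules, constructs and the core construct more concrete, the following minimal Python sketch models the three concepts. The class names and fields (Module, Construct, QualificationProject) are invented for illustration only; the patent does not disclose a particular implementation language or data model.

```python
from dataclasses import dataclass, field
from typing import Callable, Dict, List, Optional, Tuple

@dataclass
class Module:
    """A building block the job seeker interacts with
    (e.g. a welcome message, a text-based test, a 3D simulation)."""
    name: str
    kind: str                                   # "message", "test", "simulation", ...
    content: Dict[str, str] = field(default_factory=dict)

@dataclass
class Construct:
    """The logic paired with one module: how the module behaves and how
    the job seeker's input is captured and processed."""
    run: Callable[[Module, Dict], Dict]         # returns collected answers/scores

@dataclass
class QualificationProject:
    """Module/construct pairs plus a core construct deciding what runs next."""
    pairs: List[Tuple[Module, Construct]]
    core: Callable[[Dict], Optional[int]]       # state -> index of next pair, or None

    def run(self) -> Dict:
        state: Dict = {}
        # The core construct inspects the accumulated state (answers, scores)
        # and returns the next module/construct pair to invoke, or None to stop.
        while (nxt := self.core(state)) is not None:
            module, construct = self.pairs[nxt]
            state.update(construct.run(module, state))
        return state
```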

Figure 3 is a block diagram that illustrates generally the infrastructure used to develop the qualification management system. A qualification management project author at client computer 22 communicates with the qualification management system server 12 over the Internet 14. The qualification management system server 12, in addition to the qualification management projects, also implements authoring tools that allow the author at client computer 22 to generate one or more qualification management projects. In this instance, the authoring tools and qualification management projects reside on the same server. Therefore, as the author at the client computer 22 is working with the authoring tools to produce a qualification management project, job seekers at client computers 10 interact with respective qualification management projects on the same server. Note that it is equally feasible to run the authoring tools that create the qualification management projects on one server and run the qualification management projects on different servers.

Figure 4 is a high level block diagram of the architecture of the authoring tools and of the qualification management project produced by the authoring tools. The authoring tools, designated generally by 24, include a software implemented authoring tools module 26 which is designed to interact via the Internet 14 with the author at the client computer 22, such as to remotely receive commands and display on the client computer 22 information for the author to see. The authoring tools module 26 communicates with a resources database 28. The resources database may be implemented on the same platform on which the authoring tools module 26 resides or on a different platform. For instance, the resources database 28 may be implemented by a different server that is physically remote from the server 12 but communicates with it over the Internet 14.

The qualification management project 20 that is produced by the authoring tools 24 includes a plurality of module/construct pairs 30. Each module/construct pair 30 includes a module component 32 (for brevity it will be simply referred to as "module") and construct component 34 (for brevity it will be simply referred to as "construct"). In addition to the module/construct pairs 30, the qualification management project 20 also includes a core construct 36.

Any one of the modules 32 is a component of the project with which the job seeker interacts. By "interact" is meant that the job seeker will obtain some type of information from the module 32, and optionally will input some information into that module 32. A simple example of a module 32 that requires no input from the job seeker is a welcoming message, or an e-mail sent to the job seeker confirming his participation and completion of the online assessment. Another example of a module 32 that requires information input is a consent form, on which the job seeker must expressly indicate his/her agreement to rules that condition his/her participation in the assessment. Another example is a multiple choice test where the job seeker must answer questions. Yet a more complex example of a module is a virtual 3D simulation.

The construct 34 associated with each module 32 is the logic associated with that particular module 32. The construct will determine how the module 32 behaves and may also include data collection and data processing functions. In the case of simple modules that are designed simply to communicate information to a user, such as an e-mail generation module, the associated construct 34 will normally be quite simple. In this example, the construct will contain instructions to trigger the dispatch of the e-mail and notify the system that the e-mail dispatch has been completed. However, in more complex modules 32 where information is sought from the job seeker, more complex constructs 34 will be required.
In the case of a module 32 implementing a text based test, the associated construct 34 contains logic to capture the responses that the job seeker provides to the questions and then scoring or marking logic to compute the job seeker's score on the basis of the answers. The scoring logic may have multiple layers, in the sense that the same questionnaire may be used to evaluate the job seeker for different personality traits, such as, for example, sociability, aggressiveness, etc.
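
As a hedged illustration of such multi-layer scoring, the sketch below scores the same set of answers against several trait-specific keys. The trait names, question identifiers and weights are made up for the example; real keys would be developed by assessment specialists.

```python
# Hypothetical multi-layer scoring for a text-based test construct.
# Each layer maps question IDs to weights for one personality trait.
SCORING_KEYS = {
    "sociability":    {"q1": 2, "q4": 1, "q7": 3},
    "aggressiveness": {"q2": 1, "q5": 2},
}

def score_answers(answers: dict) -> dict:
    """Compute one score per trait from the same set of answers.

    `answers` maps question IDs to numeric responses, e.g. {"q1": 3}.
    """
    return {
        trait: sum(weights[q] * answers.get(q, 0) for q in weights)
        for trait, weights in SCORING_KEYS.items()
    }

# The same questionnaire yields two different trait scores.
print(score_answers({"q1": 3, "q2": 1, "q4": 2, "q5": 0, "q7": 1}))
```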

The qualification management project 20 also includes a core construct 36 that defines the workflow of the project. Specifically, the core construct 36 selectively invokes the module/construct pairs 30 and determines the order in which the module/construct pairs 30 will be run, makes decisions as to whether some module/construct pairs 30 will be run or not run and makes decisions on branching operations. In a branching operation, the core construct 36 makes a decision about an action to be carried out on the basis of characteristics of the job seeker or on the basis of the manner in which the job seeker has interacted with the qualification management project 20. An example of a branching operation is a situation where the job seeker is a female and the core construct 36 will take a workflow branch specifically designed for females.
Another example of a branching operation is a situation where the job seeker scores so poorly on initial tests that there is no sense in continuing the assessment. In that case, on the basis of the scores, the core construct 36 will terminate the assessment and will not invoke any subsequent tests.
A more detailed block diagram of the authoring tools module 26 and of the resources database 28 is provided at Figure 40. Specifically, the authoring tools module 26 has two main components, namely a workflow automation module 38 and a virtual simulation authoring tools module 40. The workflow automation module 38 is used for generating all the module/construct pairs 30, except those that provide a 3D virtual simulation to the job seeker. The workflow automation module 38 communicates with a database 42 of resource documents that are used for building the modules 32. Specifically, the author at the client computer 22 can access the database and select among the documents and resources available to build one or more of the modules 32. Examples of resource documents that reside in the database 42 include a set of different welcoming messages, a set of different consent forms, a set of different screens to capture biographic data of the job seeker, a set of different text based tests, a set of different confirmation e-mails, etc. In addition to providing a wide variety of resource documents the author can choose from, those resource documents can be specifically tailored to meet specific conditions or criteria. For instance, the resource documents may be organized and tailored on a country basis, for use in qualification management projects that will be implemented in different countries, where employment laws require or prohibit certain practices. Another possible form of classification is language, where the resource documents are grouped per language of the job seeker.

The virtual simulation authoring tools module 40 generates the 3D virtual simulation that constitutes a module/construct pair 30. The virtual simulation authoring tools module 40 communicates with a media database/server 44 that provides media services during the building of the virtual simulation.

The operation of the workflow automation module 38 will now be discussed in more detail in connection with the drawings at Figures 41 to 49. Those Figures are different screen pages of the Graphical User Interface (GUI) of the workflow automation module 38 that allow the author at the client computer 22 to input commands via any suitable input device, such as a keyboard, pointer, voice recognition unit or other input device.

Figure 41 is a screen page that the author sees at the beginning of the creation of a project. The screen page has at the left a window 50 with a list of selectable objects, such as Base 52, Branding 54, Documents 56, Modules 58, and Constructs 60. By selecting any one of those objects, the author is directed to an appropriate screen page. The screen page shown at Figure 41 corresponds to the Base 52 object. Above the window 50 is provided an area 62 with input fields 64, 66 and 68 where the author can input a project name and any abbreviations, and specify the type of project. Above the area 62 are located a series of buttons 70 that correspond to typical commands, such as New, Save, SaveAs, Search, Report, Refresh and Delete. On the right of the window 50 is provided an area 72 which contains additional fields about the project. Those fields allow the author to specify information such as the priority of the project, whether the project is active or not, its date of expiration and the hiring authority, among others.

A selection of the Branding 54 object will direct the author to the screen page shown at Figure 42. Note that the window 50, the commands 70 and the area 62 are the same as on the previous screen page. The area 74 on the right of the window 50 contains various fields 76 allowing the author to import graphical material that can be used in designing the pages or documents that job seekers will see when they are being assessed. For instance, the graphical material can be the logo of the company that is looking to fill a position. The fields allow specifying the graphical background to use and the frame, if any. The fields 76 also allow specifying any top or bottom logos to use and frames, if any.

A selection of the Documents 56 object brings the author to the screen page shown at Figure 43. Note that the window 50, the commands 70 and the area 62 are the same as on the previous screen page. The area 78 at the right of the window 50 contains a table 80 that allows the author to import documents, such as text documents, that will be used to build modules. For instance, in certain projects the author may require welcome messages, consent forms or assessment tests that are unique to the employer and which may not be already available in the resources database 28. In such a case, the employer may make available to the author of the project sample documents for use. Those sample documents are imported in the project via the table 80. Specifically, the table 80 contains a FileName column 82, where the various documents associated with the project are listed, and a Modified column 84 that shows the documents on which modifications have been made. Below the table 80 is provided a set of controls 84, such as buttons, allowing manipulation of the documents in the table 80. Specifically, the controls 84 allow refreshing the table 80, adding documents to the table 80, opening documents or deleting documents.

A selection of the Modules 58 object brings the author to the screen page shown at Figure 44. Note that the window 50, the commands 70 and the area 62 are the same as on the previous screen page. The area 86 at the right of the window 50 contains two windows 88 and 90 allowing the author to select modules for use in the project. The window 88 shows a folders list, where each selection (folder) corresponds to a module type. Several module types can be considered. The following is a non-exhaustive list:

1. Consent Forms. Modules typically containing an explanation text and a question. Examples: consent to share information or review background, acceptance to continue the process.

2. Messages. Modules designed to provide information without gathering data from the candidate. Examples: welcome page, overall instructions, e-mails to the candidate.

3. Equal Employment Opportunity (EEO) forms. Modules used to gather information about the candidate. Examples include:

- Name;
- Social Security Number;
- Position applied for;
- Position title;
- Sex;
- Race/Ethnic Identification such as Caucasian, Black, Hispanic, American Indian or Alaskan native, or Asian or Pacific Islander;
- Handicaps;
- Where the candidate heard about the position.

Answers are typically stored in the candidate table, but they can also be stored in the answer table or both.

4. Tests. All modules designed to rate the candidate on the basis of their answers to a set of questions. Answers are typically stored in the answer table. Examples: cognitive and personality assessment; IQ test; job specific tests such as 'Sales ability', 'Customer service', 'Call center agents', etc.; career inventory and planning (e.g., accountant career goals); attention and reflexes testing; motivation testing; etc.

5. Surveys. All modules designed to gather the candidate's opinion. Answers are typically stored in the answer table. Examples: individual preferences, feedback.

6. Multi-Raters Feedback (MRF). All modules used for multi-raters feedback, similar to a survey: a group of candidates fill out the same survey about another candidate. Answers are typically stored in the answer table.

7. Simulations. Modules typically created with the virtual simulation authoring tool module.

8. Interview. Modules designed to gather information about a candidate by an administrative employee. Example: interview guide.

Figure 45 shows what happens when selecting the Test module type 92. Window 88 shows a menu of the various test modules that are available for selection. Here, the author can choose the different tests that the participant should be subjected to in order to evaluate the participant for the job. The window 90 contains a list of the different modules that are selected from the window 88. The selection or de-selection can be made via control arrows 94. Accordingly, the list in window 90 contains the modules that would constitute the project. For instance, the list may be as follows: (1) welcoming message (message type module); (2) consent form (consent form module); (3) job preview simulation (simulation type module); (4) IQ test (test type module); (5) job interactive simulation (simulation type module); (6) thank you message (message type module); and (7) confirmation e-mail (message type module).
Any one of the modules listed in the window 90 can be selected and a Module parameters window 94 invoked, as shown in Figure 46. The window 94 allows specifying different parameters for the module selected. One particular parameter in connection with test modules is the normalization parameter. When tests are scored, the returned marks are absolute numbers which provide little information by themselves unless they are put in a context. This is the case of many assessment tests, except perhaps the IQ test. By normalizing the results, however, it can be easily seen how the candidate compares to the rest of the population, which provides more insight into his/her behavior. The normalization parameter will therefore perform a normalization of the results so as to show what those results mean in a specific context.
Different normalizations can be made; specifically, the author can choose the population group that will be used as a basis for the normalization. Examples of population groups include:

- Urban white males;
- French speaking rural females;
- Job level (e.g., senior, mid level, lower level manager);
- Job type (e.g., accountant, police officer, teacher).

When the author selects the Constructs object 60, the screen page at Figure 47 is invoked. The area 96 at the right of the window 50 allows the author to create the core construct 36. The core construct 36 is built by using any suitable programming language. Languages that can be interpreted by expression parsers have been found suitable. The area 96 includes a declarations window 98 where declarations are put and an algorithm window 100 where the logic of how the different modules interact is input. As a baseline, the authoring tools module is designed with logic that reads the modules selected for the project (those that appear in window 90 at Figure 45) and creates sample code designed to successively invoke the various modules when the qualification management project is run. Therefore, the built-in logic, if executed, would present the various modules to the participant one after the other.
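
The patent does not show the generated sample code itself; the sketch below is a rough Python rendering of what such a baseline could look like, using the example module list from window 90. The module names and the `invoke` callable are assumptions made for illustration.

```python
# Hypothetical baseline generated by the authoring tool: invoke every
# selected module in order, passing accumulated results forward.
BASELINE_SEQUENCE = [
    "welcoming_message",
    "consent_form",
    "job_preview_simulation",
    "iq_test",
    "job_interactive_simulation",
    "thank_you_message",
    "confirmation_email",
]

def run_baseline(invoke):
    """Run the modules one after the other.

    `invoke` is a callable: invoke(module_name, state) -> dict of results.
    """
    state = {}
    for name in BASELINE_SEQUENCE:
        state.update(invoke(name, state))
    return state
```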

The declarations window 98 and the algorithm window 100 can be modified by the author in order to provide the project with more sophisticated functions. One of those functions is branching. Branching is an execution or invocation of a module only when a certain condition is met. For example, the test which is presented to the participant is sex dependent. If the participant is a male then test A is used, while if the participant is a female then test B is used. Conditional algorithms in the core construct 36 will determine which module to invoke (test A or test B) depending upon the answer provided to the EEO form (male or female) by the participant.

Another example is a situation which requires that the project be run only when the participant has accepted the conditions in the consent form. Accordingly, the logic in the core construct 36 looks for the answer to the consent form and, only if the consent form has been accepted, invokes the modules that perform the evaluation. Otherwise, the project run is aborted, or a special message module can be invoked confirming to the participant his/her refusal of the conditions in the consent form, and then the system exits.

Yet another example of branching is the situation where the number or kind of tests that the participant is asked to complete is conditional upon the score obtained at an earlier test. For instance, if the participant has obtained a very low score on one test, he/she is dismissed by the system logic instead of being directed to the next test. So, after the test is completed and the score determined, the logic in the core construct 36 will invoke a farewell message or screen instead of calling another test.
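
A minimal sketch of core-construct logic combining the consent and low-score branching examples above is given below. The threshold, module names and the `invoke` callable are illustrative assumptions, not taken from the patent.

```python
# Hypothetical core-construct workflow: abort on refused consent, and
# dismiss the participant after a very low score instead of continuing.
LOW_SCORE_THRESHOLD = 20

def core_construct(state: dict, invoke) -> None:
    invoke("consent_form", state)
    if not state.get("consent_accepted"):
        invoke("refusal_message", state)    # confirm refusal, then exit
        return
    invoke("first_test", state)
    if state.get("first_test_score", 0) < LOW_SCORE_THRESHOLD:
        invoke("farewell_message", state)   # farewell screen instead of next test
        return
    invoke("second_test", state)
    invoke("confirmation_email", state)
```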

As discussed earlier, the core construct 36 implements the logic that manages the workflow when the project 20 is run. However, the logic that controls how the different modules behave is determined by the constructs 34 which are associated with respective modules of the project. The constructs 34 can be built by accessing the screen page shown in Figure 48. At the left is a window 102 in which there is a list of modules, while at the right are provided two superposed windows 104 and 106 where declarations and algorithms, respectively, can be input. The remainder of the screen page is similar to the screen pages described previously so no further details are deemed necessary.

The modules that appear within the window 102 are not necessarily those associated with the current project 20 but may include all the modules that are available to the author to choose from at the module selection stage. The logic that the author builds by entering programming language at the screen page of Figure 48 determines the functions that the module will perform. Examples of functions include:

- Causing the information contained in a module (such as a message type module) to be shown to the participant, and detecting when the participant has finished viewing it, by sensing the clicking of a close button, for example.

- Causing dispatch of an e-mail message to the participant.

- Tracking the actions of the participant while the participant interacts with the module. A simple example of tracking is determining the answers provided by the participant to questions in a test. This is done by entering code that will monitor which answers are provided to individual questions and storing those answers in a table. Other examples of action tracking will be discussed later in the context of a simulation example.

- Scoring the answers provided by the participant. Here, the construct 34 includes code that interprets the various answers according to built-in logic and provides a score. A simple scoring algorithm is one that accumulates the marks to individual questions to provide a collective score. Such a simple scoring algorithm could be used for IQ tests, for instance. A more complex scoring algorithm is one that interprets the answers to provide insights about the personality or behavior of the participant. For instance, the answers to the same question set can be interpreted to determine different behavioral traits, such as ascendancy, responsibility, emotional stability, sociability, cautiousness, original thinking, personal relations and vigor, among many others. The interpretation of the score is based on interpretation keys, which can be described as logic linking the answers to a particular behavioral trait. Once a given interpretation key has been developed, it can be coded via the screen page of Figure 48.
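
As a hedged illustration of an interpretation key, the sketch below links specific answers to a single trait score. The question identifiers, expected options and weights are invented; an actual key for a trait such as ascendancy would be defined by the test designers and coded through the screen page of Figure 48.

```python
# Hypothetical interpretation key for one behavioral trait.
ASCENDANCY_KEY = [
    ("q3", "a", 2),    # choosing option "a" on q3 contributes 2 points
    ("q8", "c", 1),
    ("q12", "b", 3),
]

def interpret_ascendancy(answers: dict) -> int:
    """Return a trait score from raw answers (question ID -> chosen option)."""
    return sum(weight for q, expected, weight in ASCENDANCY_KEY
               if answers.get(q) == expected)

print(interpret_ascendancy({"q3": "a", "q8": "b", "q12": "b"}))   # -> 5
```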

The screen page at Figure 49, which is very similar to the page at Figure 48, shows that the author has available for selection a series of constructs corresponding to common or frequently used interpretation keys. For example, a selected module in the window 102 shows a sub-menu with different interpretation keys for which the code for the construct 34 is provided. In the example shown, ascendancy is selected and the window 106 shows the code that implements the interpretation key, allowing ascendancy to be evaluated on the basis of the answers provided by the participant.

The design of the modules 32 that correspond to a virtual simulation is done via the virtual simulation authoring tools module 40. The operation of the virtual simulation authoring tools module 40 will now be discussed in more detail in connection with the drawings at Figures 5 to 39. Those Figures are different screen pages of the Graphical User Interface (GUI) of the virtual simulation authoring tools module 40 that allow the author at the client computer 22 to input commands via any suitable input device, such as a keyboard, pointer, voice recognition unit or other input device.

Figure 5 is a screen page that the author sees at the beginning of the creation of a virtual simulation module.
The screen page has at the left a window 110 with a list of selectable object classes, such as Static images 112, Animations 114, Controls 116 and Avatars 118. At the bottom of the window 110 is provided a set of controls 120 that allow determining how graphic elements will behave, such as a position on the X axis, a position on the Y axis, the X axis scale, the Y axis scale, and whether the graphic element is visible, locked or a mirror image. At the top of the window 110 is provided a set of controls 122 that identify the scene that is being designed (field 124), allow adding or removing elements and determine the depth position of an element.

On the right of the window 110 is a large area 126 that constitutes a viewing pane in which graphical elements that have been selected for the scene can be viewed by the author. Above the window 110 and the area 126 is a status bar 128 providing status information about the project, such as the name of the project in field 130, the client for whom the simulation is made in field 132, the status in field 134, and update and version information in fields 136 and 138. Immediately above the status bar appears a menu 139 allowing the author to access different functionalities of the virtual simulation authoring tools module 40. The following elements can be selected: Base 140, Scenes 142, Flowchart 144, Media Library 146, Run Scenario 148 and Run Event 150.

Above the menu 139 are located a series of buttons 152 that correspond to typical commands, such as New, Save, SaveAs, Search, Report, Refresh and Delete.

From a visual perspective, virtual simulations are structured in terms of scenes. A scene would usually include a background or static image that can be selected via the GUI at Figure 5. By clicking the Static images 112 object class, a menu is shown listing the available backgrounds. For example, the viewing pane in area 126 shows a background that corresponds to a reception area.
Figure 6 is a somewhat different example, where the author has selected a "Store Front" background.

Figure 7 provides yet another example of the GUI for selecting a background. Here, the author has selected two static images, namely a conference room and a conference table mask. Both images appear in the viewing pane in area 126 as overlaid. Therefore, the control 112 allows "stacking" graphical objects by individually selecting those objects such as to create complex scene backgrounds.

The static images that are selected allow for little or no interaction with the participant. They are intended mostly to provide a pictorial representation of context for the simulation.

The controls object class 116 allows access to objects that can interact with the participant during the simulation. Those interactive objects will be selected according to the simulation that is being run. Typically, the interactive objects constitute "tools" allowing the participant to accomplish a certain task or they convey information and/or collect information from the participant. The interactive objects can be set to appear anywhere within the area 126. In the example of the screen page at Figure 8, five interactive objects have been selected and they are located along the left border of the viewing pane.

Examples of objects that are suitable to a simulation in an office environment include:

- An interactive e-mail object. Simulates arrival of e-mail, allows the participant to open the e-mail, read the e-mail and respond to the e-mail. The screen page at Figure 9 provides an example of the view that the participant will see when invoking an e-mail object. A window 160 pops up simulating the "in-box" of an e-mail system. In that window the participant can read an e-mail, file the e-mail in an appropriate folder, and if enabled, compose a response, select recipients and send the response. When the e-mail object is not active, it appears on the participant's display as an icon, showing the total number of e-mails that have arrived and those that have not yet been read. (A minimal state sketch of such an object is given after this list.)

- An interactive calendar object. Simulates a calendar and allows the participant to see a schedule of activities, such as a daily schedule, weekly schedule or a monthly schedule. The screen page at Figure 10 provides an example of the view that the participant will see when invoking a calendar object. A window 162 pops up simulating a calendar page. In that window the participant can determine a daily schedule, such as what meetings are planned, and, if enabled, the calendar allows the participant to create new events such as meetings and to modify existing events by cancelling or re-scheduling them. When the calendar object is not active, it appears on the participant's display as an icon.

- Telephone interactive object. Simulates a voice mail system. When invoked by the participant, a window pops up to provide in text form a voice mail message. Optionally, the window can present choices for possible answers, enabling the participant to select an answer. When the telephone interactive object is not active, it appears on the participant's display as an icon. In Figure 10, the icon is shown at 170. A counter also appears next to the icon to show the total number of voice mails that have been received and those that have not been read.

- Messenger interactive object. Simulates a messenger that delivers information to the participant. When invoked, the information appears in a text box on the participant's screen. Optionally, the information also provides the possibility for an answer. This is shown in the screen page example at Figure 11. The text box 180 provides a series of possible answers the participant can choose from. The selection of an answer is made by clicking on the appropriate selection. Larger text can be accommodated by a larger text box, as shown at Figure 12. When the messenger interactive object is not active, it appears on the participant's display as an icon. In Figure 11, the icon is shown at 182.

- Knowledge bank object. Simulates a database that delivers information to the participant. When invoked, the information appears in a text box on the participant's screen. When the knowledge bank interactive object is not active, it appears on the participant's display as an icon. In Figure 11, the icon is shown at 184.
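
As referenced in the e-mail object item above, the following minimal Python sketch models the two states of such an interactive object: an icon with counters when inactive, and an opened in-box when invoked. The class and field names are assumptions made for illustration; the patent does not disclose this data model.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class SimulatedEmail:
    subject: str
    body: str
    read: bool = False

@dataclass
class EmailObject:
    """Hypothetical state of the interactive e-mail object."""
    inbox: List[SimulatedEmail] = field(default_factory=list)
    active: bool = False                 # False -> rendered as an icon

    def icon_counters(self) -> Tuple[int, int]:
        """(total e-mails arrived, e-mails not yet read), shown on the icon."""
        unread = sum(1 for m in self.inbox if not m.read)
        return len(self.inbox), unread

    def open(self, index: int) -> str:
        """Invoke the object, open one e-mail and mark it as read."""
        self.active = True
        message = self.inbox[index]
        message.read = True
        return message.body
```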

The avatars class 118 of objects can be selected via the window 110. In a specific example of implementation, an avatar is a graphical representation of a colleague or coworker the participant interacts with during the simulation. For example, an avatar can be caused to take different postures and also talk to the participant via speakers or any other suitable sound producing device.
The avatars can also show a limited degree of animation.
When they talk, the lips move in accordance with the speech.

As shown at the screen page on Figure 13, several avatar choices are available to the author when designing the virtual simulation. The avatar choices are listed in the window 110. The avatars are distinguished from one another on the basis of:

- Sex. Male or female avatars can be selected.

- Race. Avatars of different races can be selected, such as Caucasian, Black, Hispanic, American Indian or Alaskan native, or Asian or Pacific Islander.

- Dress code. Avatars can be dressed differently, such as with business attire, casually or with a uniform.

- Posture. An avatar can be standing, sitting or have some other posture.

The screen page at Figure 14 illustrates the manner in which the author can determine the relative position (foreground or background) of the various objects selected for the scene, namely the background, the interactive objects and the avatar. The depth control is accessed via the "Depth" button 190 that appears in Figure 5, for example. When this button is pressed via the pointer device, the screen page at Figure 14 appears with a window 200 that lists all the objects that have been selected for the scene. The order in which the objects are listed is the order in which they are graphically overlaid in the viewing pane. For instance, the office and desk that constitute the scenery are at the top of the list and in the background of the image. Calendar object 202 overlays the office and desk scenery. The e-mail object 204 is in front of the calendar object 202. The e-mail button object 206 (to invoke the e-mail object) is in front of the e-mail object 204. The messenger object 208 is in front of the e-mail button object 206. The voicemail object 210 is in front of the messenger object 208. The knowledge bank 212 is in front of the voicemail object 210. The office desk 214 is in front of the knowledge bank 212. Finally, the avatar 216 is in front of the office desk object 214. Since the avatar 216 is at the foreground of the image, it appears visually in front of everything else.

The relative visual position of any one of the objects in the stack can be controlled via the up and down arrows 218. For instance, the object is selected and, by clicking the appropriate arrow, its position is changed toward the background or toward the foreground of the image. For instance, as shown in Figure 16, the position of the avatar 216 has been changed to lie behind the office desk 214.
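
A simple way to picture this depth control is a list ordered from background to foreground, where the arrows 218 move one object by one position. The sketch below is illustrative only; the object names follow the example of Figure 14.

```python
# Hypothetical depth stack, ordered from background (first) to foreground (last).
scene_stack = [
    "office_and_desk_background",
    "calendar_object",
    "email_object",
    "email_button",
    "messenger_object",
    "voicemail_object",
    "knowledge_bank",
    "office_desk",
    "avatar",
]

def move(stack, name, towards_background=True):
    """Move an object one step towards the background or the foreground."""
    i = stack.index(name)
    j = i - 1 if towards_background else i + 1
    if 0 <= j < len(stack):
        stack[i], stack[j] = stack[j], stack[i]

# Example: place the avatar behind the office desk, as in Figure 16.
move(scene_stack, "avatar", towards_background=True)
```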

The screen pages at Figures 16 to 28 provide the author with control over the behavior of the objects previously selected for the scene or scenes. The basic control is achieved via the screen page at Figure 16. The action taking place during the simulation is defined in terms of events. In the current example, a simulation can therefore be defined as a series of events in the context of scenes. So, once the author has created one or more scenes that will be used in the virtual simulation, the next step is to create the events.

The screen page at Figure 16 has two main components. The first main component is a window pane 300 where the relationship between the various events is graphically shown. In other words, the window pane 300 displays an event tree 302 showing the logical relationship between the events and also describes the flow of events that takes place as the simulation is being run by the participant. The event tree is generated automatically as the various events are created. As the author specifies characteristics of each event, those characteristics are noted and logical relationships defined and graphically represented in the window pane 300.

The generation of the event tree is done by an event tree generator which is not apparent to the user, except of course for its output which is the event tree 302. A
flowchart of the event tree generator is shown in Figure 50. The event tree generator 400 is a software module that takes as input the various parameters that define each event to determine how to create the event tree.
Specifically, the event tree generator 400 determines the order of the events in the simulation. Second, on the basis of the logic of the various events, links are created between the events to create a path from one event to the next event in the simulation. The links are particularly useful when branching points are encountered.
A branching point occurs when the current event leads to two or more subsequent events, where the choice of the path to follow depends on some condition. An example of a condition is an action taken by the participant. For instance, consider the situation where the participant is presented with an e-mail message and a voice mail message.
If the participant elects to open the e-mail message, the event flow will go along one branch which includes further events associated with the e-mail message. On the other hand, if the participant elects to open the voice mail message then the event flow will go along another branch which includes further events associated with the voice mail message.
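
A minimal sketch of the data an event tree generator might work from is shown below: each event declares its possible successors and the condition attached to each branch. The event names and conditions are invented for the e-mail/voice mail example above; the actual generator described in Figure 50 is not disclosed at this level of detail.

```python
# Hypothetical event definitions: name -> possible next events with conditions.
events = {
    "start": {
        "next": [
            ("email_branch", "participant opens the e-mail first"),
            ("voicemail_branch", "participant opens the voice mail first"),
        ],
    },
    "email_branch": {"next": [("end", None)]},
    "voicemail_branch": {"next": [("end", None)]},
    "end": {"next": []},
}

def build_links(event_table):
    """Return (parent, child, condition) links, the data behind the drawn tree."""
    links = []
    for name, spec in event_table.items():
        for child, condition in spec["next"]:
            links.append((name, child, condition))
    return links

for parent, child, condition in build_links(events):
    print(f"{parent} -> {child}" + (f"  [{condition}]" if condition else ""))
```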

The output of the event tree generator 400 appears in the window pane 300 shown in Figure 16. Individual events are shown by event boxes 304 with labels therein to identify them. The order of the events 304 is along a vertical axis, from top to bottom. In other words, the event box 304 at the top will be run before the event boxes 304 below. The links between the events appear as lines 306 between the respective event boxes 304. In a simple event tree as shown in Figure 16, the path from one event to the other is linear. In more complex event trees, of the type shown in Figures 37 and 38, where branching points exist, the links 306 allow the author to visually identify the possible pathways that arise from an event that defines a branching point.

In addition to graphically illustrating the flow of events and their logical relationship, the event tree 302 also provides a convenient navigation tool to select any one of the events in the simulation. The selection is simply done by clicking with a pointer the event box 304 corresponding to the desired event, and this causes the area 400 at the right of the viewing pane 300 to fill with information relating to the selected event. The area 400 contains several areas of information, namely a block of fields 402 that identify the event and the scene to which the event is associated. Below the block of fields 402 is provided a set of small windows 404 and 406 that allow programming the actions or the behavior of various objects for the event. Specifically, the window 404 is in the form of a menu listing the various objects that have been previously specified for the scene. The window 406 displays the actions selected for the objects.

The screen page at Figure 17 provides a more detailed example. The field 408 immediately above the window 404 shows that the e-mail object has been selected. The list of objects that appeared previously in the window 404 was replaced by a list of actions or behaviors that are available for that object. The author therefore can select among the available list those that are most suitable for the desired activity.

The screen page at Figure 18 shows that the "StackEmail" action has been selected. This allows the author to determine the content of an e-mail message that the participant will receive during the simulation. When this selection is made, a box pops up in the window pane that allows specifying the characteristics of the e-mail message that the participant will receive. The following characteristics can be defined:

- The e-mail number.
- The originator of the e-mail.
- The person to whom the e-mail is sent.
- Individuals copied in the e-mail.
- The time at which the e-mail is received.
- The subject of the e-mail.
- The folder in which the e-mail is classified.
- The priority of the e-mail message.
- The message contained in the e-mail.

As shown in the screen page at Figure 19, the author can define certain other parameters of the e-mail message by invoking the box 500, whose check boxes 502 allow specifying whether the participant can reply to the e-mail message, create a new mail message, forward the e-mail message or delete the e-mail message.
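
A hedged sketch of the data the StackEmail action might capture, combining the characteristics listed above with the permission check boxes of box 500, is given below. The field names and values are illustrative assumptions; the patent does not disclose an exact schema.

```python
# Hypothetical record produced by the "StackEmail" action.
stacked_email = {
    "number": 1,
    "originator": "operations.manager@example.com",
    "to": "participant@example.com",
    "cc": [],
    "received_at": "09:15",
    "subject": "Schedule conflict this afternoon",
    "folder": "Inbox",
    "priority": "high",
    "body": "Two meetings overlap at 2 pm. How do you want to handle it?",
    # Box 500 check boxes: what the participant is allowed to do.
    "allow_reply": True,
    "allow_new_message": False,
    "allow_forward": True,
    "allow_delete": False,
}
```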

The behavior of the interactive objects listed earlier, except for avatars, which will be discussed later, can be defined in a similar fashion. For each object, the virtual simulation authoring tools module provides a set of available functions or actions from which the author can choose. For example, in the case of a voice mail object, the author can define the content of the message stored and provide the participant with the option to respond or not respond. If the option to respond is enabled, a list of possible responses or actions can be provided so that the participant can choose the one he/she thinks is the most appropriate to the situation. In the case of a calendar object, the author can define entries, such as meetings, that can be read by the participant. Again, the calendar object can be enabled to receive input from the participant, such as creating new events, altering existing ones or deleting them.

When an action in connection with an object is selected, a code generator (not shown) in the virtual simulation authoring tools module produces the necessary lines of program code such that during the execution of the simulation the desired behavior will be obtained. The language produced by the code generator can be any suitable high level language. Languages that can be interpreted by expression parsers have been found suitable.

While the use of a code generator is very practical and can be used for defining often used actions, there are some limitations to the range of options that are available. For that reason, a code editor is provided allowing the author to input code that determines the behavior of the object and that can be used to obtain some very specific and sophisticated actions. The screen page at Figure 39 illustrates the code editor window 600 open. The author can input lines of code that specify desired actions and save them such that when the simulation is run the actions will be produced.

A specific control exists that allows determining, at the end of a given event, which event will follow next. This is shown at the screen page of Figure 28. The box 700 provides a list of options in terms of conditions that may arise and the corresponding event to be triggered. In this specific example, the participant is asked a question in the current event and the event that will follow next is determined on the basis of the answer provided to the question. If response 1 is provided then the event that will be triggered is event number 4. In contrast, if response 2 is provided then event number 5 follows. This is a simple example of branching where the branching condition is the response to a question. Since only two possible answers (response 1 and response 2) define the branching tree, the box 700 contains only two entries. It will be plain to a person skilled in the art that many more entries can be provided to create more complex and sophisticated conditions to determine which limb of the branch to follow depending upon the actions taken by the participant.
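
A minimal sketch of the mapping behind the box 700 entries is shown below, with the event numbers taken from the example above; the dictionary form is an illustrative assumption.

```python
# Hypothetical branching table: answer given in the current event -> next event.
NEXT_EVENT_BY_RESPONSE = {
    "response 1": 4,    # response 1 triggers event number 4
    "response 2": 5,    # response 2 triggers event number 5
}

def next_event(response: str) -> int:
    return NEXT_EVENT_BY_RESPONSE[response]
```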

In general, the following occurrences during the simulation can be used in determining the path to follow at a branching point:

- The response by the participant to a question.

- The time taken by the participant to perform a certain action or respond to a question. For instance, if the participant does not know what to do, any subsidiary questions to the main question are bypassed and the simulation flow moves to a different event.

- The choice made by the participant in interacting with objects. For example, the participant may be faced with a voice mail message and an e-mail message. The selection of which one will be handled first determines the event that will be triggered subsequently.
The ability to create branching points in the events flow constitutes a very powerful tool that enables the creation of dynamic simulations that can create a higher level of challenge for the participant. Accordingly, the results of those dynamic simulations can provide a better assessment of the suitability of the participant for a job, for example. The window panes 300 in Figures 37 and 38 provide more detailed examples of branching situations. In Figure 37, three branching points exist in the event tree. The first branching point originates at the start event box 304(a). If a particular condition arises there, the simulation aborts by leading to the end event box 304(b).

The second branching point occurs at the event 304(c) which corresponds to a question. Depending upon the answer to the question, the event flow can continue with subsidiary questions at events 304(d), (e) or (f). The last branching point is at event box 304(g). This is a reverse branch where the limbs lead to a common point.
Here, the event boxes 304 (d), (e) and (f) all lead to the common event box 304 (g).

Similar branching point situations arise in the event tree on the screen page at Figure 38 (which is a continuation of the event tree of Figure 37).

The screen pages at Figures 20 to 26 allow defining the behavior of an avatar. Generally, there are two main aspects or properties of the avatar that can be programmed. One is the animation and the other is the speech.

The animation can be programmed via an animation control window 1000 shown in Figure 20. The animation control window shows an animation control interface 1002 allowing the author to determine the initial posture of the avatar as well as the manner in which the posture will vary throughout the interaction with the participant. The animation control interface includes a number of possible body movement selections. By "clicking" on one of those options, the movement is automatically built into the simulation such that when the simulation is run the avatar will, at the appropriate time, perform the desired body movement. To facilitate the selection of the body movements, the animation control interface provides a visual framework 1004 that mimics the shape of the human body, on which are overlaid body movement selection controls 1006. The body movement selection controls 1006 are distributed over the visual framework 1004 according to the body parts to which motion is to be imparted. In other words, the controls 1006 for head movement are located over or adjacent the head part of the visual framework 1004, the controls 1006 for the left hand movement are located over the left hand part of the visual framework 1004, etc. This creates an input interface that is very intuitive to use and allows the author to program body movements very quickly and efficiently.

The body movement selection controls 1006 are in the form of boxes with labels therein identifying the movement they produce. Some or all of the boxes are also associated with arrows 1008 that provide an additional visual cue about the motion produced.

The lower part of the animation control window 1000 is provided with a set of controls 1010 allowing the author to play the motions that have been selected, so as to see whether the desired animation has been produced. When this control is activated, the avatar that appears at the right of the animation control window 1000 performs the body movements in the same fashion as they would appear to a participant when the simulation is run.

Specific examples of body movements are shown on the screen pages at Figures 22, 23 and 24. The screen page at Figure 22 shows the body movement produced when the control 1006(a) has been selected. The body movement is an extension of the right hand of the avatar. The screen page at Figure 23 shows a "hello" body movement produced by selecting the control 1006(b). Finally, the screen page at Figure 24 shows a finger pointing body movement produced by selecting the control 1006(c).

The speech uttered by the avatar can be programmed via the interface shown on the screen pages at Figures 25 and 26. With reference to Figure 25 first, the user interface includes a selection control 1100 allowing the author to determine the characteristic pronunciation of the avatar. Specifically, the control 1100 is in the form of a menu of choices, where different choices correspond to different characteristic pronunciations. Characteristic pronunciations in the list are distinguished on the basis of sex (male, female), social background and native language, among others.

The screen page also has an input window 1102 in which the text to be uttered by the avatar is typed by the author. Once the text is input, it is converted to speech by a text-to-speech engine according to the selected characteristic pronunciation. There may be independent text-to-speech engines for each characteristic pronunciation in the list of the control 1100, or a single text-to-speech engine with modifiable parameters according to the characteristic pronunciation that is selected. The text-to-speech engine can be part of the qualification management system server 12 or it may reside in the media server database 44. The output of the text-to-speech engine is an audio file in any appropriate format.

In addition to the audio file, the text-to-speech engine also outputs a lip synchronization code that determines how the avatar will move the lips to provide a realistic talking effect. The lip synchronization code is shown in the window 1104. The code is in the form of a series of ones and zeros, where a zero corresponds to a closed mouth and a one to an opened mouth. The lip synchronization code is derived from the audio file. The text-to-speech engine analyzes the audio file to determine where speech exists and where silence exists. Specifically, the audio stream is broken into small intervals, say 5 or 20 millisecond intervals. The text-to-speech engine analyses the intervals to determine if they contain active speech or silence. The techniques for performing this analysis are well known in the art and there is no need to describe them in detail. The lip synchronization code is produced by associating a zero to the intervals which contain silence and a one to those containing active speech.
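
A rough Python sketch of this interval-based coding is shown below. It uses a simple energy threshold on raw audio samples to stand in for the unspecified speech-activity detection; the function name, sample rate and threshold are assumptions made for illustration.

```python
# Hypothetical lip-synchronization coder: one digit per fixed audio interval,
# 0 for silence and 1 for active speech.
def lip_sync_code(samples, sample_rate=8000, interval_ms=20, threshold=0.02):
    """Return a string of 0s and 1s derived from the audio samples."""
    interval = int(sample_rate * interval_ms / 1000)
    code = []
    for start in range(0, len(samples), interval):
        chunk = samples[start:start + interval]
        energy = sum(s * s for s in chunk) / max(len(chunk), 1)
        code.append("1" if energy > threshold else "0")
    return "".join(code)

# Example with synthetic samples: 40 ms of "speech" followed by 40 ms of silence.
samples = [0.5, -0.5] * 160 + [0.0] * 320
print(lip_sync_code(samples))    # -> "1100"
```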
At the right of the window 1104 is provided a pictorial representation 1106 of the avatar's head. The author can play the audio file produced by the text-to-speech engine and, at the same time, the lips of the avatar head move according to the lip synchronization code. In this fashion the author can determine if the desired talking effect has been produced, either from a message point of view or from a visual point of view.

When a simulation is being run, the program logic that manages the simulation is also designed to track the activities of the candidate so as to be able to score or rate the candidate on the basis of his/her interactions with the simulation. As the simulation is being run, the program logic captures data points that characterize the manner in which the candidate interacts with the simulation, in particular with the interactive objects. Specifically, the program logic is a way to track mouse clicks, rollovers, scrolls, typed responses or other actions during a simulation. The measurement is highly granular, including what the candidate did, when he/she did it, in what order, reaction times, and who the candidate sent things to.
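
As a hedged illustration of how such granular tracking might be recorded, the sketch below appends one data point per participant action, with a timestamp and ordering from which reaction times can later be derived. The field names are invented; the patent lists only the kinds of data points captured.

```python
import time

tracking_log = []

def track(candidate_id, obj, action, detail=None):
    """Append one granular data point: who did what, on which object, and when."""
    tracking_log.append({
        "candidate": candidate_id,
        "object": obj,             # e.g. "email", "voicemail", "question_box"
        "action": action,          # e.g. "opened", "replied", "deleted"
        "detail": detail,          # e.g. recipient list, chosen answer
        "timestamp": time.time(),  # used to derive order and reaction times
        "order": len(tracking_log) + 1,
    })

track("cand-001", "email", "opened", detail={"email_number": 1})
track("cand-001", "email", "replied", detail={"to": ["manager@example.com"]})
```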

Examples of data points that can be captured during a simulation include:

- In connection with an e-mail interactive object:
  o if e-mail opened
  o time e-mail opened
  o number of e-mails opened
  o choice of e-mail opened
  o if document attached to e-mail was read or not
  o if reply, reply to all, forward, cc or bcc was selected
  o if subject line changed
  o if e-mail closed without action
  o if e-mail re-opened
  o if e-mail deleted
  o if e-mail transferred to folder and, in the affirmative, which folder

- In connection with an instant messenger interactive object:
  o if instant messenger window closed without action (if reply empty on close)
  o if contact added
  o if contact deleted
  o understanding text message acronyms such as AFK = Away From Keyboard
  o interaction between instant messenger and e-mail, such as copying and pasting messages from one into the other

- In connection with a telephone interactive object:
  o action taken immediately after call (did the candidate open the calendar interactive object to enter a task or meeting)
  o if a message was left
  o if the call was transferred
  o where the call was transferred

- In connection with a question box interactive object:
  o time at which the question was opened
  o answers entered by the candidate
  o if answers were changed during selection
  o number of times answers were changed
  o if the question was skipped (viewed but not answered)

- In connection with a voice mail interactive object:
  o time at which the voicemail was accessed
  o which answer to the voicemail was selected
  o was the message saved
  o was the message deleted
  o was the message transferred to another recipient's voicemail box
  o was a contact added
  o was a contact deleted
  o next action taken after the call (did the candidate open a calendar interactive object to enter a task or meeting immediately after the call)

Turning now to Figures 51 and 52, there is shown a flow chart describing an example of interaction of the candidate with the qualification management system server 12 over the Internet 14.

At step 2000, the candidate uses the client device 10 to access the qualification management system website implemented by the qualification management system server 12. The candidate logs into the website by providing some personal identification, either a special code or personal information such as, for example, his/her name, date of birth, email address and a personal password. This login can be effected via a login page provided with various fields in which the candidate can enter the information required to log into the website.
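As a non-limiting illustration, the login of step 2000 could accept either a special code or an email address and password, as in the following Python sketch; the credential storage and hashing scheme shown are assumptions made solely for the example.

import hashlib

SPECIAL_CODES = {"INVITE-0001"}          # placeholder invitation codes
REGISTERED = {                           # email -> salted password hash
    "a.candidate@example.com": hashlib.sha256(b"salt" + b"secret").hexdigest(),
}

def login(identifier, password=None):
    # Accept either a special code or an email address with a password.
    if identifier in SPECIAL_CODES:
        return True
    hashed = hashlib.sha256(b"salt" + (password or "").encode()).hexdigest()
    return REGISTERED.get(identifier) == hashed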

At step 2100, the qualification management system server 12 causes the candidate to be presented with a consent document describing terms that the candidate can accept or not accept. An example of a consent document is shown in Figure 53. In this example, the candidate can indicate his acceptance or non-acceptance by selecting an appropriate graphical box associated with the consent document.

At step 2200, the qualification management system server 12 determines whether the candidate accepts the terms of the consent document. If the candidate accepts, the process proceeds to step 2300; otherwise, the process ends and the candidate may receive a message indicating that his/her non-acceptance is such that the process cannot continue.

Assuming that the candidate accepts the terms of the consent document, at step 2300, the qualification management system server 12 causes prompting of the candidate to provide personal information about himself/herself. For example, the candidate may be prompted to provide personal information such as his/her name, contact information (e.g., telephone number, email address), home address, education (e.g., degrees obtained), work experience, work restrictions (e.g., authorization to work in a certain country), or other personal information. This prompting may be effected by presenting a series of questions to the candidate to obtain the required personal information. An example of a window prompting the candidate to provide personal information through one of a series of questions is shown in Figure 54. The qualification management system server 12 creates a record associated with the candidate and includes in that record the personal information provided by the candidate.
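By way of illustration only, the record created at step 2300 could take a form similar to the following sketch; the field names and types are assumptions made for the example, and the system described herein does not prescribe any particular data model.

from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class CandidateRecord:
    name: str
    email: str
    telephone: str = ""
    home_address: str = ""
    education: List[str] = field(default_factory=list)          # e.g. degrees obtained
    work_experience: List[str] = field(default_factory=list)
    work_restrictions: List[str] = field(default_factory=list)  # e.g. work authorizations
    assessment_results: Dict[str, float] = field(default_factory=dict)

The answers to the series of questions would then populate the record, for example CandidateRecord(name="A. Candidate", email="a.candidate@example.com", education=["B.Sc."]).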

At step 2400, the qualification management system server 12 presents to the candidate one or more positions that may be applied for and allows the candidate to view a description of each of these one or more positions. Each description may be presented in the form of a text document and/or a virtual simulation in order to give the candidate a more realistic idea of the corresponding position. For example, the candidate can select a given one of the one or more positions that he/she would like to view a description of by clicking on a graphical element (e.g., a link or button) associated with that position.

At step 2500, the qualification management system server 12 determines whether the candidate desires to view the description of at least one of the one or more positions that may be applied for. If so, the process proceeds to step 2600; otherwise, the process moves to step 2700.

At step 2600, which is performed as a result of the candidate having expressed a desire to view the description of at least one of the one or more positions that may be applied for, the qualification management system server 12 proceeds to present to the candidate the description of each position that he/she desired to view. As mentioned above, the description of a given position may be presented in the form of a text document and/or a virtual simulation.

At step 2700, the candidate selects a position that he/she desires to apply for (hereinafter referred to as "the selected position") from the one or more positions that may be applied for. For example, the candidate can select the selected position by clicking on a graphical element (e.g., a link or button) associated with that position.

At step 2800, in this example, the qualification management system server 12 proceeds to invite the candidate to complete an online qualification assessment task, which would assist in further assessing his/her suitability for the selected position. This can be effected by displaying a message inviting the candidate to complete the online qualification assessment task.

At step 2900, the qualification management system server 12 determines whether the candidate accepts to complete the online qualification assessment task. If the candidate accepts, the process proceeds to step 3000;
otherwise, the process ends and the candidate may receive a message indicating that his/her non-acceptance is such that the process cannot continue.

Assuming that the candidate accepts to complete the online qualification assessment task, at step 3000, the qualification management system server 12 causes the candidate to be presented with a series of text based tests and/or interactive scenarios. An interactive scenario is a virtual simulation designed to present information to measure or assess certain predefined qualities of the candidate. With each scenario, the candidate is given the opportunity to choose an answer or a course of action in response to a question or situation that arises. As the candidate goes through the series of text based tests and/or interactive scenarios, the qualification management system server 12 captures the candidate's response to each text based test and/or interactive scenario. The scoring function of the qualification management system server 12 uses the captured information in order to rate the candidate's performance in completing the online qualification assessment task. Further detail regarding step 3000 is described in connection with Figure 55.

With additional reference to Figure 55, step 3000 is the assessment step where the candidate is assessed to determine if his/her character, behavior, knowledge or experience is suitable for the position. Specifically, at step 3002 the candidate is provided with a first text based test and requested to answer the questions on the test. Assume for the sake of this example that this test is a multiple choice test and each question is provided with a series of possible answers. The candidate selects the answers by "clicking" with the pointer device on the choices he/she thinks are the correct choices. As discussed previously, the text based test is a module 32 associated with a construct 34. The construct includes logic to capture the responses of the candidate and also to score the responses according to an interpretation key.
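As a non-limiting illustration, the scoring of a multiple choice test against an interpretation key could resemble the following sketch, in which the key simply maps each question to its expected choice; the key format and the equal weighting of questions are assumptions made for the example.

def score_multiple_choice(responses, interpretation_key):
    # Both arguments map a question identifier to a selected choice,
    # e.g. {"q1": "b", "q2": "d"}. Returns the fraction of correct answers.
    if not interpretation_key:
        return 0.0
    correct = sum(1 for question, expected in interpretation_key.items()
                  if responses.get(question) == expected)
    return correct / len(interpretation_key)

For example, score_multiple_choice({"q1": "b", "q2": "c"}, {"q1": "b", "q2": "d"}) returns 0.5.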

The process continues at step 3004 where the candidate is presented with a second text based test. The process performed here is similar to step 3002.

Step 3006 is a conditional step where a determination is made based on the candidate's score on the second text based test. The rationale for this conditional step is that if the score is outside a certain range, the candidate is deemed unsuitable for the position and there is no point in continuing with the assessment any further. Accordingly, if the conditional step 3006 is answered "no", then the process flow branches out and the assessment is terminated. On the other hand, if the candidate's score is within the range considered acceptable, then the process continues to step 3008. Step 3008 is the last step of the assessment process, where a virtual simulation is presented to the candidate, the interactions of the candidate with the virtual simulation are collected and a score is generated.
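By way of illustration only, the flow of Figure 55 could be sketched as follows; the acceptable score range, the helper functions administer_test and run_virtual_simulation, and their return values are hypothetical placeholders, not elements of the system described herein.

def administer_test(candidate, test_id):
    # Placeholder: present a text based test and return a score between 0 and 1.
    return 0.0

def run_virtual_simulation(candidate):
    # Placeholder: run the virtual simulation of step 3008 and return a score.
    return 0.0

def run_assessment(candidate, acceptable_range=(0.6, 1.0)):
    score_1 = administer_test(candidate, "first_text_based_test")    # step 3002
    score_2 = administer_test(candidate, "second_text_based_test")   # step 3004
    low, high = acceptable_range
    if not (low <= score_2 <= high):                                 # step 3006
        return {"completed": False, "scores": [score_1, score_2]}
    score_3 = run_virtual_simulation(candidate)                      # step 3008
    return {"completed": True, "scores": [score_1, score_2, score_3]}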

Continuing with Figure 52, at step 3100, in this example, on completion of the assessment task, the qualification management system server 12 produces a report on the candidate's performance using the information captured as part of the series of text based tests and/or interactive scenarios effected in step 3000.

At step 3200, in this example, the qualification management system server 12 sends to the candidate or another party (e.g., a recruiter) an email acknowledging the completion of the assessment task and, at the option of the manager of the system, the result given by the scoring function, the performance report generated at step 3100 or a report derived therefrom.
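As a purely illustrative sketch using Python's standard email and smtplib modules, the acknowledgement email of step 3200 could be assembled as follows; the addresses, server name and message wording are placeholders.

import smtplib
from email.message import EmailMessage

def send_completion_email(candidate_email, recruiter_email, report_text,
                          smtp_host="smtp.example.com"):
    msg = EmailMessage()
    msg["Subject"] = "Online qualification assessment completed"
    msg["From"] = "no-reply@example.com"
    msg["To"] = ", ".join([candidate_email, recruiter_email])
    # The scoring result or performance report is included at the option of
    # the manager of the system; here it is simply appended to the body.
    msg.set_content("The assessment task has been completed.\n\n" + report_text)
    with smtplib.SMTP(smtp_host) as server:
        server.send_message(msg)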

It will be appreciated that, in some embodiments, certain functionality of a given component described herein (including the online qualification management server 12) may be implemented as pre-programmed hardware or firmware elements (e.g., application specific integrated circuits (ASICs), electrically erasable programmable read-only memories (EEPROMs), etc.) or other related elements. In other embodiments, a given component described herein (including the online qualification management server 12) may comprise a processor having access to a code memory which stores program instructions for operation of the processor to implement functionality of that given component. The program instructions may be stored on a medium which is fixed, tangible, and readable directly by the given component (e.g., removable diskette, CD-ROM, ROM, fixed disk, USB key, etc.). Alternatively, the program instructions may be stored remotely but transmittable to the given component via a modem or other interface device connected to a network over a transmission medium. The transmission medium may be either a tangible medium (e.g., optical or analog communications lines) or a medium implemented using wireless techniques (e.g., microwave, infrared or other wireless transmission schemes).

It will also be appreciated that, while certain examples have been presented, the management system contemplated herein may be used in various applications, including, for example, recruiting, job description simulation, training and coaching applications.

Although various embodiments have been illustrated, this was for the purpose of describing, but not limiting, the invention. Various modifications will become apparent to those skilled in the art and are within the scope of this invention, which is defined more particularly by the attached claims.

Claims (4)

1. An online qualification management system.
2. An authoring tool to produce an online qualification management system.
3. A method to produce an online qualification management system.
4. A method to perform qualification management via an online system.
CA 2574575 2006-10-18 2007-01-19 Online qualification management system Abandoned CA2574575A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US86203406P 2006-10-18 2006-10-18
US60/862,034 2006-10-18

Publications (1)

Publication Number Publication Date
CA2574575A1 true CA2574575A1 (en) 2008-04-18

Family

ID=39315261

Family Applications (1)

Application Number Title Priority Date Filing Date
CA 2574575 Abandoned CA2574575A1 (en) 2006-10-18 2007-01-19 Online qualification management system

Country Status (1)

Country Link
CA (1) CA2574575A1 (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111339188A (en) * 2020-02-20 2020-06-26 百度在线网络技术(北京)有限公司 Block chain-based media content processing method, apparatus, device, and medium
CN111339188B (en) * 2020-02-20 2023-10-31 百度在线网络技术(北京)有限公司 Media content processing method, device, equipment and medium based on blockchain
CN117709780A (en) * 2023-12-11 2024-03-15 北京谦润和科技有限公司 High tension switchgear construction qualification's audit management system based on artificial intelligence

Similar Documents

Publication Publication Date Title
Ober et al. Contemporary business communication
Wilson Interview techniques for UX practitioners: A user-centered design method
Jung et al. Towards an ecological account of media choice: a case study on pluralistic reasoning while choosing email
Hooley et al. What is online research?: Using the internet for social science research
van der Goot et al. Customer service chatbots: A qualitative interview study into the communication journey of customers
Nardon et al. Valuing virtual worlds: The role of categorization in technology assessment
US11043136B2 (en) Personality-type training system and methods
US20110047528A1 (en) Software tool for writing software for online qualification management
Ostergaard et al. An experimental methodology for investigating communication in collaborative design review meetings
Canavor Business writing for dummies
Rioux Information acquiring-and-sharing in Internet-based environments: An exploratory study of individual user behaviors
Mahyar et al. Designing technology for sociotechnical problems: challenges and considerations
Gibson Effective help desk specialist skills
CA2574575A1 (en) Online qualification management system
Ten Social media as an internal communication tool in project management practices.: exploring an impact of social media use on employee communication in small and medium-sized companies in Uzbekistan
Deng et al. Understanding Practices, Challenges, and Opportunities for User-Driven Algorithm Auditing in Industry Practice
McIntosh et al. Interpersonal communication skills in the workplace
Kottorp et al. Chatbot as a potential tool for businesses: a study on chatbots made in collaboration with Bisnode
Bridgewater et al. Instant messaging reference: a practical guide
Becker et al. Cooperative solidarity among crowdworkers? Social learning practices on a crowdtesting social media platform
Mohlin Professional Isolation and Connectedness in Computer Supported Cooperative Work Systems: A Focused Ethnographic Study of Knowledge Workers Working from Home
McQueen The effect of voice input on information exchange in computer supported asynchronous group communication
Mattison Virtual teams and e-collaboration technology: A case study investigating the dynamics of virtual team communication
McRae Technology and Organizational Decision-Making: A Qualitative Case Study Approach
Kent III User Perceptions of the Impact of Anonymity on Collaboration Using Enterprise Social Media

Legal Events

Date Code Title Description
FZDE Dead