WO2017058205A1 - Monitoring a user during an examination - Google Patents

Monitoring a user during an examination

Info

Publication number
WO2017058205A1
Authority
WO
WIPO (PCT)
Prior art keywords
user
computing device
baseline
facing camera
downward
Prior art date
Application number
PCT/US2015/053249
Other languages
English (en)
Inventor
Bish GUHA
Tong Zhang
Jerry Liu
Original Assignee
Hewlett-Packard Development Company, L.P.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hewlett-Packard Development Company, L.P. filed Critical Hewlett-Packard Development Company, L.P.
Priority to PCT/US2015/053249 priority Critical patent/WO2017058205A1/fr
Publication of WO2017058205A1 publication Critical patent/WO2017058205A1/fr

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q50/10Services
    • G06Q50/20Education

Definitions

  • Online education has become prevalent due to the increasing reliability of computing devices and the internet. For example, students may have the option of taking online courses, where the student may use their computing device to access content related to those online courses, such as course materials, lectures, and the like.
  • FIG. 1 is a block diagram of an example system for monitoring a user during an examination
  • FIG. 2 is a block diagram of an example computing device for monitoring a user during an examination
  • FIG. 3 is a block diagram of an example computing device for monitoring a user during an examination.
  • FIG. 4 is a flowchart of an example method for monitoring a user during an examination.
  • a computing device may identify a user (e.g., a student) who is about to begin an examination and may monitor the user during an examination period by monitoring various user activities and features of a user while the user is taking an examination.
  • An examination period may be the duration of time during which a user takes an examination via the computing device.
  • the computing device may include one or more downward-facing cameras to monitor user activities along a horizontal surface of the computing device (e.g., to monitor a user's writing stream as the user writes along the horizontal surface), one or more forward-facing cameras to monitor the user's features while the user takes the examination (e.g., monitor head position, eye position, etc.), a microphone to monitor the surrounding audio while the user takes the examination, and/or a pen input (e.g., a digital pen input) to receive inputs from a pen device that a user may use for writing on the horizontal surface.
  • a baseline is a set of data associated with a user and that may be used to identify the user, where the set of data is used as a reference point to compare with user activity that is monitored during an examination period.
  • a baseline may include data associated with the user's appearance, the user's writing technique and style, the noise surrounding the user, and the like.
  • the user may be prompted to place the user's school-provided photo identification card and the user's palms (e.g., palms facing down) on the horizontal surface below the downward-facing camera.
  • the user may also be prompted to look toward the display device of the computing device, which may also be toward the forward-facing camera.
  • the computing device may take appropriate video and/or still images using the downward-facing camera and/or the forward-facing camera, as well as record the ambient sound received by the microphone of the computing device.
  • the computing device may also prompt the user to sign their name on the horizontal surface using a pen device that may be detectable by the pen input of the computing device, and the user's writing stream while the user is signing their name may be recorded.
  • the user may wear any suitable biometric wearable devices such that the computing device may measure and monitor the user's physiological state (e.g., heartrate, galvanic skin response, etc.). Any suitable data collected may be used to generate the baseline for the user.
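  • The baseline-capture step above is described only functionally; as an illustrative sketch (not the patent's implementation), the baseline could be modeled as a small record of the captured reference data. All class, field, and key names below are hypothetical.

```python
from dataclasses import dataclass
from typing import List, Optional, Tuple

@dataclass
class UserBaseline:
    """Reference data captured before the examination period (illustrative only)."""
    user_id: str
    id_card_image: bytes                    # downward-facing camera: photo ID on the horizontal surface
    palm_image: bytes                       # downward-facing camera: palms facing down
    face_image: bytes                       # forward-facing camera: user looking toward the display
    signature_stream: List[Tuple[float, float, float]]  # pen input: (timestamp, x, y) samples of the signed name
    ambient_noise_level: float              # microphone: reference ambient sound level
    heart_rate: Optional[float] = None      # optional biometric wearable reading

def create_baseline(user_id: str, captures: dict) -> UserBaseline:
    """Assemble a baseline from the prompted captures; the `captures` keys are assumed."""
    return UserBaseline(
        user_id=user_id,
        id_card_image=captures["id_card"],
        palm_image=captures["palms"],
        face_image=captures["face"],
        signature_stream=captures["signature"],
        ambient_noise_level=captures["ambient_level"],
        heart_rate=captures.get("heart_rate"),
    )
```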
  • the computing device may verify the identity of the user with the user's school (e.g., via an online verification system) by verifying any suitable credentials of the user (e.g., the user's image, identification card, fingerprint, student identification number, etc.) with data managed by the school.
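  • As a hedged sketch of how such an online verification could be invoked: the endpoint URL, payload fields, and response format below are hypothetical, not part of the disclosure or of any real school system.

```python
import json
from urllib import request

def verify_with_school(student_id: str, credentials: dict,
                       verify_url: str = "https://school.example/api/verify") -> bool:
    """POST collected credentials to a hypothetical verification service; return True if confirmed."""
    payload = json.dumps({"student_id": student_id, "credentials": credentials}).encode("utf-8")
    req = request.Request(verify_url, data=payload,
                          headers={"Content-Type": "application/json"})
    with request.urlopen(req) as resp:   # would fail at runtime without a real endpoint
        return json.load(resp).get("verified", False)
```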
  • the user may begin reading and/or listening to the test questions.
  • the test questions may be delivered to the user in any suitable manner, such as on a physical test sheet, on the display of the computing device, through audio, and the like.
  • the computing device may prompt the user to place an answer sheet on the horizontal surface below the downward-facing camera, and the user may use the pen device to write answers on the answer sheet.
  • the answer sheet may be projected onto the horizontal surface of the computing device using a projector device of the computing device, and the pen device and pen input may be used to capture answers on the horizontal surface (e.g., such that no physical answer sheet is needed).
  • the user's response to questions may be in any suitable format
  • the response may include manipulating physical objects under the downward-facing camera (e.g., a dissection, building objects using physical blocks, etc.).
  • the computing device may monitor and/or record various user activities by taking and/or recording photos, video, audio, and/or pen inputs that may be associated with the user activities.
  • a user activity is used herein to refer to any activity associated with a user that may be monitored and/or recorded during an examination period, such as a user's physical appearance, a user's orientation and/or movement, the background appearance surrounding the user, audio detected during the examination period, pen inputs received during the examination period (e.g., a writing stream), and the like.
  • video captured during the examination period may be displayed in a portion of the display of the computing device while the user takes the examination. For example, a video being captured may be displayed in a corner of the display with a visible note to indicate that recording is in progress.
  • a trigger event may be any event associated with a user activity that may suggest any sort of impropriety on the part of the user, where the trigger event may be detected by identifying deviations from the user's baseline that exceed a particular threshold.
  • Examples of trigger events that may be detected based on the user's baseline may include the user's head being rotated in a particular manner (e.g., user's head rotated by more than a certain angle while the user is looking down), the user's eyes directed toward a particular direction, certain movements of the user, the user not being visible by a camera at any given time, an unexpected change in ambient sound, an unexpected voice, a change in the user's appearance (e.g., a change in the user's face and/or hand), an interruption and/or an anomaly in the writing stream of the user, and the like.
  • a rating associated with the user's examination period may be calculated based on the trigger events detected, where the rating may be any suitable indicator of the degree of impropriety and/or trustworthiness of the user's examination period. For example, when an examination period begins, a user may start with a rating of 100, and for each trigger event (or for a combination of trigger events occurring simultaneously or within a particular amount or range of time), the computing device may reduce the rating by an appropriate amount, based on the nature of the trigger event(s). In some examples, each time the rating is reduced, the corresponding timestamp associated with the trigger event is stored. At the end of the examination period, the computing device may calculate the overall rating for the examination period.
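  • A minimal sketch of that rating arithmetic, assuming a hypothetical per-event penalty table (the disclosure does not specify the amounts):

```python
from datetime import datetime

# Hypothetical penalty weights per trigger-event type (not given in the disclosure).
PENALTIES = {
    "head_rotation": 5,
    "gaze_away": 3,
    "user_not_visible": 10,
    "unexpected_voice": 8,
    "writing_interruption": 4,
}

def score_examination(trigger_events, start_rating=100):
    """Deduct a penalty for each trigger event and keep the timestamp of each deduction."""
    rating = start_rating
    deductions = []
    for event in trigger_events:                     # event: {"type": ..., "timestamp": ...}
        penalty = PENALTIES.get(event["type"], 1)
        rating = max(0, rating - penalty)
        deductions.append((event["timestamp"], event["type"], penalty))
    return rating, deductions

# Example: two trigger events detected during the period.
events = [
    {"type": "gaze_away", "timestamp": datetime(2015, 9, 30, 10, 12)},
    {"type": "user_not_visible", "timestamp": datetime(2015, 9, 30, 10, 40)},
]
print(score_examination(events))   # -> (87, [(..., 'gaze_away', 3), (..., 'user_not_visible', 10)])
```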
  • after the examination period, the computing device may send data associated with the examination period to the user's school. For example, the computing device may send the rating associated with the examination period, data associated with the examination (e.g., a test score, the user's test answers, etc.), data associated with trigger events (e.g., a timestamp, audio, video, photos, writing streams, etc. associated with each trigger event), and the like.
  • the data collected from each student may be collectively used to create an examination-related baseline that may be used to detect any anomalies from each individual test taker.
  • the computing device may follow certain procedures to ascertain its own fidelity before, during, and/or after an examination period.
  • the computing device may have a Basic Input/Output System (BIOS)-level protection to identify potential tampering of the computing device's hardware and/or software and flag any detected abnormalities.
  • the data associated with the abnormality may be sent to the school.
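  • The fidelity check is described only at a high level; one plausible illustration (an assumption, not the BIOS-level mechanism itself) is to hash protected components and compare them against known-good digests, recording any abnormality found:

```python
import hashlib

# Placeholder manifest: the paths and expected SHA-256 digests are purely illustrative.
EXPECTED_DIGESTS = {
    "/opt/exam/monitor.bin": "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def check_integrity(expected=EXPECTED_DIGESTS):
    """Return abnormality records for components that are missing or whose digest mismatches."""
    abnormalities = []
    for path, expected_digest in expected.items():
        try:
            with open(path, "rb") as f:
                actual = hashlib.sha256(f.read()).hexdigest()
        except FileNotFoundError:
            abnormalities.append({"component": path, "issue": "missing"})
            continue
        if actual != expected_digest:
            abnormalities.append({"component": path, "issue": "digest mismatch"})
    return abnormalities
```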
  • FIG. 1 is a block diagram of an example system 100 for monitoring a user during an examination.
  • System 100 may include computing device 102, which may be any suitable computing device in communication with any systems, such as the backup system 110 and/or the school administration system 112.
  • Computing device 102 may include a processor 104 and an examination management engine 106 that may be any suitable engine that may cause the processor 104 to monitor a user during an examination.
  • Computing device 102 may also include a downward-facing camera 108, which may be any suitable camera(s) to capture content (e.g., photos and/or videos) along a horizontal surface of the computing device 102.
  • the downward-facing camera 108 may be at least one camera oriented such that it faces downward toward the horizontal surface of the computing device 102.
  • the examination management engine 106 of the computing device 102 may cause the processor 104 to monitor a user during an examination in any suitable manner.
  • the examination management engine 106 may cause the processor 104 to generate a baseline associated with a user, where the baseline is generated based on an image of the user's hand on the horizontal surface captured by the downward-facing camera 108.
  • the processor 104 may monitor the user using the downward-facing camera 108 and, based on the baseline, may determine whether any trigger events occur.
  • the processor 104 may calculate a rating based on the trigger events, where the rating may be any suitable indicator of the degree of impropriety and/or trustworthiness of the user's examination period.
  • the computing device 102 may be in communication with the backup system 110, which may be any suitable system to create backups of the content managed and/or created by the examination management engine 106.
  • the computing device 102 may send the rating and/or any data associated with trigger events to the backup system 110.
  • the backup system 110 may be a cloud-based system to store a backup of the content
  • the computing device 102 may also be in communication with the school administration system 112, which may be any suitable system managed by a school.
  • the school administration system 112 may manage records associated with students, where the student records may include any information associated with each student, such as identification information of the student and/or information associated with the student's relationship with the school (e.g., a student identification number, a passport photo, biometric information of the student, a copy of a student identification card, grades of the student, courses currently being taken by the student and/or courses previously taken by the student, and the like).
  • the computing device 102 may request student information from the school administration system 112 (e.g., the student information system of the school administration system 112) in order to verify the identity of a test taker.
  • the computing device 102 may also send any information associated with a student to the school administration system 112, such as a rating and/or any data associated with trigger events relating to the student. While a school administration system 112 is described, one of ordinary skill in the art will appreciate that the techniques disclosed herein may be used in a non-school setting and may be used for any suitable testing setting, such as a driver examination, a professional examination, and the like.
  • Computing device 102, backup system 110, and school administration system 112 may be in communication with each other directly or over a network, which may be any suitable network, such as an ad hoc network, an intranet, an extranet, a virtual private network (VPN), a local area network (LAN), a wireless LAN (WLAN), a wide area network (WAN), a wireless WAN (WWAN), a metropolitan area network (MAN), a portion of the Internet, a portion of the Public Switched Telephone Network (PSTN), a cellular telephone network, or any other type of network, or a combination of two or more such networks.
  • FIG. 2 is a block diagram of an example computing device 200 for monitoring a user during an examination.
  • the computing device 200 may be any suitable computing device (e.g., computing device 102 of FIG. 1) that may monitor a user during an examination.
  • the computing device 200 may include a display device 202, an examination management engine 204, a downward-facing camera 206, and a horizontal surface 208.
  • the display device 202 may be any suitable display device to display content, such as to display examination questions.
  • the display device 202 may be a touch screen display device that may receive touch inputs (e.g., from pen device 212, from the user's hand 210, etc.)
  • the examination management engine 204 may be a hardware-implemented, software-implemented, and/or a combination of a hardware-implemented and software-implemented engine that may cause a processor of computing device 200 to generate and/or create a baseline associated with a user, where the baseline is generated based on an image of the user's hand 210 on a horizontal surface 208 of the computing device 200.
  • the image may be generated using the downward-facing camera 206 of the computing device 200, which may be any suitable camera to capture photos and/or video of content along the horizontal surface 208.
  • the examination management engine 204 may also cause the processor to monitor the user during an examination period using the downward-facing camera 206 (e.g., monitor the user's hand 210) and, based on the baseline, determine whether any trigger events occur during the examination period. For example, the examination management engine 204 may cause the processor to record user activity that includes content captured using the downward-facing camera 206 (e.g., movements of the user's hand 210), compare the user activity to the baseline, and determine whether any trigger events occur based on the comparison. The examination management engine 204 may cause the processor to calculate a rating based on the trigger events.
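  • Pulling the record/compare/detect steps together, a hedged sketch of the monitoring loop such an engine could drive; `capture_frame` and the analyzer callables are stand-ins for the camera, audio, and pen analyses described below and are not APIs defined by the disclosure.

```python
import time

def monitor_examination(baseline, capture_frame, analyzers, duration_s, poll_s=1.0):
    """Poll the sensors for `duration_s` seconds, compare each capture to the baseline,
    and collect any trigger events reported by the analyzers.

    `capture_frame()` returns a dict of the latest sensor data; each analyzer is a
    callable (capture, baseline) -> list of trigger-event dicts.
    """
    trigger_events = []
    end = time.time() + duration_s
    while time.time() < end:
        capture = capture_frame()
        for analyze in analyzers:
            trigger_events.extend(analyze(capture, baseline))
        time.sleep(poll_s)
    return trigger_events
```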
  • the pen device 212 may be any suitable pen device capable of providing and/or recording inputs from the user.
  • the user may use the pen device 212 to write along the horizontal surface 208 or on the display device 202 of the computing device 200, or to write on paper placed on the horizontal surface 208 of the computing device 200.
  • the pen device 212 may record a writing stream of the user and may send the data associated with the writing stream to the computing device (e.g., via Bluetooth, via a direct connection, etc.).
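  • For concreteness, one plausible (assumed) shape for that writing-stream data, as timestamped position and pressure samples:

```python
from dataclasses import dataclass
from typing import List

@dataclass
class PenSample:
    timestamp: float   # seconds since the start of the examination period
    x: float           # position on the horizontal surface or display
    y: float
    pressure: float    # 0.0 (pen lifted) .. 1.0 (full pressure)

# A short synthetic stream as it might arrive over Bluetooth or a direct connection.
stream: List[PenSample] = [PenSample(t * 0.01, 10.0 + t, 20.0, 0.6) for t in range(5)]
```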
  • the pen device 212 may be used to provide touch inputs to the display 202.
  • the horizontal surface 208 may be any suitable surface oriented in a substantially horizontal manner (e.g., placed on a table or desk) such that the horizontal surface 208 may be captured by the downward-facing camera 206.
  • the horizontal surface 208 may include a pen input device to receive inputs from the pen device 212.
  • FIG. 3 is a block diagram of an example computing device 300 for monitoring a user during an examination.
  • Computing device 300 may be any suitable computing device (e.g., computing device 102 of FIG. 1, computing device 200 of FIG. 2) that may monitor user activity during an examination.
  • Computing device 300 may be, for example, a web-based server, a local area network server, a cloud-based server, a notebook computer, a desktop computer, an all-in-one system, a tablet computing device, a mobile phone, an electronic book reader, a printing device, or any other electronic device suitable for monitoring a user during an examination.
  • Computing device 300 may include a processor 302 and a machine-readable storage medium 304.
  • Computing device 300 may be in communication with other systems to provide and/or receive data relating to examinations taken by a user (e.g., to the backup system 110 of FIG. 1 and/or to the school administration system 112 of FIG. 1).
  • Processor 302 is a tangible hardware component that may be a central processing unit (CPU), a semiconductor-based microprocessor, and/or other hardware devices suitable for retrieval and execution of instructions stored in machine-readable storage medium 304.
  • Processor 302 may fetch, decode, and execute instructions 306, 308, 310, and 312 to control a process of monitoring a user during an examination.
  • processor 302 may include at least one electronic circuit that includes electronic components for performing the functionality of instructions 306, 308, 310, 312, or a combination thereof.
  • Machine-readable storage medium 304 may be any electronic, magnetic, optical, or other physical storage device that contains or stores executable instructions.
  • machine-readable storage medium 304 may be, for example, Random Access Memory (RAM), an EPROM, an Electrically Erasable Programmable Read-Only Memory (EEPROM), a storage device, an optical disc, and the like.
  • machine-readable storage medium 304 may be a non-transitory storage medium, where the term "non-transitory" does not encompass transitory propagating signals.
  • machine-readable storage medium 304 may be encoded with a series of processor-executable instructions 306, 308, 310, 312 for creating a baseline associated with a user, the baseline including an image of the user's hand on a horizontal surface captured by a downward-facing camera 314 of computing device 300; recording user activity during an examination period, where the user activity includes content captured by the downward-facing camera 314; comparing the user activity to the baseline; determining whether any trigger events occur during the examination period based on the comparison; and calculating a rating based on the trigger events.
  • Camera analyzer instructions 306 may manage and control the analysis of content (e.g., video analysis, image analysis, etc.) captured by the downward-facing camera 314 and/or the forward-facing camera 316.
  • the downward-facing camera 314 and the forward-facing camera 316 of the computing device 300 may be any suitable type of camera for capturing content in the direction that the respective camera is facing.
  • the downward-facing camera 314 may be oriented in a downward manner (e.g., toward the horizontal surface of the computing device 300), and the forward-facing camera 316 may be oriented in a forward manner (e.g., toward the user).
  • the camera analyzer instructions 306 may aid in creating a baseline for a user using the content captured during the baseline creation phase and may analyze the content captured by the downward-facing camera 314 and/or the forward-facing camera 316 during an examination period by comparing the content to the baseline created for the user in order to determine whether any trigger events have occurred.
  • the camera analyzer instructions 306 may identify and track a user's face in real-time, including tracking the location of the user's face, head, and/or body when the user moves and/or turns their back to one of the cameras while allowing the user to move within the viewing scope of the cameras.
  • the camera analyzer instructions 306 may analyze frames from the forward-facing camera 316 in real-time to detect whether the user's face is in a forward-looking position and/or the user's eyes are gazing at the display device of the computing device 300 (e.g., the display device 202 of FIG. 2). When the user's face or gaze is directed elsewhere, the camera analyzer instructions 306 may monitor the duration of the event and may send a trigger event notification to the examination management instructions 312 if the duration exceeds a pre-defined amount of time.
  • the camera analyzer instructions 306 may also track the user's head in real-time using the forward-facing camera 316 to determine whether there is a significant change in the image from one frame to another (e.g., a head of a different user), or to detect if the orientation of the user's head changes by more than a pre-defined angle in any direction, and may send a trigger event notification to the examination management instructions 312 if so.
  • the camera analyzer instructions 306 may also check for the presence of the user (e.g., at all times) using both cameras 314 and 316, and if the user is not visible by one or both cameras for more than a predefined duration of time, a trigger event notification may be sent.
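  • A simplified sketch of those two camera checks, assuming an upstream face/pose estimator already supplies, per frame, whether a face is visible and the estimated head yaw in degrees; the thresholds are placeholder values.

```python
def analyze_camera_frames(frames, max_yaw_deg=30.0, max_absence_s=10.0):
    """Return trigger events for large head rotations or for the user being out of
    view longer than allowed.

    `frames` is a time-ordered list of dicts:
    {"t": seconds, "face_visible": bool, "yaw_deg": float or None}.
    """
    events = []
    absent_since = None
    for frame in frames:
        if not frame["face_visible"]:
            if absent_since is None:
                absent_since = frame["t"]
            elif frame["t"] - absent_since > max_absence_s:
                events.append({"type": "user_not_visible", "timestamp": frame["t"]})
                absent_since = frame["t"]   # restart the absence timer after reporting
            continue
        absent_since = None
        if frame["yaw_deg"] is not None and abs(frame["yaw_deg"]) > max_yaw_deg:
            events.append({"type": "head_rotation", "timestamp": frame["t"]})
    return events
```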
  • the camera analyzer instructions 306 may also detect various actions, such as the user standing up, walking, talking, reading monitors, and the like to determine whether any trigger events have occurred.
  • the camera analyzer instructions 306 may also periodically compare photos taken during the examination period to the baseline photos taken of the user's features (e.g., the user's face, top of the head, back of the hand, etc.) to determine if there is any discrepancy and may send a trigger event notification if so.
  • Audio analyzer instructions 308 may manage and control the analysis of content (e.g., audio) captured by the microphone 318.
  • the microphone 318 may be any suitable type of microphone to capture audio surrounding the computing device 300.
  • the microphone 318 may capture the ambient sounds occurring around the computing device 300.
  • the audio analyzer instructions 308 may aid in creating a baseline for a user using the content captured during the baseline creation phase and may analyze the content captured by the microphone 318 during the examination period by comparing the content to the baseline created for the user in order to determine whether any trigger events have occurred.
  • the audio analyzer instructions 308 may detect surrounding audio, such as conversations, phone calls, background speech, and the like.
  • the audio analyzer instructions 308 may compare the level of ambient sound during the examination period to the baseline ambient sound to determine if a trigger event has occurred and may send a trigger event notification if any change in sound-type or sound-intensity beyond a pre-defined acceptable amount has occurred.
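  • A minimal sketch of that ambient-sound comparison, using the RMS level of each audio frame relative to the baseline level; the acceptable ratio is a placeholder value.

```python
import math

def rms(samples):
    """Root-mean-square level of a block of audio samples (floats in -1.0..1.0)."""
    return math.sqrt(sum(s * s for s in samples) / max(1, len(samples)))

def analyze_audio(frames, baseline_level, max_ratio=3.0):
    """Return trigger events for frames whose level exceeds the baseline by more than
    `max_ratio`. `frames` is a list of (timestamp, samples) pairs."""
    events = []
    for t, samples in frames:
        level = rms(samples)
        if baseline_level > 0 and level / baseline_level > max_ratio:
            events.append({"type": "unexpected_sound", "timestamp": t})
    return events
```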
  • Text analyzer instructions 310 may manage and control the analysis of content (e.g., writing streams) captured by the pen input 320 and/or the downward-facing camera 314 of the computing device 300.
  • the pen input 320 may be any suitable device to receive pen inputs from a pen device associated with the computing device 300 (e.g., pen device 212 of FIG. 2).
  • the pen input 320 may capture the writing stream of a user.
  • the text analyzer instructions 310 may aid in creating a baseline for a user using the content captured during the baseline creation phase and may analyze the content captured by the pen input 320 and/or the downward-facing camera 314 during the examination period by comparing the content to the baseline created for the user in order to determine whether any trigger events have occurred.
  • the text analyzer instructions 310 may monitor a writing stream of the user to determine whether any trigger events occur, such as an interruption in the writing stream for more than a pre-defined acceptable amount of time or non-recognizable marks, letters, and/or words being written. If such events are detected, a trigger event notification may be sent to the examination management instructions 312.
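  • A hedged sketch of the interruption check only: flag a trigger event whenever the gap between consecutive pen samples (see the pen-stream sketch above) exceeds an allowed pause; the allowed pause is a placeholder value.

```python
def analyze_writing_stream(samples, max_pause_s=120.0):
    """`samples` is a time-ordered list of objects with a `timestamp` attribute
    (e.g., the PenSample sketch above). Returns trigger events for long gaps."""
    events = []
    for prev, cur in zip(samples, samples[1:]):
        if cur.timestamp - prev.timestamp > max_pause_s:
            events.append({"type": "writing_interruption", "timestamp": cur.timestamp})
    return events
```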
  • Examination management instructions 312 may be similar to the examination management engine 106 of FIG. 1 and the examination management engine 204 of FIG. 2.
  • the examination management instructions 312 may work in conjunction with the camera analyzer instructions 306, the audio analyzer instructions 308, and the text analyzer instructions 310 to manage and control the identification of a user, the creation of a baseline for a user, the monitoring of the user during an examination period, and the calculation of a rating based on any trigger events that occur, where the rating may suggest an overall risk associated with the examination period (e.g., a risk of impropriety).
  • the examination management instructions 312 may receive trigger event notifications from the camera analyzer instructions 306, the audio analyzer instructions 308, and/or the text analyzer instructions 310, may tag each notification with a timestamp, and may adjust the rating for the examination period accordingly.
  • the examination management instructions 312 may create a risk profile for the examination period and may send the risk profile along with any content recorded and/or an examination score to the appropriate school of the user.
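  • Finally, a sketch of how the tagged events, rating, and examination data could be packaged into such a risk profile; the structure reuses the shapes from the earlier sketches and is an assumption, not a format defined in the disclosure.

```python
def build_risk_profile(user_id, rating, deductions, exam_score, recordings):
    """Package the examination-period results for transmission to the school (illustrative)."""
    return {
        "user_id": user_id,
        "rating": rating,
        "exam_score": exam_score,
        "trigger_events": [
            {"timestamp": str(ts), "type": kind, "penalty": penalty}
            for ts, kind, penalty in deductions
        ],
        "recordings": recordings,   # e.g., references to stored audio/video/photos/writing streams
    }
```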
  • FIG. 4 is a flowchart of an example method 400 for monitoring a user during an examination.
  • Method 400 may be implemented using computing device 102 of FIG. 1, computing device 200 of FIG. 2, and/or computing device 300 of FIG. 3.
  • Method 400 includes, at 402, creating a baseline associated with a user, where the baseline includes an image of the user's hand on a horizontal surface captured by a downward-facing camera.
  • examination management engine 106 of FIG. 1 may create the baseline based on an image captured by the downward-facing camera 108 of FIG. 1.
  • Method 400 also includes, at 404, monitoring the user, including monitoring a writing stream of the user using the downward-facing camera.
  • the examination management engine 106 of FIG. 1 may monitor the user's writing stream during the examination period using the downward-facing camera 108 of FIG. 1.
  • Method 400 also includes, at 406, comparing user activity to the baseline, where the user activity includes the writing stream.
  • the examination management engine 106 of FIG. 1 may compare the user's writing stream to a baseline writing stream associated with the user.
  • Method 400 also includes, at 408, determining whether any trigger events occur based on the comparing of 406.
  • the examination management engine 106 of FIG. 1 may detect whether any interruption in the user's writing stream has occurred.
  • 404, 406, and 408 may be performed throughout the duration of the examination period in order to identify the trigger events that occur during that time.
  • Method 400 also includes, at 410, generating a rating based on the trigger events determined.
  • the examination management engine 106 of FIG. 1 may generate an appropriate rating based on the trigger events that occur during the examination period. This rating may be sent to the appropriate school of the user.
  • Example systems may include a controller/processor and memory resources for executing instructions stored in a tangible non-transitory medium (e.g., volatile memory, non-volatile memory, and/or machine-readable media).
  • Non-transitory machine-readable media can be tangible and have machine-readable instructions stored thereon that are executable by a processor to implement examples according to the present disclosure.
  • An example system can include and/or receive a tangible non-transitory machine-readable medium storing a set of machine-readable instructions (e.g., software).
  • the controller/processor can include one or a plurality of processors such as in a parallel processing system.
  • the memory can include memory addressable by the processor for execution of machine-readable instructions.
  • the machine-readable medium can include volatile and/or non-volatile memory such as a random access memory (“RAM”), magnetic memory such as a hard disk, floppy disk, and/or tape memory, a solid state drive (“SSD”), flash memory, phase change memory, and the like.

Landscapes

  • Business, Economics & Management (AREA)
  • Tourism & Hospitality (AREA)
  • Engineering & Computer Science (AREA)
  • Human Resources & Organizations (AREA)
  • Primary Health Care (AREA)
  • Health & Medical Sciences (AREA)
  • Economics (AREA)
  • General Health & Medical Sciences (AREA)
  • Educational Administration (AREA)
  • Marketing (AREA)
  • Educational Technology (AREA)
  • Strategic Management (AREA)
  • Physics & Mathematics (AREA)
  • General Business, Economics & Management (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)

Abstract

In example implementations, the present disclosure relates to monitoring a user during an examination. For example, a computing device includes a downward-facing camera to capture content along a horizontal surface. The computing device also includes a processor to generate a baseline associated with a user, the baseline being generated based on an image of the user's hand on the horizontal surface captured by the downward-facing camera. During an examination period, the user is monitored using the downward-facing camera and, based on the baseline, it is determined whether any trigger events occur. A rating is calculated based on the trigger events.
PCT/US2015/053249 2015-09-30 2015-09-30 Monitoring a user during an examination WO2017058205A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/US2015/053249 WO2017058205A1 (fr) 2015-09-30 2015-09-30 Monitoring a user during an examination

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/US2015/053249 WO2017058205A1 (fr) 2015-09-30 2015-09-30 Monitoring a user during an examination

Publications (1)

Publication Number Publication Date
WO2017058205A1 true WO2017058205A1 (fr) 2017-04-06

Family

ID=58427812

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2015/053249 WO2017058205A1 (fr) 2015-09-30 2015-09-30 Monitoring a user during an examination

Country Status (1)

Country Link
WO (1) WO2017058205A1 (fr)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111368808A (zh) * 2020-04-10 2020-07-03 刘志锋 Method, apparatus, and system for collecting answer data, and teaching device
WO2022003703A1 (fr) * 2020-07-01 2022-01-06 Ofek Eshkolot Research And Development Ltd. Online examination system and method

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070291017A1 (en) * 2006-06-19 2007-12-20 Syeda-Mahmood Tanveer F Camera-equipped writing tablet apparatus for digitizing form entries
WO2011067786A1 (fr) * 2009-12-02 2011-06-09 Iyer Vandana S Electronic device for written examinations and associated method
US8103057B2 (en) * 2003-03-03 2012-01-24 Smart Technologies Ulc System and method for capturing images of a target area on which information is recorded
US20120176220A1 (en) * 2009-09-18 2012-07-12 Ruben Garcia Apparatus And System For And Method Of Registration, Admission And Testing Of A Candidate

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8103057B2 (en) * 2003-03-03 2012-01-24 Smart Technologies Ulc System and method for capturing images of a target area on which information is recorded
US20070291017A1 (en) * 2006-06-19 2007-12-20 Syeda-Mahmood Tanveer F Camera-equipped writing tablet apparatus for digitizing form entries
US20120176220A1 (en) * 2009-09-18 2012-07-12 Ruben Garcia Apparatus And System For And Method Of Registration, Admission And Testing Of A Candidate
WO2011067786A1 (fr) * 2009-12-02 2011-06-09 Iyer Vandana S Electronic device for written examinations and associated method

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
KO, JOOYOUNG ET AL.: "A study of examinee motion analysis for online test using differential images and color information", KOREA MULTIMEDIA SOCIETY, vol. 12, pages 424 - 426 *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111368808A (zh) * 2020-04-10 2020-07-03 刘志锋 Method, apparatus, and system for collecting answer data, and teaching device
WO2022003703A1 (fr) * 2020-07-01 2022-01-06 Ofek Eshkolot Research And Development Ltd. Online examination system and method

Similar Documents

Publication Publication Date Title
Ochoa et al. The RAP system: Automatic feedback of oral presentation skills using multimodal analysis and low-cost sensors
US10614832B2 (en) System and method for diarization based dialogue analysis
US10241990B2 (en) Gesture based annotations
WO2019071903A1 (fr) Auxiliary method, device, and storage medium for facial micro-expression examination
US10553125B2 (en) Evidence based digital training portfolio
US11182600B2 (en) Automatic selection of event video content
US20150279220A1 (en) Method and system for analyzing exam-taking behavior and improving exam-taking skills
JP2014165565A (ja) Video conference device, system, and method
KR20210012503A (ko) Method, system, and computer-readable medium for providing an online interview
WO2019052053A1 (fr) Method and device for reading whiteboard information, readable storage medium, and electronic whiteboard
US11860918B1 (en) Virtual secure rooms
KR20230007970A (ko) Online examination system and method using facial-contour-recognition artificial intelligence that prevents cheating by using the front camera and an auxiliary camera of a candidate terminal equipped with an audio/video recording program
US20230267734A1 (en) Information Extraction from Live Online Sessions
US20160180170A1 (en) Systems and methods for eye tracking-based exam proctoring
WO2017058205A1 (fr) Monitoring a user during an examination
US20180315131A1 (en) User-aware interview engine
US20230359421A1 (en) Systems and methods for ensuring and verifying remote health or diagnostic test quality
US20230185361A1 (en) System and method for real-time conflict management and safety improvement
Soneda et al. M3B corpus: Multi-modal meeting behavior corpus for group meeting assessment
US20230238114A1 (en) Applied behavioral therapy apparatus and method
JP2019215731A (ja) Concentration evaluation program, device, and method
WO2022191798A1 (fr) Online examination security system and associated method
Satre et al. Online Exam Proctoring System Based on Artificial Intelligence
KR20230026410A (ko) Video-enhanced time tracking system and method
US20220141298A1 (en) Methods and apparatus for online test taking

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15905592

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 15905592

Country of ref document: EP

Kind code of ref document: A1