WO2011013117A1 - System and methods which monitor for attentiveness by altering transfer function governing user-perceptible behaviors vs. user input - Google Patents

System and methods which monitor for attentiveness by altering transfer function governing user-perceptible behaviors vs. user input

Info

Publication number
WO2011013117A1
WO2011013117A1 (PCT/IL2010/000586)
Authority
WO
WIPO (PCT)
Prior art keywords
user
attentiveness
application
operative
input device
Prior art date
Application number
PCT/IL2010/000586
Other languages
French (fr)
Inventor
Yinon Edrei
Original Assignee
Yinon Edrei
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Yinon Edrei filed Critical Yinon Edrei
Publication of WO2011013117A1 publication Critical patent/WO2011013117A1/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/038Control and interface arrangements therefor, e.g. drivers or device-embedded control circuitry

Definitions

  • the present invention relates generally to interactive systems and more particularly to monitoring performance of a user thereof.
  • Certain embodiments of the present invention seek to provide a system and methods which monitor for attentiveness by altering transfer function governing user- perceptible behaviors vs. user input.
  • Certain embodiments of the present invention seek to provide apparatus and methods for evaluating a user of a computing device, comprising receiving user inputs via a MMI of the computing device; providing sensory feedback to the user via the MMI in response to the user inputs; introducing one or more perturbations into the sensory feedback; processing the user inputs so as to measure a response of the user to the perturbations; and analyzing the measured response so as to provide an indication of a level of attentiveness of the user.
  • Certain embodiments of the present invention seek to provide an interactive computerized application typically having hardware and software components including data storage functionality such as memory, the application including:
  • a user input device operative to accept a stream of user inputs and responsively, to generate a stream of user-perceptible application behaviors which have a pre-defined initial correspondence with user inputs in the stream of user inputs, the user input device including or operating in conjunction with an input- application behavior correspondence manipulator operative for imperceptibly modifying the pre-defined initial correspondence based on which the user input device is operative to generate the user-perceptible behaviors, thereby to define a modified correspondence between user-perceptible application behaviors generated responsive to the user inputs, and the user inputs;
  • a user compensation monitor which may be a module in a processor, operative for generating an ongoing indication of user attentiveness including measuring at least one aspect of the user's compensation for differences between the modified and initial correspondences and computing an indication of user attentiveness;
  • an attentiveness accommodator which may operate in conjunction with or integrally with or within at least one output or display device, operative to accommodate for the indication of user attentiveness by adapting at least one aspect of subsequent operation, typically including at least one output or display-generating operation of the interactive application.
  • a method for evaluating attentiveness of a user of an interactive application employing a motor-controlled user input device operative to accept a stream of user inputs and responsively, to generate a stream of user-perceptible application behaviors which have a pre-defined initial correspondence with user inputs in the stream of user inputs, the method comprising imperceptibly modifying the pre-defined initial correspondence, thereby to define a modified correspondence between user-perceptible application behaviors generated responsive to the user inputs, and the user inputs; generating an ongoing indication of user attentiveness including measuring at least one aspect of the user's compensation for differences between the modified and initial correspondences; and accommodating for the indication of user attentiveness by modifying at least one aspect of subsequent operation of the on-line computerized application vis a vis the user.
  • a computer-implemented method for evaluating attentiveness of a user of an interactive application employing a user input device operative to accept a stream of user inputs and responsively, to generate a stream of user- perceptible application behaviors which have a pre-defined initial correspondence with user inputs in the stream of user inputs, the method comprising using at least one processor for:
  • one or more processors may be employed to perform each of the above steps.
  • at least one aspect of the user's compensation includes an aspect of quality of compensation.
  • At least one aspect of the user's compensation includes an aspect of speed of compensation.
  • the user input device comprises a user input generator controlled by the user, such as but not limited to: a mouse, keyboard, joystick, steering wheel or pedal. It is appreciated that user input may also be generated by a user without resorting to an input generator, e.g. if a user input device is voice-activated.
  • the imperceptibly modifying step is differentially activated at a differentiation rate which prevents a user from reaching full adaptation.
  • the imperceptibly modifying step may be alternately activated and de-activated at a rate which is fast enough to prevent a user from reaching full adaptation.
  • the imperceptibly modifying step may be activated to produce different e.g. opposite effects, at a rate which is fast enough to prevent a user from reaching full adaptation.
  • tone volume may be alternately increased and decreased at a rate which is fast enough to prevent a user from reaching full adaptation.
  • If the 'modifying' operation is applied only once, the adaptation process takes place only once and the user eventually adapts his movements/actions, reaching good quality and speed as time goes on.
  • If the 'modifying' operation is alternately activated/deactivated, the user adapts his movements/actions repeatedly, never reaching a full adaptation state.
  • the user's up-to-date adaptation characteristics may be computed continuously in real-time.
  • a computerized system for evaluating attentiveness of a user of an interactive computerized application employing a user input device operative to accept a stream of user inputs and responsively, to generate a stream of user-perceptible application behaviors which have a pre-defined initial correspondence with user inputs in the stream of user inputs, the system comprising an input-application behavior correspondence manipulator operative for imperceptibly modifying the pre-defined initial correspondence based on which the user input device is operative to generate the user-perceptible behaviors, thereby to define a modified correspondence between user-perceptible application behaviors generated responsive to the user inputs, and the user inputs; a user compensation monitor operative for generating an ongoing indication of user attentiveness including measuring at least one aspect of the user's compensation for differences between the modified and initial correspondences and using a processor to compute an indication of user attentiveness; and an attentiveness accommodator operative to accommodate for the indication of user attentiveness by adapting at least one aspect of subsequent operation of the interactive computerized application.
  • the application comprises a training simulator and the attentiveness accommodator is operative to change characteristics of a warning indication provided to a user to alert the user to potential dangers.
  • the attentiveness accommodator is operative to compute threshold warning indication characteristics which are sufficient to cause the user to be attentive to potential dangers.
  • the application comprises a control center controlling a population of human operators and the attentiveness accommodator is operative to alert each individual human operator each time the individual operator is suffering from low attentiveness.
  • the attentiveness accommodator is also operative to send a second alert to a supervisor that the individual operator is suffering from low attentiveness, if the first alert does not sufficiently improve the individual operator's attentiveness.
  • the application comprises a computerized game environment and the attentiveness accommodator is operative to enhance attractiveness of at least one characteristic of the computerized game environment if a user of the computerized game environment is found by the user compensation monitor to have a low attentiveness level.
  • the application comprises an e-learning/courseware environment and the attentiveness accommodator is operative to classify learning paradigms presented to a learning user as a function of attentiveness which they engender in the user and to replace, in a learning session with an individual user, learning paradigms which have engendered low attentiveness for the individual user, with learning paradigms which engender high attentiveness for the individual user.
  • the application comprises a website and the attentiveness accommodator is employed to identify a user's attraction to specific website locations during the engagement session between the individual user and the website.
  • the application comprises a website developers' environment and the input-application behavior correspondence manipulator and the user compensation monitor are utilized during pre-testing of a website under development and the attentiveness accommodator is operative to alert an individual developer of problematic locations within the website which are characterized by low attentiveness so as to allow the website developer to modify the problematic locations.
  • the imperceptibly modifying comprises acquiring real-time data characterizing at least one characteristic of the input device; acquiring real-time data characterizing a behavior of a sensory component controlled by the input device; and breaking the direct control of the interactive application over the sensory component controlled by the input device by imperceptibly perturbing the sensory component's normal behavior.
  • the imperceptibly modifying also comprises quantifying the human user attentiveness related to the interactive application as a function of his/her control over the input device in response to the subliminal perturbations activated.
  • the at least one characteristic of the input device comprises at least one of the following: motion affecting the input device, force affecting the input device, user's voice affecting the input device, indication of input device button clicks, indication of keyboard keys' tapping.
  • the user's up-to-date adaptation characteristics are computed continuously in realtime.
  • the user-perceptible application behavior comprises at least one of: cursor position, characteristic such as size and/or position of a displayed element which may not include a cursor, tone volume; tone pitch; color.
  • the adapting comprises modifying at least one aspect of subsequent operation of the interactive application vis a vis the user.
  • a computer program product comprising a computer usable medium or computer readable storage medium, typically tangible, having a computer readable program code embodied therein, the computer readable program code adapted to be executed to implement any or all of the methods shown and described herein. It is appreciated that any or all of the computational steps shown and described herein may be computer-implemented. The operations in accordance with the teachings herein may be performed by a computer specially constructed for the desired purposes or by a general purpose computer specially configured for the desired purpose by a computer program stored in a computer readable storage medium.
  • Any suitable processor, display and input means may be used to process, display e.g. on a computer screen or other computer output device, store, and accept information such as information used by or generated by any of the methods and apparatus shown and described herein; the above processor, display and input means including computer programs, in accordance with some or all of the embodiments of the present invention.
  • a processor, workstation or other programmable device or computer or electronic computing device, either general-purpose or specifically constructed, used for processing; a computer display screen and/or printer and/or speaker for displaying; machine-readable memory such as optical disks, CD-ROMs, magneto-optical discs or other discs, RAMs, ROMs, EPROMs, EEPROMs, magnetic or optical or other cards, for storing; and a keyboard or mouse or steering wheel or pedal or any other input device for accepting.
  • processor includes a single processing unit or a plurality of distributed or remote such units.
  • the above devices may communicate via any conventional wired or wireless digital communication means, e.g. via a wired or cellular telephone network or a computer network such as the Internet.
  • the apparatus of the present invention may include, according to certain embodiments of the invention, machine readable memory containing or otherwise storing a program of instructions which, when executed by the machine, implements some or all of the apparatus, methods, features and functionalities of the invention shown and described herein.
  • the apparatus of the present invention may include, according to certain embodiments of the invention, a program as above which may be written in any conventional programming language, and optionally a machine for executing the program such as but not limited to a general purpose computer which may optionally be configured or activated in accordance with the teachings of the present invention. Any of the teachings incorporated herein may, wherever suitable, operate on signals representative of physical objects or substances.
  • Terms such as “associating”, “superimposing”, “obtaining” or the like refer to the action and/or processes of a computer or computing system, or processor or similar electronic computing device, that manipulate and/or transform data represented as physical, such as electronic, quantities within the computing system's registers and/or memories, into other data similarly represented as physical quantities within the computing system's memories, registers or other such information storage, transmission or display devices.
  • the term "computer” should be broadly construed to cover any kind of electronic device with data processing capabilities, including, by way of non-limiting example, personal computers, servers, computing system, communication devices, processors (e.g. digital signal processor (DSP), microcontrollers, field programmable gate array (FPGA), application specific integrated circuit (ASIC), etc.) and other electronic computing devices.
  • DSP digital signal processor
  • FPGA field programmable gate array
  • ASIC application specific integrated circuit
  • Fig. 1 is a simplified flowchart illustration of a computer-implemented method for evaluating attentiveness of a user of an interactive device, operative in accordance with certain embodiments of the present invention.
  • Fig. 2A is a simplified functional block diagram of a tool for Real-Time Evaluation of User Attentiveness on Interactive applications constructed and operative in accordance with a first embodiment of the present invention.
  • Fig. 2B is a simplified functional block diagram of a tool for Real-Time Evaluation of User Attentiveness on Interactive applications constructed and operative in accordance with a second embodiment of the present invention.
  • Fig. 3 is a timeline of certain tasks, some or all of which may be performed by the systems of Figs. 2A - 2B, where tasks are represented by textured rectangles and rectangles having similar texture represent tasks performed on a single (i.e., the same) data block.
  • Fig. 4A is a simplified flowchart illustration of a motor action analysis method which may be performed by the MAP unit of Fig. 2A or Fig. 2B and is operative in accordance with certain embodiments of the present invention.
  • Fig. 4B is a simplified flowchart illustration of a method for performing the motor action filtering step of Fig. 4A in accordance with certain embodiments of the present invention.
  • Fig. 4C is a simplified flowchart illustration of a method for performing the combined sorting coefficient computation step of Fig. 4A in accordance with certain embodiments of the present invention.
  • Fig. 4D is a simplified flowchart illustration of a method for performing the user motor error computation step of Fig. 4A in accordance with certain embodiments of the present invention.
  • Fig. 4E is a pictorial illustration of an example location of the spatial angle of an on-screen cursor motor action path between starting and ending points thereof, which is useful in understanding the method of Fig. 4D.
  • Fig. 4F is a simplified flowchart illustration of a method for performing the adaptation computation step of Fig. 4A in accordance with certain embodiments of the present invention.
  • Fig. 4G is a simplified flowchart illustration of a method for performing the attentiveness evaluation step of Fig. 4A in accordance with certain embodiments of the present invention.
  • Fig. 5 is a graph of suitable timings of motor actions in accordance with certain embodiments of the present invention.
  • Fig. 6 is a simplified diagram of a data structure which may be utilized by the memories of Figs. 2A - 2B in accordance with certain embodiments of the present invention.
  • Fig. 7 is a table of some of the many applications in which certain embodiments of the present invention may be implemented and of the added functionality which may be provided in accordance with certain embodiments of the present invention.
  • Fig. 8A is a simplified functional block diagram of a system for Real-Time Evaluation of User Attentiveness on Interactive applications which may incorporate the tools of Fig. 2A or Fig. 2B and which is constructed and operative in accordance with a first embodiment of the present invention.
  • Fig. 8B is a simplified flowchart illustration of an example method of operation for the system of Fig. 8A in accordance with certain embodiments of the present invention.
  • Figs. 9A - 9B taken together, define user-related parameters for each data source, some or all of which may be computed or otherwise provided in the context of the user-related provisional step 460 of Fig. 4A, all in accordance with certain embodiments of the present invention.
  • Figs. 10A - 10B, taken together, form a table describing computation or determination of user-related parameters, some or all of which may be performed in the user-related parameter computation step of Fig. 4A, all in accordance with certain embodiments of the present invention.
  • Figs. 11A - 11C, taken together, form a table suggesting example values or example computations useful in providing user-related parameters in the context of the user-related provisional step 460 of Fig. 4A, all in accordance with certain embodiments of the present invention.
  • Fig. 12 is a diagram of example functionalities of the MSDG routine of Figs. 2A - 2B.
  • Fig. 13 is a graph of an example motor-action of a screen cursor movement.
  • Fig. 1 is a simplified flowchart illustration of a computer-implemented method for evaluating attentiveness of a user of an interactive device, the method including some or all of the following steps, suitably ordered e.g. as shown:
  • Step 10 providing an interactive application employing a user input device operative to accept a stream of user inputs and responsively, to generate a stream of user-perceptible application behaviors which have a pre-defined initial correspondence with user inputs in the stream of user inputs, and at least one processor, which may or may not be integrated with the application, for performing at least one of the following steps 12 - 16.
  • Step 12 imperceptibly modifying the pre-defined initial correspondence, thereby to define a modified correspondence between user-perceptible application behaviors generated responsive to the user inputs, and the user inputs, wherein optionally, the imperceptibly modifying step is differentially activated at a differentiation rate which prevents a user from reaching full adaptation.
  • Step 14 generating an ongoing indication of user attentiveness including measuring at least one aspect of the user's compensation for differences between the modified and initial correspondences, e.g. an aspect of quality of compensation and/or an aspect of speed of compensation, and using a processor to compute an indication of attentiveness based on the measurements.
  • Step 16 accommodating for the indication of user attentiveness by adapting at least one aspect of subsequent operation of the interactive application.
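  • By way of a non-authoritative illustration only, the following Python sketch shows how steps 12 - 16 might be wired together in software; all names (get_user_input, render_behavior, accommodate) are hypothetical callbacks supplied by a host application and are not taken from the specification.

      import math

      def run_attentiveness_loop(get_user_input, render_behavior, accommodate,
                                 beta_deg=8.0, window=50, max_steps=10000):
          """Illustrative loop: imperceptibly perturb the input-to-behavior
          correspondence (step 12), measure how well the user compensates
          (step 14) and adapt the application accordingly (step 16)."""
          errors, perturb_on = [], True
          beta = math.radians(beta_deg)
          for _ in range(max_steps):
              dx, dy = get_user_input()          # raw input deltas from the input device
              if perturb_on:                     # small rotation, below awareness threshold
                  dx, dy = (dx * math.cos(beta) - dy * math.sin(beta),
                            dx * math.sin(beta) + dy * math.cos(beta))
              error = render_behavior(dx, dy)    # host returns deviation from the user's target
              errors.append(error)
              if len(errors) == window:
                  attentiveness = max(0.0, 10.0 - 10.0 * sum(errors) / window)
                  accommodate(attentiveness)     # e.g. warn, change content, log
                  errors.clear()
                  perturb_on = not perturb_on    # alternate to prevent full adaptation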
  • A system for Real-Time Evaluation of User Attentiveness on Interactive applications according to an embodiment of the present invention is described herein, including examples of suitable architecture, functionality and external interfaces, with reference to Figs. 1 - 2B.
  • An example MSDG routine suitable for implementing the MSDG routine 40 of Fig. 2A is next described, including examples of methods for Movement monitoring and dynamic data recording, examples of methods for Sensorimotor redirecting, example Data transfer methods and an example graphical user interface.
  • A diagram of example functionalities of the MSDG routine 40 is depicted in Fig. 12.
  • An example Main Control unit/program suitable for implementing the Main Control unit/program 60 of Fig. 2A, is next described, including examples of suitable operation control according to certain embodiments of the present invention, Timing control, Raw data acquisition timing, Raw data partitioning timing, MAP timing control (with reference to Fig. 3), Data transfer control, Sensorimotor redirecting instructions, and methods for Periodic restoring of user dynamic behavior characteristic parameters.
  • the MAP units of Figs. 2A - 2B are typically operative to perform the method of Fig. 4A, e.g. as described in detail herein with reference to Fig. 4A generally and with reference to Figs. 4B - 4G by way of example.
  • An example MAP unit 70 of Fig. 2A also suitable for implementing the MAP unit of Fig. 2B, is described hereinbelow, including inputs and outputs thereof, example methods for Motor action filtering (step 410 of Fig. 4A), Motor actions sorting (step 420) including Time Length sorting and Space distance sorting, User motor error computation (step 430), and methods for computing Adaptation (step 440) according to an Averaging method or Exponential regression method both as described herein.
  • Attentiveness evaluation methods suitable for implementing step 450 of Fig. 4A, and, with reference to Fig. 6, User-related parameters computation suitable for implementing step 460 of Fig. 4A, are next described.
  • An example Internal/Virtual Memory suitable for implementing memory 80 of Fig. 2A is next described with reference to Fig. 6, including Raw data stream read/write control methods therefor.
  • AEMS attentiveness evaluation method selection
  • CPU central processing unit
  • DMMID Dynamic MMI device e.g., pc mouse, joystick, etc.
  • Certain embodiments of the present invention seek to provide a tool for typically concealed and generic real-time evaluation of user attentiveness on interactive computerized applications such as the tools shown and described herein.
  • the functional components, data structures and methods described herein for evaluating user attentiveness based on interaction with the MMI of a computer are merely exemplary and are not intended to be limiting.
  • the present invention may alternatively be implemented using other components, data structures and methods for presenting stimuli to the user through the MMI and for measuring the user's response.
  • the system shown and described herein typically evaluates the user level of attentiveness referring to the host current active application, while the user is unaware of its operation; and acquires and analyzes in real-time dynamic data recorded through any DMMID and through the sensory element controlled by the DMMID (e.g. the PC mouse and the on screen cursor), operated by the human user during his/her interaction with the current active application.
  • the system shown and described herein may function as a significant diagnostic tool within various types of hosts of different types of applications as motor simulators, courseware applications or computer games, in order to enhance their knowledge of the human user characteristics in real-time.
  • Certain embodiments of the invention generate significant improvements for a wide variety of applications e.g. some or all of those set out in the table of Fig. 7.
  • the system shown and described herein typically examines user attentiveness referring to the current active host application, based on the data of the dynamic interaction between the user and the application, acquired through any DMMID (e.g. PC mouse, joystick, paddle or wheel) used to interact with the application.
  • the system shown and described herein typically utilizes relationships between user attentiveness and the process of motor adaptation. It may function constantly or for a predefined time period, and the user is unaware of its operation.
  • the system shown and described herein typically handles its interaction with the active host application typically through one or both of the following routes:
  • the DMMID (e.g., PC mouse, joystick, steering wheel or pedal); and/or
  • the sensory elements/actions e.g. computer mouse and the on-screen cursor
  • PC mouse, joystick, steering wheel and pedal are all examples of input devices that may be placed under the category of dynamic MMI devices (i.e., DMMID).
  • the on-screen cursor, Pac-Man movements and speaker volume are all examples of sensory elements/actions (e.g., visualized on a display screen, or sounded by a speaker) that are usually controlled by the behavior of a DMMID according to human user actions.
  • redirecting includes quantifying the human user attentiveness related to the interactive application as a function of his/her control over the input device in response to the subliminal perturbations activated.
  • the system shown and described herein typically implements subliminal perturbations (e.g. some or all of the following: undersized angular perturbations of the cursor motion, below the conscious sensory threshold of the human user) upon the host screen cursor or any other sensory element/action controlled by a DMMID, and analyses the user motor adaptation characteristics which lead to the evaluation of the user attentiveness in real-time.
  • the user may be unaware of the perturbations' effects; nevertheless, the typical user adapts his/her behavior accordingly, unconsciously.
  • As the user's attentiveness to the application decreases, the quality of the user's feed-forward motor adaptation decreases.
  • the perturbation type and parameters are typically selected to fit the type of properties controlled by the input device, and the characteristics of the input device's control-function over the above properties.
  • a suitable perturbation of a cursor is an angular perturbation. If the 2D cursor was originally set to move R pixels in a direction defined by the angle alpha (or based on Cartesian coordinates: x pixels right and y pixels up) then a suitable perturbation causes the cursor to move R pixels in a direction defined by alpha+beta.
  • the perturbation angle beta may be low enough to prevent user awareness to the existence of the perturbation e.g. 10 degrees or less.
  • the perturbation can be expressed by slightly increasing/decreasing the strength of the input device's effect on the above properties.
  • the maximum strength of the perturbation may be predefined according to the characteristics of the input device's control-function over the above properties.
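  • A minimal Python sketch of the angular perturbation described above, under the stated assumption that beta is kept below the awareness threshold (e.g. under 10 degrees); the function name is illustrative.

      import math

      def perturb_cursor_delta(dx, dy, beta_deg=6.0):
          """Redirect a planned cursor displacement of R pixels along angle alpha
          so that it moves R pixels along alpha + beta instead."""
          r = math.hypot(dx, dy)                  # R, the displacement length in pixels
          alpha = math.atan2(dy, dx)              # original direction
          beta = math.radians(beta_deg)           # small, sub-threshold angular shift
          return r * math.cos(alpha + beta), r * math.sin(alpha + beta)

      # Example: a 100-pixel rightward move becomes a 100-pixel move about 6 degrees off axis.
      print(perturb_cursor_delta(100.0, 0.0))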
  • the system shown and described herein typically produces a real-time evaluation of user attentiveness as a discrete time-based function which ranges between low and high endpoints, e.g. 0-10, correlating with low and high attentiveness respectively.
  • GUI graphical user interface
  • Example block diagrams and external interfaces are illustrated in Figs. 2A - 2B.
  • the system shown and described herein may, in certain applications, comprise a hardware device (formed with FPGA/DSP or ASIC) and associated software, particularly when host CPU utilization may impede functionality of the host application.
  • the hardware device may perform some or all of the motor analysis steps of the MAP unit as described herein in detail with reference to Fig. 4A, and hold historical data independently, thus minimizing the additional load on the host CPU.
  • the present invention may include some or all of the following subsystems, e.g. as described in detail below:
    a. MSDG (Motor actions monitoring, Sensorimotor redirecting, Data transfer and Graphical user interface) routine 40 of Fig. 2A
    b. Main Control unit/program 60 of Fig. 2A
    c. One or more MAP (i.e., Motor Analysis Function) units 70 of Fig. 2A, e.g. as shown in Fig. 2B, one for each dynamic data source analysis
    d. Internal/Virtual memory unit 80 of Fig. 2A.
  • MSDG Motor actions monitoring, Sensorimotor redirecting, Data transfer and Graphical user interface
  • Another subsystem unit, the Communication Controller, may comprise a standard USB communication controller or any other type of communication controller (e.g., parallel/serial port, Bluetooth etc.) for communication between the Main Control unit and the MSDG unit over the communication channel.
  • the MSDG routine 40 may be designated to perform some or all of the following processes, e.g. as shown in Fig. 12:
  • The term 'main input data' is used herein to include some or all of: dynamic data received from the host DMMIDs controlled by the user's motor movements/forces, and/or dynamic data which defines the behavior of the elements/actions (e.g., the on-screen cursor) controlled by those devices.
  • the dynamic data refers to motion in space, handled via the relative changes in the spatial coordinates, the force vector or the velocity vector.
  • tracking the screen cursor movements may be obtained by sampling and recording the cursor 2D spatial coordinates versus time: t, x(t) & y(t) .
  • the sampling rate of the dynamic data may vary, e.g. based on a nominal value of 10 mSec, i.e. 100 Hz.
  • the recorded data may be stored in the Memory unit 80 of Fig. 2A.
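  • For example (callback and buffer names assumed, not from the specification), the monitoring process might sample the cursor position roughly every 10 mSec and buffer (t, x, y) triplets for later storage in the Memory unit.

      import time

      def record_cursor_samples(read_cursor_xy, duration_s=1.0, rate_hz=100):
          """Sample the on-screen cursor at ~rate_hz; read_cursor_xy is a
          host-supplied callback returning the current cursor coordinates."""
          samples, period, t0 = [], 1.0 / rate_hz, time.time()
          while (t := time.time() - t0) < duration_s:
              x, y = read_cursor_xy()
              samples.append((t, x, y))           # time-based 2D coordinates: t, x(t), y(t)
              time.sleep(period)                  # nominal 10 mSec sampling interval
          return samples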
  • the MSDG routine applies subliminal perturbations (e.g., spatial angle shift perturbations) to the normal behavior of the elements/actions (e.g., screen cursor) described above.
  • These perturbations may, according to certain embodiments, compel the user to perform online corrections to his planned actions through the DMMID, in order to attain the original targets the user intended to reach by his/her original actions using the input device (e.g. to move the on-screen cursor onto a specific icon and double click).
  • a user's 'original target' may be to move the screen cursor to a specific point on the screen in order to close a window session; or to turn down the volume of a loudspeaker by turning a knob or a steering wheel to a desired volume.
  • the rate of perturbation insertion e.g. the rate of inserting perturbation into each sampling of the dynamic data which defines the behavior of the element/action controlled by the host's DMMID
  • the rate of perturbation insertion is typically selected to be similar to the rate of the dynamic data sampling (e.g., 100 Hz) in order to achieve optimal synchronization between the data recording module 1210 and the sensorimotor redirecting module 1220, but is typically not so low as to cause user awareness of the perturbations.
  • the user's motor adaptation process following the perturbations may be examined constantly by the MAP unit 70 of Fig. 2B.
  • Example Data transfer methods are now described. Several types of data are transferred through a bidirectional data channel between the MSDG routine 40 and the Main Control unit/program 60.
  • the data derived from the MSDG unit 40 may include some or all of:
  • Raw data recordings of the dynamic real-time data (e.g., time-based 2D coordinates) received from the host DMMIDs (e.g., the PC mouse or joystick movements).
  • Raw data recordings of the dynamic real-time data (e.g., time based 2D coordinates) received from the host controllers of the elements controlled by the host DMMIDs.
  • the MSDG routine 40 receives some or all of the following data: a. Description of the dynamic data to be transferred.
  • GUI Graphical User Interface
  • F_ma_typ: typical rate of the dynamic motor actions (e.g., PC mouse movements) of a likely user.
  • T_ma_typ: time duration of a typical dynamic motor action of a likely user.
  • T_la_pre_ma_min: minimum time gap of low activity after the ending of one motor action, before the beginning of the next motor action.
  • T_ma_pre_vp_max: maximum time period between motor action start and the time it reaches its peak velocity/force.
  • UMSE, i.e. user motor action start/stop event characteristics (e.g. mouse left button click).
  • C_avg: threshold level for filtering the user motor actions, according to the correlation of their velocity/force function with a template time-based velocity/force function of the specific user.
  • the desired sampling rates of the dynamic data acquired from each source, predefined by default to a value of 100 Hz.
  • characteristics of the sensorimotor redirecting perturbation for each element controlled by any DMMID, e.g. some or all of the following:
  • the type of perturbation activation paradigm e.g., cyclic on/off activation with or without alternating the perturbation course as related to the original course of the element controlled by the DMMID.
  • the rate of the perturbation activation, in terms of the number of UMSEs, e.g. as defined herein.
  • the duty cycle of the activation rate may be specified by the numbers of UMSE pairs (e.g. start and stop events) occurring throughout the perturbation active period, N_pairs_pert_on, and throughout the perturbation inactive period, N_pairs_pert_off. Both numbers may be no less than 4.
  • the time-space relation between the perturbation and the action to be affected, e.g., a simple geometrical phase shift of the original space path of the element (e.g., screen cursor) whose motion may be controlled by the DMMID.
  • Maximum perturbation intensity e.g. 10° phase shift in the case of the perturbation example described above.
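  • The cyclic activation paradigm above might be driven by counting UMSE pairs, e.g. as in the following sketch; the class and attribute names are illustrative and the exact scheduling logic is assumed.

      class PerturbationScheduler:
          """Toggle the perturbation after a fixed number of UMSE (start/stop event)
          pairs, optionally alternating its course, with intensity capped at a
          sub-threshold maximum such as a 10-degree phase shift."""
          def __init__(self, n_pairs_on=4, n_pairs_off=4, max_deg=10.0, alternate=True):
              self.n_on, self.n_off = max(4, n_pairs_on), max(4, n_pairs_off)   # both >= 4
              self.max_deg, self.alternate = max_deg, alternate
              self.active, self.sign, self.pairs = True, +1, 0

          def on_umse_pair(self):
              """Call once per completed start/stop event pair."""
              self.pairs += 1
              if self.pairs >= (self.n_on if self.active else self.n_off):
                  self.pairs, self.active = 0, not self.active
                  if self.active and self.alternate:
                      self.sign = -self.sign       # reverse the perturbation course

          def current_angle_deg(self):
              return self.sign * self.max_deg if self.active else 0.0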
  • R_uri: the ratio between the rate of the perturbation activation, e.g. as defined herein, and the rate of the periodic storage of the user-related parameters tables, referring to each dynamic data source.
  • AAMS adaptation analysis method selection e.g. averaging or exponential regression as described in detail herein.
  • AAMS may be set to '0' or '1' respectively.
  • AEMS attentiveness evaluation method selection e.g. averaging or exponential regression (as explained herein).
  • AEMS may be set to '0' or '1' respectively.
  • K: attentiveness grading factor, set to a value of 8 by default, or can be reassigned to a different value in the range 5 ≤ K ≤ 9.
  • ΔT: time duration between motor action start and the time of measuring the user motor error, which may be mostly caused by the sensorimotor perturbation manipulation.
  • The ΔT value can be set only within the range of 40 mSec - 200 mSec.
  • the GUI may display a small status window on the host screen, indicating the attentiveness evaluation results as an up-to-date graph of the UAE versus time, or as a numeric value. The GUI may assert warnings whenever the ratio between the actual and the expected rate of the motor actions is too high or too low, e.g., greater than 2 or less than 0.5.
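  • As a worked example of the warning test above (thresholds taken from the text; function and variable names assumed):

      def rate_warning(actual_rate, expected_rate):
          """Warn when the observed motor-action rate deviates strongly from the
          expected (typical) rate, i.e. the ratio exceeds 2 or falls below 0.5."""
          ratio = actual_rate / expected_rate
          return ratio > 2.0 or ratio < 0.5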
  • Main Control Unit 60 typically controls operations and coordinates the functionality of internal units.
  • the Main Control unit 60, which may be based on a micro-processor and/or ASIC/FPGA/DSP, typically performs some or all of the following tasks, including interfacing as appropriate e.g. with some or all of the MSDG routine, host operating system, MAP units and/or memory unit: a. Controlling real-time operation in accordance with the administrator GUI input data.
  • the Main Control unit may control operation based on the operational attributes values entered through the administrator GUI.
  • the Main Control unit controls the system shown and described herein, e.g. as to one or more of: activation and deactivation, the list of DMMIDs (i.e., the dynamic data sources) to be analyzed, and the list of host elements whose behavior is controlled by these devices.
  • the Main Control unit may coordinate the timing of some or all of the other units. It manages the timing of initiation and activation of the units and the timing of the data transfers.
  • Fig. 3 displays the timeline of certain tasks according to certain embodiments of the present invention, coordinated by the Main Control unit, referring to a condition of acquiring and analyzing one dynamic data source. Each additional data source requires the control of an additional but identical tasks timeline.
  • Raw data acquisition timing: upon activation of the apparatus shown and described herein, dynamic data may be acquired constantly by the MSDG unit from the sources chosen by the administrator through the MSDG GUI. The same list of chosen sources may be handled and controlled by the Main Control unit 60 of Fig. 2A.
  • the dynamic data may be acquired at a sampling rate predefined separately for each data source through the MSDG GUI.
  • the raw data acquisition of each data source typically includes both the dynamic data acquired through the host's DMMID, and the dynamic data acquired through the host's controller of the element/action controlled by the same DMMID; both types of data may characterize the behavior of both the host's DMMID and the element/action controlled by it.
  • Raw data partitioning timing: the acquired raw data of each data source may be stored in a suitable Memory unit, e.g. as shown in Figs. 2A - 2B, and may be divided into consecutive blocks of varying size, in a typical range of 10k-100kByte.
  • the timing which defines the end of each block and the start of the following one may be managed by the Main Control unit.
  • Each block may contain the same number of UMSEs, e.g. as defined herein, which equals the rate of the sensorimotor perturbation as defined herein.
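  • An illustrative partitioning rule, under the assumption that each raw sample carries a flag marking UMSE occurrences (the tuple layout is hypothetical):

      def partition_by_umse(samples, umses_per_block):
          """Split a raw sample stream into consecutive blocks, closing a block
          once it contains umses_per_block user motor start/stop events (UMSEs)."""
          blocks, current, count = [], [], 0
          for sample in samples:                  # sample = (t, x, y, is_umse)
              current.append(sample)
              if sample[3]:
                  count += 1
                  if count == umses_per_block:
                      blocks.append(current)
                      current, count = [], 0
          if current:
              blocks.append(current)              # trailing, possibly partial block
          return blocks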
  • MAP timing control Following the storage of each raw data block, it may be read and analyzed by the MAP unit while simultaneously the subsequent data block may be acquired and stored in a Memory unit. Ahead of the attentiveness analysis, the MAP produces out of each raw data block a first stage analysis data block e.g. by performing steps 430, 440 and 450 of Fig. 4A, described in detail herein.
  • the first stage analysis data block may be written into the Memory unit as a condensed replacement for the original raw data block.
  • the Main Control unit controls various types of data transfers, such as but not limited to some or all of the following:
  • the Main Control unit manages the sensorimotor redirecting instructions, applied by the MSDG routine for each of the targeted elements/actions to be affected (e.g., the on-screen cursor); the targeted elements are the sensory elements controlled by the DMMIDs specified through the administrator GUI as described herein.
  • Timing of activation and deactivation e.g. the rate of the perturbation activation as described herein.
  • If N_to_ma_pert_on or N_to_ma_pert_off is found to be lower than 4, then N_pairs_pert_on or N_pairs_pert_off, respectively, is increased by one.
  • c. Intensity: the maximal intensity level is normally defined so as not to arouse user awareness of the sensorimotor redirecting perturbation.
  • the contents of the instructions are defined based on predefined variables, updated setup data received through the administrator GUI before startup, and real-time analysis of acquired data received from the MAP units during operation.
  • Each user has different characteristics of dynamic behavior (e.g. hand movements) while controlling the DMMIDs of the host computer with or without sensorimotor redirecting effects. These characteristic parameters may assist in better analysis of the user's attentiveness, by identifying the individual ranges of the user's dynamic behavior.
  • The MAP units generate structured tables, e.g. as illustrated in Figs. 10A, 10B, 11A and 11B, for each of the dynamic sources analyzed.
  • the tables are stored in the Memory unit and updated periodically based on the value of R_uri, as defined through the administrator GUI of the MSDG routine 40 of Fig. 2A, under the control of the Main Control unit.
  • the MAP units execute the main motor analysis generating real-time evaluation of the user attentiveness and other user related parameters which are utilized to achieve better fine tuning of the attentiveness estimation.
  • one MAP unit may be assigned for each data source analysis, and the final attentiveness evaluation may be determined as the weighted averaging of each MAP result.
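  • A minimal sketch of such a weighted average (the weights are assumed to be configured per data source):

      def combine_map_results(results, weights):
          """Final attentiveness evaluation as the weighted average of the per-source
          MAP results, e.g. one result for the mouse and one for a joystick."""
          return sum(r * w for r, w in zip(results, weights)) / sum(weights)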
  • Fig. 4A is a simplified flowchart illustration of an MAP method provided in accordance with certain embodiments of the present invention.
  • the method typically includes some or all of the steps shown, suitably ordered e.g. as shown.
  • first stage block analysis and storage of Fig. 3 corresponds to steps 430, 440 and 450 in Fig. 4A, and the data generated by these steps may be stored in the Memory unit of Figs. 2A - 2B under the control of the Main Control unit of Figs. 2A - 2B.
  • the user-related parameters analysis of Fig. 3 corresponds to step 460 in Fig. 4A.
  • the attentiveness analysis of Fig. 3 corresponds to step 450 in Fig. 4A.
  • the MAP typically receives two inputs from the memory unit, and produces three outputs as described below:
  • MAP inputs may include some or all of:
  • a. A block of raw data, received immediately after its acquisition and storage are completed, including the times of the UMSEs relevant to the data source; and/or b. Predefined attributes and constants, and the parameters of the sensorimotor perturbation applied, as defined through the administrator GUI.
  • MAP outputs may include some or all of:
  • Attentiveness evaluation result referring to the processed data block.
  • First stage analysis data block to be stored as a condensed replacement for the original raw data block. It comprises the UMEs, the adaptation computation results and the attentiveness evaluation results of each identified group of filtered motor actions of the analyzed raw data block.
  • Predefined attributes of UMSE definition e.g. mouse click
  • Tstart(), Tstop(): start/stop time of each motor action.
  • the input to units 520, 530 and 550 typically comprises a single data block which may include:
  • Unit 530 is typically operative to find the time of UMSEs as received through IN2: Tstart(j) and Tstop(j) (e.g., when cursor velocity exceeds V_la_pre_ma_max after being slower for at least a T_la_pre_ma_min time period)
  • Unit 520 is typically operative to find the time of UMSEs as received through IN1: Tstart(i) and Tstop(i) (e.g., time of mouse right/left button clicks)
  • In order to filter only the valid goal-oriented motor actions, unit 570 typically verifies that each possible motor action k follows a set of suitable rules such as but not limited to some or all of the following:
  • the motor velocity/force exceeds the V_la_pre_ma_max value, or more generally, the intensity/power of the motor action exceeds a predefined value suggesting it is not merely 'noise'.
  • Tstart(k)-Tstop(k-1) is longer than the period T_la_pre_ma_min, during which the motor velocity/force does not exceed the V_la_pre_ma_max value. More generally, a considerable time gap exists between each two consecutive motor actions.
  • the total time period of the k-th motor action, Tstop(k)-Tstart(k), is shorter than T_ma_max_lim but longer than T_ma_min_lim, or more generally, each motor action duration does not deviate from a predefined expected time range for the motor action in question.
  • Rules iv and v may be provided in combination or in isolation and may be replaced or augmented by any rule which ensures that the nature/form of the motor- action intensity/power function over time is similar to the form of an exemplary intensity/power function of a target oriented motor action of the individual user.
  • the rules ii - v may be replaced by any suitable method or set of rules useful in identifying a specific pattern within a continuous sampled raw data, where the 'pattern' is the target-oriented motor-action.
  • user-related parameters e.g. as received through Memory unit 560 in Fig. 4B are used to identify the motor-action pattern of the individual user by adapting the filtering rules according to the behavior of the individual user.
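  • A simplified filtering pass along the lines of the rules above (the threshold names follow the text; the candidate tuple layout is assumed):

      def filter_motor_actions(candidates, v_la_pre_ma_max,
                               t_la_pre_ma_min, t_ma_min_lim, t_ma_max_lim):
          """Keep only candidates that look like goal-oriented motor actions.
          Each candidate is (t_start, t_stop, peak_velocity)."""
          valid, prev_stop = [], None
          for t_start, t_stop, peak_v in candidates:
              duration = t_stop - t_start
              is_valid = (peak_v > v_la_pre_ma_max                         # not mere noise
                          and (prev_stop is None
                               or t_start - prev_stop >= t_la_pre_ma_min)  # quiet gap before it
                          and t_ma_min_lim < duration < t_ma_max_lim)      # plausible duration
              if is_valid:
                  valid.append((t_start, t_stop, peak_v))
              prev_stop = t_stop
          return valid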
  • Fig. 13 is a graph of an example motor-action of a screen cursor movement.
  • the output of step 570 may comprise the dynamic data per target-oriented motor action m during the period Tstart(m) ≤ t ≤ Tstop(m), e.g., the 2D coordinates [x_m(t), y_m(t)] and the velocity v_m(t) of the m-th target-oriented cursor movement, given Tstart(m) ≤ t ≤ Tstop(m).
  • Time Length Sorting: the optimal time length T_ma_opt of each motor action may be based on a predefined time constant defined through the administrator GUI (e.g., T_ma_typ, as described herein) and on the user-related updated variable.
  • TLW, i.e. time length weight.
  • T_ma is computed, e.g. in step 640 of Fig. 4C, as:
  • the spatial distance of a motor action may be defined as the shortest motor distance for arriving from the original starting point of the action to its ending point.
  • the spatial distance of the onscreen movement may be the direct 2D distance (in cm or pixels) between the movement starting point and the movement end point.
  • the 'combined sorting coefficient' of each motor action may be computed as in step 660 of Fig. 4C:
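  • The formulas of steps 640 and 660 are not reproduced above; purely as a hypothetical stand-in, a combined sorting coefficient could weight each action by how close its duration is to the optimal length and by its direct 2D distance, e.g.:

      import math

      def combined_sorting_coefficient(t_start, t_stop, start_xy, end_xy,
                                       t_ma_opt, d_ref):
          """Hypothetical weighting: close to 1.0 for an action of near-optimal
          duration covering at least the reference distance d_ref, smaller otherwise."""
          duration = t_stop - t_start
          tlw = 1.0 / (1.0 + abs(duration - t_ma_opt) / t_ma_opt)   # time-length weight (TLW)
          sdw = min(1.0, math.dist(start_xy, end_xy) / d_ref)       # spatial-distance weight
          return tlw * sdw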
  • Each motor action may be analyzed in order to compute its related UME, i.e. user motor error.
  • the UME indicates the size of the user error in directing the motion of the element/action controlled by the DMMID effectively towards the intended target, e.g., moving the on-screen cursor to point over a specific icon on the PC monitor. Therefore, the time-space paths of all the filtered motor actions may be identified, and mainly the spatial coordinates of the starting and ending points.
  • TAN_ma of each motor action path may typically be defined and computed as the tangent of the spatial angle between two straight lines, Line1 and Line2, or may be computed by any other suitable function which identifies the deviation between the following Line1 and Line2.
  • Line2: the imaginary spatial distance line connecting the starting point A to the motor action path position at a ΔT period following the timing of the starting point.
  • the UME of each motor action may be defined as (TAN_ma).
  • Fig. 4E displays an example of the location of the spatial angle of the on-screen cursor motor action path between its starting point A and its ending point B.
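  • A sketch of the UME computation, assuming Line1 is the straight line from starting point A to ending point B (its definition is not reproduced above) and Line2 connects A to the path position ΔT after the start; the path layout (a list of (t, x, y) samples) is assumed.

      import math

      def user_motor_error(path, delta_t):
          """TAN_ma: tangent of the angle between Line1 (start A to end B, assumed)
          and Line2 (start A to the path position delta_t after the start)."""
          t0, ax, ay = path[0]
          _, bx, by = path[-1]                                   # ending point B
          px, py = next(((x, y) for t, x, y in path if t - t0 >= delta_t), (bx, by))
          angle1 = math.atan2(by - ay, bx - ax)                  # Line1 direction
          angle2 = math.atan2(py - ay, px - ax)                  # Line2 direction
          return math.tan(angle1 - angle2)                       # the UME may be based on TAN_ma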
  • Adaptation Computation step 440 is now described in detail, according to certain embodiments of the present invention, e.g. in accordance with the flowchart illustration of Fig. 4F.
  • the user motor adaptation to the control of each DMMID may be expressed by two time based functions: a.
  • the time variable t may be a discrete variable
  • T_n holding the values of the averaged times of the target-oriented motor actions of group n (step 910 in Fig. 4F), i.e. t ∈ {T_1, T_2, T_3, ...}.
  • Fig. 5 displays exemplary timings of motor actions obtained by the user (also termed herein “motor action events"), such as cursor movements or loudspeaker's volume reduction/increase actions, and the T n time of each group.
  • each data block includes two groups: one which includes no sensorimotor perturbation and one which includes sensorimotor perturbation.
  • the analysis technique may be based on two feasible methods, averaging and exponential regression. Through the administrator GUI, it can be predefined which method is to be used, according to the value of AAMS, e.g. either of the following two methods:
  • a. Averaging Method: E_0(n) may be obtained by averaging the UMEs of the m target-oriented motor actions of group n, considering the combined sorting coefficient of each motor action.
  • λ(n) computation may be unavailable according to this method.
  • b. Exponential Regression Method: E_0(n) and λ(n) are obtained by fitting a simple exponential regression curve.
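  • The two adaptation analysis methods might be sketched as follows; the exponential form UME(t) ≈ E0·exp(-λ·t) is an assumption, since the regression equation itself is not reproduced above.

      import math

      def adaptation_averaging(umes, weights):
          """Averaging method: weighted mean of the group's UMEs, using the combined
          sorting coefficient of each motor action as its weight (lambda unavailable)."""
          return sum(u * w for u, w in zip(umes, weights)) / sum(weights)

      def adaptation_exponential(times, umes):
          """Exponential regression method (assumed form UME(t) ~ E0 * exp(-lam * t)):
          log-linear least-squares fit returning (E0, lam)."""
          ys = [math.log(max(u, 1e-9)) for u in umes]             # guard against log(0)
          n = len(times)
          mx, my = sum(times) / n, sum(ys) / n
          slope = (sum((x - mx) * (y - my) for x, y in zip(times, ys))
                   / sum((x - mx) ** 2 for x in times))
          return math.exp(my - slope * mx), -slope                # (E0, lam)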
  • Attentiveness Evaluation step 450 is now described in detail, according to certain embodiments of the present invention, e.g. in accordance with the flowchart illustration of Fig. 4G.
  • This step receives as input the E_0'(n) and λ'(n) values computed in the previous step.
  • the real-time evaluation of user attentiveness may be accomplished by computing the ratio between the real-time adaptation values with and without the presence of the sensorimotor redirection perturbation.
  • Each group of consecutive filtered motor actions relates to a period of the same state (e.g., on/off) of sensorimotor perturbation, and each group holds its own motor adaptation analysis results E_0'(n), λ'(n) and T_n.
  • the averaged motor adaptation results of all the groups which relate to the same perturbation state, e.g. E_ST0, E_ST1, λ_ST0 and λ_ST1
  • the averaged adaptation results of all the groups, e.g. E_Total and λ_Total
  • E_ST0 = mean[E_0'(n)] over all the groups which relate to the ST0 perturbation state.
  • E_ST1 = mean[E_0'(n)] over all the groups which relate to the ST1 perturbation state.
  • E_Total = mean[E_0'(n)] over all the groups.
  • λ_ST0 = mean[λ'(n)] over all the groups which relate to the ST0 perturbation state.
  • λ_ST1 = mean[λ'(n)] over all the groups which relate to the ST1 perturbation state.
  • λ_Total = mean[λ'(n)] over all the groups. [Attentiveness grading expression combining terms such as E_0'(n) - E_ST1, E_0'(n) - E_Total and E_0'(n) - E_0'(n-1).]
  • E_off = mean[E_0'(n)] over all the groups which relate to the perturbation 'off' state.
  • E_on = mean[E_0'(n)] over all the groups which relate to the perturbation 'on' state.
  • E_Total = mean[E_0'(n)] over all the groups.
  • λ_off = mean[λ'(n)] over all the groups which relate to the perturbation 'off' state.
  • λ_on = mean[λ'(n)] over all the groups which relate to the perturbation 'on' state.
  • λ_Total = mean[λ'(n)] over all the groups.
  • S may be equal to 0 or 1, predefined e.g. in accordance with the administrator GUI.
  • K = 8 by default, or can be set to a different value, e.g. in accordance with the administrator GUI.
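  • Purely as an illustration of comparing adaptation with and without the perturbation on the 0-10 scale (this is not the patent's grading formula, which is not reproduced above; the grading factor k stands in for K):

      def attentiveness_grade(e_on_groups, e_off_groups, k=8.0):
          """Illustrative 0-10 grade from the mean adaptation error with the perturbation
          on versus off; a user who compensates well keeps E_on close to E_off."""
          e_on = sum(e_on_groups) / len(e_on_groups)
          e_off = sum(e_off_groups) / len(e_off_groups)
          excess = max(0.0, (e_on - e_off) / max(e_off, 1e-9))    # relative extra error
          return max(0.0, min(10.0, 10.0 - k * excess))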
  • Each individual user has some typical values related to his/her own characteristic movement parameters and his/her own quality of adaptation. These time-dependent parameters are initially defined to hold specific values, but are later updated periodically (based on the R_uri value), e.g. in accordance with the specific user's performance, referring to the same dynamic data source.
  • Figs. 9A— 9B taken together, define user-related parameters for each data source, some or all of which may be computed or otherwise provided in the context of the user-related provisional step 460 of Fig. 4A, all in accordance with certain embodiments of the present invention.
  • Figs. 10A - 10B, taken together, form a table describing computation or determination of user-related parameters, some or all of which may be performed in the user-related parameter computation step 460 of Fig. 4A, all in accordance with certain embodiments of the present invention.
  • Fig. 10A lists user-related parameters of each data source according to certain embodiments of the present invention. As illustrated in Fig. 10B, some of these parameters may be defined as constants, some predefined through the administrator GUI, and some statistically computed in real-time based on the MAP unit analysis results of the latest data blocks acquired.
  • Figs. 11A - 11C, taken together, form a table suggesting example values or example computations useful in providing user-related parameters in the context of the user-related provisional step 460 of Fig. 4A, all in accordance with certain embodiments of the present invention.
  • the Memory unit 80 of Fig. 2A is now described in detail, according to certain embodiments of the present invention.
  • This unit represents a real or virtual memory space.
  • this memory space can be implemented on a RAM/FLASH element.
  • this memory space may be actually a virtual space allocated by the host operating system.
  • This memory space may hold some or all of the following data as illustrated in Fig. 6 which is a diagram of an example of the memory contents, which may hold different values related to each dynamic data source.
  • the term dynamic data source refers to the dynamic data acquired through any of the host's DMMID in addition to the dynamic data acquired through the host's controllers of the element/action controlled by the same DMMID.
  • the memory contents typically include some or all of:
  • the raw data block most recently stored (including the raw data acquired through the host's DMMID and through the host's controller of the element/action controlled by the same DMMID), and all the first-stage analysis data blocks (e.g. as generated by steps 430, 440 and 450 of Fig. 4A) corresponding respectively to the previous raw data block recordings of dynamic real-time data acquired by the MSDG unit; operational setup parameters defined by the administrator GUI; all the user-related parameters computed by the MAP unit; and the latest attentiveness evaluation result, obtained by the MAP unit.
  • Raw Data Stream Read/ Write Control for the internal/virtual memory 80 of Fig. 2A is now described in detail, according to certain embodiments of the present invention. Memory read/write transactions are controlled by the Main Control unit.
  • Blocks of real-time raw data monitored through the host DMMIDs are constantly written into the memory. Subsequently, each written block is then read by the MAP unit and analyzed in order to produce its first stage analysis data block, the attentiveness results and the computed user-related parameters.
  • the user-related parameters may be computed only if required according to the T_url time rate and not otherwise.
  • each block of raw data may be exchanged for the first stage analysis data block derived from the same original data block.
  • Training Simulators: Use of simulation for training has been proven effective over the years as a means for developing human competencies by military organizations. Real time evaluation of a trainee's attentiveness may be utilized during real time operation of the simulator and post simulation.
  • a pilot's decision-making is affected by his attentiveness to key events occurring prior to the time a challenging episode takes place.
  • a simulation system constructed according to certain embodiments of the present invention can:
  • Control Systems and Command & Control Systems: Large control centers such as in airports and telecommunication companies involve human interaction with multiple monitors and control systems. In order to assure high quality control, employees must remain attentive throughout their shifts, monitoring the various parameters and interacting with the systems. Lack of attention in control centers could lead to catastrophic results, so the need for attention level identification is crucial. Certain embodiments provide real-time detection of low attention levels of the control personnel. The system may alert these personnel when their attention has wandered, creating a higher level of security with regard to human control.
  • This action may be operative to notify the operator that he should pay more attention to the operation of the control system. In addition, if the operator's attentiveness is still measured low later on, then an alarming notification may be sent to his supervisor. In this case the supervisor may contact the operator and ensure that his attention to the control system is not disturbed by an irrelevant factor, and consider the need for additional human resources to deal with the present circumstances.
  • the ability to identify a gamer's attention level during a game, in real time, may be provided according to certain embodiments. If a gamer is losing interest in the game, enhancement of gaming conditions, e.g. introducing competition or other computerized community features, may help regain the gamer's interest and lengthen the time spent on the game.
  • Courseware, eLearning or brain training programs are used by the user to either learn a subject or exercise brain function.
  • In programs geared towards ADD, ADHD or autism, it is important to hold, and correlate with, the users' attention levels.
  • These systems can shift strategies when the user's concentration level is sub-par.
  • Certain embodiments of the present invention may be incorporated in courseware, e-learning and brain training programs.
  • the system may "signal" when the user's attention level has dropped, allowing automatic adaptation of the teaching or training strategy used.
  • new language educational courseware applications are based on different activity screens and tasks. Individually, each student may prefer certain activities over others, and may avoid learning significant materials if presented in one format or scheme rather than another.
  • the system shown and described herein may provide online, real time assessment of the users' attention level, ensuring that they are actually engaged in the application, enabling real time shifting of formats and schemes, e.g. activity screens, to enhance learning of each educational topic.
  • Websites employ various schemes for retaining visitors and keeping them engaged. Website usability is a key factor in increasing retention. Usability encompasses a number of things such as ease of use, and user attraction or attentiveness.
  • One of the current methods of assessing a website's usability is to conduct custom user experience research during the design process. This method requires a number of iterations where a website design is tested by the custom users; their user experience is evaluated by researchers; and the design is altered accordingly.
  • An example of a firm providing such a consulting service is AnswerLab. Alternate methods of improving retention rates are by employing customer experience analytics that analyze the usability of a website based on data gathered on visitors' surfing habits.
  • Another example is the effect of pop-up displays which may appear at different screen spots, on the user's interaction with the website. If the time of banner pop-up has high correlation with a long period of elevated attentiveness measured, it may suggest that the pop-up banners may have grabbed the users' attention for longer periods than desired. In this case, the website designer may redesign the characteristics of the pop-up banners, minimizing their negative side effects.
  • Fig. 8A is a simplified semi-pictorial semi-functional block diagram illustration of an attentiveness-responsive computerized system constructed and operative in accordance with certain embodiments of the present invention.
  • Fig. 8B is a simplified flowchart illustration of an example method of operation for the apparatus of Fig. 8A.
  • IN1, IN2 and OUT1 in Fig. 8A and in Fig. 8B are one and the same.
  • each interactive application which includes a motor-controlled HMI device such as but not limited to PC mouse, keyboard or joystick, is characterized by a transfer function relating the control directives of this device to a specific sensory element/action such as but not limited to an on-screen cursor.
  • the interactive application in conjunction with one or more functional units e.g. as described herein, performs some or all of the following actions:
  • the evaluated attentiveness of the subjects to the visual-motor task based on the system shown and described herein was validated by comparing the results associated with the periods during which the subjects handled two simultaneous tasks, to the results associated with the periods during which the subjects handled the visual-motor task only.
  • the attentiveness results of the first type were consistently lower than the results of the second type. The difference was statistically significant (p < 0.01).
  • software components of the present invention including programs and data may, if desired, be implemented in ROM (read only memory) form including CD-ROMs, EPROMs and EEPROMs, or may be stored in any other suitable computer-readable medium such as but not limited to disks of various kinds, cards of various kinds and RAMs.
  • ROM read only memory
  • EEPROM electrically erasable programmable read-only memory
  • Components described herein as software may, alternatively, be implemented wholly or partly in hardware, if desired, using conventional techniques.
  • components described herein as hardware may, alternatively, be implemented wholly or partly in software, if desired, using conventional techniques.
  • Any computer-readable or machine-readable media described herein is intended to include non-transitory computer- or machine-readable media.
  • Any computations or other forms of analysis described herein may be performed by a suitable computerized method. Any step described herein may be computer-implemented.
  • the invention shown and described herein may include (a) using a computerized method to identify a solution to any of the problems or for any of the objectives described herein, the solution optionally including at least one of a decision, an action, a product, a service or any other information described herein that impacts, in a positive manner, a problem or objective described herein; and (b) outputting the solution.

Abstract

A method for evaluating attentiveness of a user of an interactive computerized application, the application employing a motor-controlled (e.g.) user input device operative to accept a stream of user inputs and responsively, to generate a stream of user-perceptible application behaviors which have a pre-defined initial correspondence with user inputs in the stream of user inputs, the method comprising imperceptibly modifying the pre-defined initial correspondence, thereby to define a modified correspondence between user-perceptible application behaviors generated responsive to the user inputs, and the user inputs; generating an ongoing indication of user attentiveness including measuring at least one aspect of the user's compensation for differences between the modified and initial correspondences; and accommodating for the indication of user attentiveness by modifying at least one aspect of subsequent operation of the on-line computerized application vis a vis the user.

Description

System and methods which monitor for attentiveness by altering transfer function governing user-perceptible behaviors vs. user input
REFERENCE TO CO-PENDING APPLICATIONS
Priority is claimed from US provisional application No. 61/228,975, entitled
"Real-Time Evaluation of User Attentiveness on Interactive computerized applications" and filed 07/28/2009.
FIELD OF THE INVENTION
The present invention relates generally to interactive systems and more particularly to monitoring performance of a user thereof.
BACKGROUND OF THE INVENTION
Conventional technology pertaining to certain embodiments of the present invention is described in the following publications inter alia:
US Patent 7,310,609 to Middleton III et al describes a system for tracking user micro-interactions with web page advertising.
Published US Application No. US 2007/0139362 to Colton et al describes a health management system for personal computer users which monitors real time interactions of a user with a keyboard, mouse or other to determine strain induced in a user.
Published PCT Application WO 01/67214 describes a system and method for tracking user interaction with a graphical user interface.
The disclosures of all publications and patent documents mentioned in the specification, and of the publications and patent documents cited therein directly or indirectly, are hereby incorporated by reference.
SUMMARY OF THE INVENTION
Certain embodiments of the present invention seek to provide a system and methods which monitor for attentiveness by altering transfer function governing user- perceptible behaviors vs. user input.
Certain embodiments of the present invention seek to provide apparatus and methods for evaluating a user of a computing device, comprising receiving user inputs via a MMI of the computing device; providing sensory feedback to the user via the MMI in response to the user inputs; introducing one or more perturbations into the sensory feedback; processing the user inputs so as to measure a response of the user to the perturbations; and analyzing the measured response so as to provide an indication of a level of attentiveness of the user.
Certain embodiments of the present invention seek to provide an interactive computerized application typically having hardware and software components including data storage functionality such as memory, the application including:
a. a user input device operative to accept a stream of user inputs and responsively, to generate a stream of user-perceptible application behaviors which have a pre-defined initial correspondence with user inputs in the stream of user inputs, the user input device including or operating in conjunction with an input- application behavior correspondence manipulator operative for imperceptibly modifying the pre-defined initial correspondence based on which the user input device is operative to generate the user-perceptible behaviors, thereby to define a modified correspondence between user-perceptible application behaviors generated responsive to the user inputs, and the user inputs;
b. a user compensation monitor, which may be a module in a processor, operative for generating an ongoing indication of user attentiveness including measuring at least one aspect of the user's compensation for differences between the modified and initial correspondences and computing an indication of user attentiveness; and
c. an attentiveness accommodator, which may operate in conjunction with or integrally with or within at least one output or display device, operative to accommodate for the indication of user attentiveness by adapting at least one aspect of subsequent operation, typically including at least one output or display-generating operation of the interactive application. There is thus provided, in accordance with at least one embodiment of the present invention, a method for evaluating attentiveness of a user of an interactive application, the application employing a motor-controlled user input device operative to accept a stream of user inputs and responsively, to generate a stream of user-perceptible application behaviors which have a pre-defined initial correspondence with user inputs in the stream of user inputs, the method comprising imperceptibly modifying the pre-defined initial correspondence, thereby to define a modified correspondence between user-perceptible application behaviors generated responsive to the user inputs, and the user inputs; generating an ongoing indication of user attentiveness including measuring at least one aspect of the user's compensation for differences between the modified and initial correspondences; and accommodating for the indication of user attentiveness by modifying at least one aspect of subsequent operation of the on-line computerized application vis a vis the user.
Also provided, in accordance with at least one embodiment of the present invention, is a computer-implemented method for evaluating attentiveness of a user of an interactive application, the application employing a user input device operative to accept a stream of user inputs and responsively, to generate a stream of user- perceptible application behaviors which have a pre-defined initial correspondence with user inputs in the stream of user inputs, the method comprising using at least one processor for:
a. imperceptibly modifying the pre-defined initial correspondence, thereby to define a modified correspondence between user-perceptible application behaviors generated responsive to the user inputs, and the user inputs;
b. generating an ongoing indication of user attentiveness including measuring at least one aspect of the user's compensation for differences between the modified and initial correspondences; and
c. accommodating for the indication of user attentiveness by adapting at least one aspect of subsequent operation of the interactive application.
Optionally, one or more processors may be employed to perform each of the above steps. Further in accordance with at least one embodiment of the present invention, at least one aspect of the user's compensation includes an aspect of quality of compensation.
Still further in accordance with at least one embodiment of the present invention, at least one aspect of the user's compensation includes an aspect of speed of compensation.
Additionally in accordance with at least one embodiment of the present invention, the user input device comprises a user input generator controlled by the user, such as but not limited to: a mouse, keyboard, joystick, steering wheel or pedal. It is appreciated that user input may also be generated by a user without resorting to an input generator, e.g. if a user input device is voice-activated.
Further in accordance with at least one embodiment of the present invention, the imperceptibly modifying step is differentially activated at a differentiation rate which prevents a user from reaching full adaptation. For example, the imperceptibly modifying step may be alternately activated and de-activated at a rate which is fast enough to prevent a user from reaching full adaptation. Alternatively, the imperceptibly modifying step may be activated to produce different e.g. opposite effects, at a rate which is fast enough to prevent a user from reaching full adaptation. For example, tone volume may be alternately increased and decreased at a rate which is fast enough to prevent a user from reaching full adaptation. If the 'modifying' operation is activated continuously, the adaptation process takes place only once and the user eventually adapts his movements/actions reaching good quality and speed as time goes on. On the other hand, if the 'modifying' operation is alternately activated/deactivated, the user adapts his movements/actions repeatedly, never reaching full adaptation state. Thus, the user's up-to-date adaptation characteristics may be computed continuously in real-time.
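By way of a non-authoritative illustration only, the alternating activation/deactivation described above might be scheduled as in the following sketch; the class name, the counting of user motor action start/stop (UMSE) pairs and the default counts of 4 are assumptions drawn from the description herein rather than a definitive implementation.

```python
class PerturbationScheduler:
    """Alternately activates and deactivates the imperceptible modification so
    that the user adapts repeatedly and never reaches a full-adaptation state.
    Sketch only; the UMSE-pair duty cycle and its defaults are assumptions."""

    def __init__(self, n_pairs_pert_on=4, n_pairs_pert_off=4):
        self.n_pairs_pert_on = n_pairs_pert_on    # pairs counted while the modification is active
        self.n_pairs_pert_off = n_pairs_pert_off  # pairs counted while the modification is inactive
        self.active = True
        self._pair_count = 0

    def on_umse_pair(self):
        """Call once per user motor action start/stop event pair; returns whether
        the modified correspondence should currently be applied."""
        self._pair_count += 1
        limit = self.n_pairs_pert_on if self.active else self.n_pairs_pert_off
        if self._pair_count >= limit:
            self.active = not self.active  # toggle between modified and unmodified correspondence
            self._pair_count = 0
        return self.active
```

In such a sketch, the scheduler would be consulted once per completed motor action, so that the correspondence is modified for a few actions, restored for a few actions, and so on, preventing full adaptation.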
Also provided, in accordance with at least one embodiment of the present invention, is a computerized system for evaluating attentiveness of a user of an interactive computerized application, the application employing a user input device operative to accept a stream of user inputs and responsively, to generate a stream of user-perceptible application behaviors which have a pre-defined initial correspondence with user inputs in the stream of user inputs, the system comprising an input-application behavior correspondence manipulator operative for imperceptibly modifying the pre-defined initial correspondence based on which the user input device is operative to generate the user-perceptible behaviors, thereby to define a modified correspondence between user-perceptible application behaviors generated responsive to the user inputs, and the user inputs; a user compensation monitor operative for generating an ongoing indication of user attentiveness including measuring at least one aspect of the user's compensation for differences between the modified and initial correspondences and using a processor to compute an indication of user attentiveness; and an attentiveness accommodator operative to accommodate for the indication of user attentiveness by adapting at least one aspect of subsequent operation of the interactive application.
Further in accordance with at least one embodiment of the present invention, the application comprises a training simulator and the attentiveness accommodator is operative to change characteristics of a warning indication provided to a user to alert the user to potential dangers.
Still further in accordance with at least one embodiment of the present invention, the attentiveness accommodator is operative to compute threshold warning indication characteristics which are sufficient to cause the user to be attentive to potential dangers.
Additionally in accordance with at least one embodiment of the present invention, the application comprises a control center controlling a population of human operators and the attentiveness accommodator is operative to alert each individual human operator each time the individual operator is suffering from low attentiveness.
Further in accordance with at least one embodiment of the present invention, once the attentiveness accommodator has sent a first alert to an individual human operator that the individual operator is suffering from low attentiveness, the attentiveness accommodator is also operative to send a second alert to a supervisor that the individual operator is suffering from low attentiveness, if the first alert does not sufficiently improve the individual operator's attentiveness.
Still further in accordance with at least one embodiment of the present invention, the application comprises a computerized game environment and the attentiveness accommodator is operative to enhance attractiveness of at least one characteristic of the computerized game environment if a user of the computerized game environment is found by the user compensation monitor to have a low attentiveness level.
Additionally in accordance with at least one embodiment of the present invention, the application comprises an e-learning/courseware environment and the attentiveness accommodator is operative to classify learning paradigms presented to a learning user as a function of attentiveness which they engender in the user and to replace, in a learning session with an individual user, learning paradigms which have engendered low attentiveness for the individual user, with learning paradigms which engender high attentiveness for the individual user.
Further in accordance with at least one embodiment of the present invention, the application comprises a website and the attentiveness accommodator is employed to identify a user's attraction to specific website locations during the engagement session between the individual user and the website.
Additionally in accordance with at least one embodiment of the present invention, the application comprises a website developers' environment and the input-application behavior correspondence manipulator and the user compensation monitor are utilized during pre-testing of a website under development and the attentiveness accommodator is operative to alert an individual developer of problematic locations within the website which are characterized by low attentiveness so as to allow the website developer to modify the problematic locations.
Further in accordance with at least one embodiment of the present invention, the imperceptibly modifying comprises acquiring real-time data characterizing at least one characteristic of the input device; acquiring real-time data characterizing a behavior of a sensory component controlled by the input device; and breaking the direct control of the interactive application over the sensory component controlled by the input device by imperceptibly perturbing the sensory component's normal behavior.
Still further in accordance with at least one embodiment of the present invention, the imperceptibly modifying also comprises quantifying the human user attentiveness related to the interactive application as a function of his/her control over the input device in response to the subliminal perturbations activated.
Additionally in accordance with at least one embodiment of the present invention, the at least one characteristic of the input device comprises at least one of the following: motion affecting the input device, force affecting the input device, user's voice affecting the input device, indication of input device button clicks, indication of keyboard keys' tapping.
Further in accordance with at least one embodiment of the present invention, the user's up-to-date adaptation characteristics are computed continuously in realtime.
Still further in accordance with at least one embodiment of the present invention, the user-perceptible application behavior comprises at least one of: cursor position, characteristic such as size and/or position of a displayed element which may not include a cursor, tone volume; tone pitch; color.
Additionally in accordance with at least one embodiment of the present invention, the adapting comprises modifying at least one aspect of subsequent operation of the interactive application vis a vis the user. Also provided is a computer program product, comprising a computer usable medium or computer readable storage medium, typically tangible, having a computer readable program code embodied therein, the computer readable program code adapted to be executed to implement any or all of the methods shown and described herein. It is appreciated that any or all of the computational steps shown and described herein may be computer-implemented. The operations in accordance with the teachings herein may be performed by a computer specially constructed for the desired purposes or by a general purpose computer specially configured for the desired purpose by a computer program stored in a computer readable storage medium.
Any suitable processor, display and input means may be used to process, display e.g. on a computer screen or other computer output device, store, and accept information such as information used by or generated by any of the methods and apparatus shown and described herein; the above processor, display and input means including computer programs, in accordance with some or all of the embodiments of the present invention. Any or all functionalities of the invention shown and described herein may be performed by a conventional personal computer processor, workstation or other programmable device or computer or electronic computing device, either general-purpose or specifically constructed, used for processing; a computer display screen and/or printer and/or speaker for displaying; machine-readable memory such as optical disks, CDROMs, magnetic-optical discs or other discs; RAMs, ROMs, EPROMs, EEPROMs, magnetic or optical or other cards, for storing, and keyboard or mouse or steering wheel or pedal or any other input device for accepting. The term "process" as used above is intended to include any type of computation or manipulation or transformation of data represented as physical, e.g. electronic, phenomena which may occur or reside e.g. within registers and /or memories of a computer. The term processor includes a single processing unit or a plurality of distributed or remote such units.
The above devices may communicate via any conventional wired or wireless digital communication means, e.g. via a wired or cellular telephone network or a computer network such as the Internet.
The apparatus of the present invention may include, according to certain embodiments of the invention, machine readable memory containing or otherwise storing a program of instructions which, when executed by the machine, implements some or all of the apparatus, methods, features and functionalities of the invention shown and described herein. Alternatively or in addition, the apparatus of the present invention may include, according to certain embodiments of the invention, a program as above which may be written in any conventional programming language, and optionally a machine for executing the program such as but not limited to a general purpose computer which may optionally be configured or activated in accordance with the teachings of the present invention. Any of the teachings incorporated herein may, wherever suitable, operate on signals representative of physical objects or substances.
The embodiments referred to above, and other embodiments, are described in detail in the next section.
Any trademark occurring in the text or drawings is the property of its owner and occurs herein merely to explain or illustrate one example of how an embodiment of the invention may be implemented.
Unless specifically stated otherwise, as apparent from the following discussions, it is appreciated that throughout the specification discussions, utilizing terms such as, "processing", "computing", "estimating", "selecting", "ranking", "grading", "calculating", "determining", "generating", "reassessing", "classifying", "generating", "producing", "stereo-matching", "registering", "detecting",
"associating", "superimposing", "obtaining" or the like, refer to the action and/or processes of a computer or computing system, or processor or similar electronic computing device, that manipulate and/or transform data represented as physical, such as electronic, quantities within the computing system's registers and/or memories, into other data similarly represented as physical quantities within the computing system's memories, registers or other such information storage, transmission or display devices. The term "computer" should be broadly construed to cover any kind of electronic device with data processing capabilities, including, by way of non-limiting example, personal computers, servers, computing system, communication devices, processors (e.g. digital signal processor (DSP), microcontrollers, field programmable gate array (FPGA), application specific integrated circuit (ASIC), etc.) and other electronic computing devices.
The present invention may be described, merely for clarity, in terms of terminology specific to particular programming languages, operating systems, browsers, system versions, individual products, and the like. It will be appreciated that this terminology is intended to convey general principles of operation clearly and briefly, by way of example, and is not intended to limit the scope of the invention to any particular programming language, operating system, browser, system version, or individual product.
BRIEF DESCRIPTION OF THE DRAWINGS
Certain embodiments of the present invention are illustrated in the following drawings:
Fig. 1 is a simplified flowchart illustration of a computer-implemented method for evaluating attentiveness of a user of an interactive device, operative in accordance with certain embodiments of the present invention.
Fig. 2A is a simplified functional block diagram of a tool for Real-Time Evaluation of User Attentiveness on Interactive applications constructed and operative in accordance with a first embodiment of the present invention.
Fig. 2B is a simplified functional block diagram of a tool for Real-Time Evaluation of User Attentiveness on Interactive applications constructed and operative in accordance with a second embodiment of the present invention.
Fig. 3 is a timeline of certain tasks, some or all of which may be performed by the systems of Figs. 2A - 2B, where tasks are represented by textured rectangles, rectangles having similar textures representing tasks performed on a single (i.e. the same) data block.
Fig. 4A is a simplified flowchart illustration of a motor action analysis method which may be performed by the MAP unit of Fig. 2A or Fig. 2B and is operative in accordance with certain embodiments of the present invention.
Fig. 4B is a simplified flowchart illustration of a method for performing the motor action filtering step of Fig. 4A in accordance with certain embodiments of the present invention.
Fig. 4C is a simplified flowchart illustration of a method for performing the combined sorting coefficient computation step of Fig. 4A in accordance with certain embodiments of the present invention.
Fig. 4D is a simplified flowchart illustration of a method for performing the user motor error computation step of Fig. 4A in accordance with certain embodiments of the present invention.
Fig. 4E is a pictorial illustration of an example location of a spatial angle θ of an on-screen cursor motor action path between starting and ending points thereof, which is useful in understanding the method of Fig. 4D.
Fig. 4F is a simplified flowchart illustration of a method for performing the adaptation computation step of Fig. 4A in accordance with certain embodiments of the present invention.
Fig. 4G is a simplified flowchart illustration of a method for performing the attentiveness evaluation step of Fig. 4A in accordance with certain embodiments of the present invention.
Fig. 5 is a graph of suitable timings of motor actions in accordance with certain embodiments of the present invention.
Fig. 6 is a simplified diagram of a data structure which may be utilized by the memories of Figs. 2A - 2B in accordance with certain embodiments of the present invention.
Fig. 7 is a table of some of the many applications in which certain embodiments of the present invention may be implemented and of the added functionality which may be provided in accordance with certain embodiments of the present invention.
Fig. 8A is a simplified functional block diagram of a system for Real-Time Evaluation of User Attentiveness on Interactive applications which may incorporate the tools of Fig. 2A or Fig. 2B and which is constructed and operative in accordance with a first embodiment of the present invention.
Fig. 8B is a simplified flowchart illustration of an example method of operation for the system of Fig. 8A in accordance with certain embodiments of the present invention.
Figs. 9A - 9B, taken together, define user-related parameters for each data source, some or all of which may be computed or otherwise provided in the context of the user-related provisional step 460 of Fig. 4A, all in accordance with certain embodiments of the present invention.
Figs. 10A - 10B, taken together, form a table describing computation or determination of user-related parameters, some or all of which may be performed in the user-related parameter computation step of Fig. 4A, all in accordance with certain embodiments of the present invention.
Figs. 11A - 11C, taken together, form a table suggesting example values or example computations useful in providing user-related parameters in the context of the user-related provisional step 460 of Fig. 4A, all in accordance with certain embodiments of the present invention.
Fig. 12 is a diagram of example functionalities of the MSDG routine of Figs. 2A - 2B.
Fig. 13 is a graph of an example motor-action of a screen cursor movement.
DETAILED DESCRIPTION OF CERTAIN EMBODIMENTS
Fig. 1 is a simplified flowchart illustration of a computer-implemented method for evaluating attentiveness of a user of an interactive device, the method including some or all of the following steps, suitably ordered e.g. as shown:
Step 10: providing an interactive application employing a user input device operative to accept a stream of user inputs and responsively, to generate a stream of user-perceptible application behaviors which have a pre-defined initial correspondence with user inputs in the stream of user inputs, and at least one processor, which may or may not be integrated with the application, for performing at least one of the following steps 12 - 16.
Step 12: imperceptibly modifying the pre-defined initial correspondence, thereby to define a modified correspondence between user-perceptible application behaviors generated responsive to the user inputs, and the user inputs, wherein optionally, the imperceptibly modifying step is differentially activated at a differentiation rate which prevents a user from reaching full adaptation.
Step 14: generating an ongoing indication of user attentiveness including measuring at least one aspect of the user's compensation for differences between the modified and initial correspondences, e.g. an aspect of quality of compensation and/or an aspect of speed of compensation, and using a processor to compute an indication of attentiveness based on the measurements.
Step 16: accommodating for the indication of user attentiveness by adapting at least one aspect of subsequent operation of the interactive application.
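Purely as an illustrative sketch of how steps 12 - 16 might be chained, and under the assumptions that the measured compensation error is normalized to the range 0 - 1 and that attentiveness is graded on a 0 - 10 scale, the evaluation loop could resemble the following; all function hooks and names are hypothetical placeholders rather than elements disclosed herein.

```python
def attentiveness_evaluation_loop(raw_blocks, modify_correspondence,
                                  measure_compensation, accommodate,
                                  low_threshold=3.0):
    """Sketch of steps 12-16: modify the input/behavior correspondence,
    measure the user's compensation, grade attentiveness and accommodate.
    All callables are hypothetical hooks, not the disclosed apparatus."""
    for block in raw_blocks:                                  # consecutive blocks of user inputs
        modified = [modify_correspondence(s) for s in block]  # step 12: imperceptible modification
        error = measure_compensation(block, modified)         # step 14: compensation quality/speed, assumed in [0, 1]
        attentiveness = max(0.0, min(10.0, 10.0 * (1.0 - error)))  # assumed 0-10 grading
        if attentiveness < low_threshold:
            accommodate(attentiveness)                        # step 16: adapt subsequent operation
        yield attentiveness
```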
A system for Real-Time Evaluation of User Attentiveness on Interactive applications according to an embodiment of the present invention is described herein, including examples of suitable architecture, functionality and external interfaces, with reference to Figs. 1 - 2B.
An example MSDG routine, suitable for implementing the MSDG routine 40 of Fig. 2A, is next described, including examples of methods for Movement monitoring and dynamic data recording, examples of methods for Sensorimotor redirecting, example Data transfer methods and an example graphical user interface.
A diagram of example functionalities of the MSDG routine 40 is depicted in Fig. 12.
An example Main Control unit/program, suitable for implementing the Main Control unit/program 60 of Fig. 2A, is next described, including examples of suitable operation control according to certain embodiments of the present invention, Timing control, Raw data acquisition timing, Raw data partitioning timing, MAP timing control (with reference to Fig. 3), Data transfer control, Sensorimotor redirecting instructions, and methods for Periodic restoring of user dynamic behavior characteristic parameters.
The MAP units of Figs. 2A - 2B are typically operative to perform the method of Fig. 4A, e.g. as described in detail herein with reference to Fig. 4A generally and with reference to Figs. 4B - 4G by way of example. An example MAP unit 70 of Fig. 2A, also suitable for implementing the MAP unit of Fig. 2B, is described hereinbelow, including inputs and outputs thereof, example methods for Motor action filtering (step 410 of Fig. 4A), Motor actions sorting (step 420) including Time Length sorting and Space distance sorting, User motor error computation (step 430), and methods for computing Adaptation (step 440) according to an Averaging method or Exponential regression method, both as described herein. Attentiveness evaluation methods suitable for implementing step 450 of Fig. 4A, and, with reference to Fig. 6, User-related parameters computation suitable for implementing step 460 of Fig. 4A, are next described.
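The actual computations of steps 410 - 460 are those of Figs. 4B - 4G; merely to fix ideas, one simple user-motor-error (UME) measure in the spirit of Figs. 4D - 4E might compare the early direction of a recorded cursor path with its overall start-to-end direction. The sketch below is an illustrative assumption, not the disclosed computation; the function name and the choice of angular deviation are hypothetical.

```python
import math

def user_motor_error(path, dt_index):
    """Illustrative UME measure: the angle (in degrees) between the cursor's
    early direction, dt_index samples after motion start, and the overall
    start-to-end direction of the motor action.  Sketch only."""
    x0, y0 = path[0]
    xe, ye = path[-1]
    xd, yd = path[min(dt_index, len(path) - 1)]
    theta_overall = math.atan2(ye - y0, xe - x0)   # start-to-end direction
    theta_early = math.atan2(yd - y0, xd - x0)     # direction after the early interval
    # wrap the difference into (-pi, pi] and report its magnitude in degrees
    diff = (theta_early - theta_overall + math.pi) % (2 * math.pi) - math.pi
    return abs(math.degrees(diff))
```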
An example Internal/Virtual Memory suitable for implementing memory 80 of Fig. 2A is next described with reference to Fig. 6, including Raw data stream read/write control methods therefor.
The following acronyms are used herein:
AAMS Adaptation analysis method selection
AEMS Attentiveness evaluation method selection
CPU Central processing unit
CSC Combined sorting coefficient
DMMID Dynamic MMI device (e.g., pc mouse, joystick, etc.)
GUI Graphical user interface
HW Hardware
IP Intellectual property
MAP Motion analysis process
MMI Man to machine interface
MSDG Motor actions monitoring, sensorimotor redirecting, data transfer & graphical user interface
SDW Spatial distance weight
SW Software
TLW Time length weight
UAE User attentiveness evaluation
UME User motor error
UMSE User motor action start/stop event
Certain embodiments of the present invention seek to provide a tool for typically concealed and generic real-time evaluation of user attentiveness on interactive computerized applications such as the tools shown and described herein. The functional components, data structures and methods described herein for evaluating user attentiveness based on interaction with the MMI of a computer are merely exemplary and are not intended to be limiting. The present invention may alternatively be implemented using other components, data structures and methods for presenting stimuli to the user through the MMI and for measuring the user's response.
The system shown and described herein typically evaluates the user level of attentiveness referring to the host current active application, while the user is unaware of its operation; and acquires and analyzes in real-time dynamic data recorded through any DMMID and through the sensory element controlled by the DMMID (e.g. the PC mouse and the on screen cursor), operated by the human user during his/her interaction with the current active application. The system shown and described herein may function as a significant diagnostic tool within various types of hosts running different types of applications, such as motor simulators, courseware applications or computer games, in order to enhance their knowledge of the human user's characteristics in real-time.
Certain embodiments of the invention generate significant improvements for a wide variety of applications e.g. some or all of those set out in the table of Fig. 7.
The system shown and described herein typically examines user attentiveness referring to the current active host application, based on the data of the dynamic interaction between the user and the application, acquired through any DMMID (e.g. PC mouse, joystick, paddle or wheel) used to interact with the application.
The system shown and described herein typically utilizes relationships between user attentiveness and the process of motor adaptation. It may function constantly or for a predefined time period, and the user is unaware of its operation.
The system shown and described herein typically handles its interaction with the active host application typically through one or both of the following routes:
a. Real-time acquisition of dynamic data received from any of the DMMIDs and from the sensory elements/actions (e.g. computer mouse and the on-screen cursor) controlled by those devices. PC mouse, joystick, steering wheel and pedal are all examples of input devices that may be placed under the category of dynamic MMI devices (i.e., DMMID). On the other hand, the on-screen cursor, Pac-Man movements and speaker volume are all examples of sensory elements/actions (e.g., visualized on a display screen, or sounded by a speaker) that are usually controlled by the behavior of a DMMID according to human user actions.
b. Real-time redirecting the behavior of the sensory elements/actions controlled by the DMMID. Typically, redirecting includes quantifying the human user attentiveness related to the interactive application as a function of his/her control over the input device in response to the subliminal perturbations activated.
Through the host operating system, the system shown and described herein typically implements subliminal perturbations (e.g. some or all of the following: undersized angular perturbations of the cursor motion, below the conscious sensory threshold of the human user) upon the host screen cursor or any other sensory element/action controlled by a DMMID, and analyses the user motor adaptation characteristics which lead to the evaluation of the user attentiveness in real-time. The user may be unaware of the perturbations effects, nevertheless, the common user adapts his/her behavior accordingly, unconsciously. As the user attention to the host current running application decreases, the quality of the user feed-forward motor adaptation decreases.
The perturbation type and parameters are typically selected to fit the type of properties controlled by the input device, and the characteristics of the input device's control-function over the above properties.
For example, a suitable perturbation of a cursor is an angular perturbation. If the 2D cursor was originally set to move R pixels in a direction defined by the angle alpha (or, based on Cartesian coordinates, x pixels right and y pixels up), then a suitable perturbation causes the cursor to move R pixels in a direction defined by alpha+beta. In this case, the perturbation angle beta may be small enough to prevent user awareness of the existence of the perturbation, e.g. 10 degrees or less.
Alternatively, if for example the input device controls the brightness of a colored element or controls the frequency of a sound which the user wishes to adjust, then the perturbation can be expressed by slightly increasing/decreasing the strength of the input device's effect on the above properties. In this case, the maximum strength of the perturbation may be predefined according to the characteristics of the input device's control-function over the above properties.
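A minimal sketch of the angular perturbation in the cursor example above is given below, assuming the perturbation is applied to each incremental cursor displacement; the function name and the clamping to 10 degrees are illustrative assumptions.

```python
import math

MAX_PERTURBATION_DEG = 10.0  # illustrative upper bound, kept small to stay below awareness

def perturb_cursor_delta(dx, dy, beta_deg):
    """Rotate a raw cursor displacement (dx, dy) by a small angle beta.

    The displacement magnitude R is preserved; only its direction alpha
    is shifted to alpha + beta, as in the angular-perturbation example."""
    beta_deg = max(-MAX_PERTURBATION_DEG, min(MAX_PERTURBATION_DEG, beta_deg))
    beta = math.radians(beta_deg)
    r = math.hypot(dx, dy)
    alpha = math.atan2(dy, dx)
    return r * math.cos(alpha + beta), r * math.sin(alpha + beta)

# Example: a movement of 100 pixels to the right, perturbed by +5 degrees,
# still moves 100 pixels but slightly off the intended direction.
print(perturb_cursor_delta(100.0, 0.0, 5.0))
```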
The system shown and described herein typically produces a real-time evaluation of user attentiveness as a discrete time based function which ranges from a low to high endpoints, e.g. 0-10, which may correlate with low-high attentiveness, respectively.
The system shown and described herein typically applies a graphical user interface (i.e., GUI) to give the host administrator basic control over its functionality.
Example block diagrams and external interfaces are illustrated in Figs. 2A - 2B.
As shown in Fig. 2A, the system shown and described herein may, in certain applications, comprise a hardware device (formed with FPGA/DSP or ASIC) and associated software, particularly when host CPU utilization may impede functionality of the host application. The hardware device may perform some or all of the motor analysis steps of the MAP unit as described herein in detail with reference to Fig. 4A, and hold historical data independently, thus minimizing the additional load on the host CPU.
The present invention may include some or all of the following subsystems, e.g. as described in detail below:
a. MSDG (Motor actions monitoring, Sensorimotor redirecting, Data transfer and Graphical user interface) routine 40 of Fig. 2A
b. Main Control unit/program 60 of Fig. 2A
c. One or more MAP (i.e., Motor Analysis Function) units 70 of Fig. 2A, e.g. as shown in Fig. 2B, one for each dynamic data source analysis
d. Internal/Virtual memory unit 80 of Fig. 2A.
Another subsystem unit, the Communication Controller, may comprise a standard USB communication controller or any other type of communication (e.g., parallel/serial port, Blue-Tooth etc.) controller for communication between the Main Control unit and the MSDG unit over the communication Channel.
The MSDG routine 40 may be designated to perform some or all of the following processes, e.g. as shown in Fig. 12:
a. Monitoring and recording (functionality 1210 in Fig. 12) of any motor data received from DMMIDs e.g. as shown in Fig. 2A, and from the sensory elements/actions controlled by those devices, through the host operating system 30, also as shown in Fig. 2 A.
b. Slightly perturbing normal behavior of some/all of the sensory elements/actions controlled by the DMMIDs (functionality 1220 in Fig. 12). This operation is also termed herein 'sensorimotor redirecting'.
c. Data transfer 1230 towards and from the Main Control unit/program 60.
d. Applying a graphical user interface 1240 for the administrator, e.g. as described hereinbelow. Certain embodiments of the above processes are now described in detail, according to certain embodiments of the present invention.
Methods for monitoring and recording dynamic motor data according to certain embodiments of the present invention, are now described. The term "main input data" is used herein to include some or all of dynamic data received from the host DMMIDs controlled by the user motor movements/forces, and/or dynamic data which defines the behavior of the elements/actions (e.g., the on screen cursor) controlled by those devices. This typically comprises real time data recorded by the MSDG routine as a time based function for each of the dynamic resources, separately.
Generally, the dynamic data refers to a motion in space, by handling the relative changes in the spatial coordinates, the force vector or the velocity vector.
For example, tracking the screen cursor movements may be obtained by sampling and recording the cursor 2D spatial coordinates versus time: t, x(t) & y(t) .
The sampling rate of the dynamic data may vary, e.g. based on a nominal value of 10 mSec, i.e. 100 Hz.
The recorded data may be stored in the Memory unit 80 of Fig. 2A.
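For illustration only, the monitoring/recording functionality 1210 might be sketched as a simple polling loop at the nominal 100 Hz rate; the `get_cursor_xy` callable is a hypothetical stand-in for the host operating system's cursor query, and the function name is an assumption.

```python
import time

def record_cursor_block(get_cursor_xy, duration_s=1.0, rate_hz=100):
    """Record (t, x(t), y(t)) samples of the on-screen cursor at an approximate
    nominal rate, as a sketch of the monitoring/recording functionality.
    `get_cursor_xy` is a hypothetical hook returning the current coordinates."""
    period = 1.0 / rate_hz
    t0 = time.monotonic()
    block = []
    while (t := time.monotonic() - t0) < duration_s:
        x, y = get_cursor_xy()
        block.append((t, x, y))   # time-based 2D coordinates of the cursor
        time.sleep(period)
    return block
```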
Sensorimotor redirecting methods are now described. Normally the host DMMIDs control the behavior of the elements (e.g. screen cursor) affected by the user's dynamic actions (e.g. movements, force) in one, two or three dimensional space.
When the system shown and described herein is activated, it takes partial control over those elements in order to insert subliminal perturbations into the typical behavior of these elements. Based on instructions established by the Main Control unit, the MSDG routine applies subliminal perturbations (e.g., spatial angle shift perturbations) to the normal behavior of the elements/actions (e.g., screen cursor) described above. These perturbations may, according to certain embodiments, compel the user to perform online corrections to his planned actions through the DMMID, in order to obtain the original targets the user wishes to obtain by his/her original actions using the input device (e.g. in order to move the on screen cursor onto a specific icon and double click). For example, a user's 'original target' may be to move the screen cursor to a specific point on the screen in order to close a window session; or to turn down the volume of a loudspeaker by turning a knob or a steering wheel to a desired volume. The rate of perturbation insertion (e.g. the rate of inserting perturbation into each sampling of the dynamic data which defines the behavior of the element/action controlled by the host's DMMID) is typically selected to be similar to the rate of the dynamic data sampling (e.g., 100 Hz) in order to achieve optimal synchronization between the data recording module 1210 and the sensorimotor redirecting module 1220, but the rate is typically not so low as to cause user awareness of the perturbations.
The user's motor adaptation process following the perturbations may be examined constantly by the MAP unit 70 of Fig. 2B.
Example Data transfer methods are now described. Several types of data are transferred through a bidirectional data channel between the MSDG routine 40 and the Main Control unit/program 60. The data derived from the MSDG unit 40 may include some or all of:
a. Raw data recordings of the dynamic real-time data (e.g., time based 2D coordinates) received from the host DMMIDs (e.g., the pc mouse or joystick movements).
b. Raw data recordings of the dynamic real-time data (e.g., time based 2D coordinates) received from the host controllers of the elements controlled by the host DMMIDs.
c. Setup parameters values initially defined through the administrator GUI e.g. as described herein.
Conversely, the MSDG routine 40 receives some or all of the following data: a. Description of the dynamic data to be transferred.
b. Management instructions of the sensorimotor redirecting task.
c. The attentiveness evaluation results and, optionally, additional control data.
An example of a Graphical User Interface is now described in detail. The GUI typically affords an administrator before-startup control over some or all of the following operational setup parameters:
a. Active time period (start & stop operation time).
b. Expected user/s identity/ies. c. Indication of the active DMMIDs and the elements/actions controlled by those devices, to be monitored for applying the attentiveness analysis, and their expected functional characteristics such as but not limited to some or all of the following:
a. F_ma_typ, typical rate of the dynamic motor actions (e.g., PC mouse movements) of a likely user.
b. T_ma_typ, time duration of a typical dynamic motor action of a likely user. c. T_la_pre_ma_min, minimum time gap of low activity after the ending of one motor action, before the beginning of the next motor action.
d. T_gui_ma_pre_vp_max, maximum time period between motor action start and the time it reaches its peak velocity/force.
e. UMSE, i.e. user motor action start/stop event characteristics (e.g. mouse left button click).
f. C_avg, threshold level for filtering the user motor actions, according to their velocity/force function correlation result with a template time-based velocity/force function of the specific user.
d. The desired sampling rates of the dynamic data acquired from each source, predefined by default to a value of 100 Hz.
e. Memory space definitions: size & location.
f. read-only memory-space address holding the real time attentiveness evaluation results.
g. characteristics of the sensorimotor redirecting perturbation, of each element controlled by any DMMID, e.g. some or all of the following:
a. The type of perturbation activation paradigm, e.g., cyclic on/off activation with or without alternating the perturbation course as related to the original course of the element controlled by the DMMID.
b. The rate of the perturbation activation, in the sense of the number of UMSEs e.g. as defined herein. The duty cycle of the activation rate may be specified by the number of UMSE pair (i.e. start & stop event) occurrences throughout the perturbation active period, N_pairs_pert_on, and throughout the perturbation inactive period, N_pairs_pert_off. Both numbers may be no less than 4.
c. The time-space relation between the perturbation and the action to be effected, e.g., simple geometrical phase-shift of the original space path of the element (e.g., screen cursor) whose motion may be controlled by the DMMID.
d. Maximum perturbation intensity, e.g. 10° phase shift in the case of the perturbation example described above.
e. R_url, the ratio between the rate of the perturbation activation, e.g. as defined herein, and the rate of the periodic storage of the user-related parameter tables, referring to each dynamic data source. The default value is R_url = 4, but it can be increased/decreased over a range of 1 to 20.
f. Desired number of levels of the attentiveness evaluation.
h. AAMS, adaptation analysis method selection, e.g. averaging or exponential regression as described in detail herein. AAMS may be set to '0' or '1' respectively. i. AEMS, attentiveness evaluation method selection, e.g. averaging or exponential regression (as explained herein). AEMS may be set to '0' or '1' respectively.
j. Attentiveness evaluation equation constants:
S1, adaptation factor, a Boolean value reset to '0' by default.
K1, attentiveness grading factor, set to a value of 8 by default, or reassignable to a different value in the range 5 < K1 < 9.
ai & bi, attentiveness evaluation factors (i = 1 ... 4) as defined herein, which can each be set to a value between 0 and 1, and may fit a1 + a2 + a3 + a4 = 1 and b1 + b2 + b3 + b4 = 1. By default, a1 = 1, a2 = a3 = a4 = 0 and b1 = b2 = b3 = b4 = 0.
k. ΔT, time duration between motor action start and the time of measuring the user motor error, which may be mostly caused by the sensorimotor perturbation manipulation. The ΔT value can be set only within the range of 40 mSec - 200 mSec.
If required, during real-time operation, the GUI may display a small status window on the host screen, indicating the attentiveness evaluation results as an up-to-date graph of the UAE versus time, or as a numeric value. The GUI may assert warnings whenever the ratio between the actual and the expected rate of the motor actions is too high or too low, e.g. F_ma / F_ma_typ > 2 or F_ma / F_ma_typ < 0.5, respectively. These warnings may suggest that some of the parameters described herein as being registered through the administrator GUI are inaccurate and may be updated.
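As an illustration of how the operational setup parameters listed above might be held in memory, the following sketch gathers a subset of them into one record; the field names and types are assumptions, and any default shown that is not stated above (e.g. the number of attentiveness levels and the ΔT value) is likewise an assumption.

```python
from dataclasses import dataclass

@dataclass
class SetupParameters:
    """Illustrative subset of the administrator-GUI operational setup parameters."""
    sampling_rate_hz: int = 100          # per-source dynamic-data sampling rate (default 100 Hz)
    max_perturbation_deg: float = 10.0   # maximum perturbation intensity (e.g. 10 degree phase shift)
    n_pairs_pert_on: int = 4             # UMSE pairs per perturbation-active period (no less than 4)
    n_pairs_pert_off: int = 4            # UMSE pairs per perturbation-inactive period (no less than 4)
    r_url: int = 4                       # perturbation rate / parameter-storage rate ratio, range 1 to 20
    attentiveness_levels: int = 10       # desired number of evaluation levels (assumed value)
    aams: int = 0                        # adaptation analysis method: 0 = averaging, 1 = exponential regression
    aems: int = 0                        # attentiveness evaluation method: 0 = averaging, 1 = exponential regression
    s1: int = 0                          # adaptation factor, Boolean, reset to 0 by default
    k1: float = 8.0                      # attentiveness grading factor, 5 < K1 < 9
    a: tuple = (1.0, 0.0, 0.0, 0.0)      # attentiveness evaluation factors a1..a4
    b: tuple = (0.0, 0.0, 0.0, 0.0)      # attentiveness evaluation factors b1..b4
    delta_t_ms: int = 100                # UME measurement delay, settable only within 40-200 mSec (assumed default)
```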
An example Main Control Unit 60 is now described in detail, according to certain embodiments of the present invention. This unit typically controls operations and coordinates the functionality of internal units. The Main Control unit 60, which may be based on a micro-processor, and/or ASIC/FPGA/DSP, typically performs some or all of the following tasks, including interfacing as appropriate e.g. with some or all of the MSDG routine, host operating system, MAP units and/or memory unit: a. Controlling real-time operation in accordance with the administrator GUI input data.
b. Timing control over some or all of the other units/routines.
c. Controlling some or all data transfers.
d. Establishing the sensorimotor redirecting instructions.
e. Periodic storing of updated parameters of the user characteristic behavior.
Operation Control: The Main Control unit may control operation based on the operational attribute values entered through the administrator GUI. The Main Control unit controls the system shown and described herein, e.g. as to one or more of: activation and deactivation, the list of the DMMIDs (i.e., the dynamic data sources) to be analyzed and the list of host elements whose behavior is controlled by these devices.
Timing Control: As part of controlling operations, the Main Control unit may coordinate the timing of some or all of the other units. It manages the timing of initiation and activation of the units and the timing of the data transfers. Fig. 3 displays the timeline of certain tasks according to certain embodiments of the present invention, coordinated by the Main Control unit, referring to a condition of acquiring and analyzing one dynamic data source. Each additional data source requires the control of additional but identical tasks timeline.
Raw data acquisition timing: Upon activation of the apparatus shown and described herein, dynamic data may be acquired constantly by the MSDG unit from the sources chosen by the administrator through the MSDG GUI. The same list of chosen sources may be handled and controlled by the Main Control unit 60 of Fig. 2A. The dynamic data may be acquired at a sampling rate predefined separately for each data source through the MSDG GUI. The raw data acquisition of each data source typically includes both the dynamic data acquired through the host's DMMID, and the dynamic data acquired through the host's controller of the element/action controlled by the same DMMID; both types of data may characterize the behavior of both the host's DMMID and the element/action controlled by it.
Raw data partitioning timing: The acquired raw data of each data source may be stored in a suitable Memory unit e.g. as shown in Figs. 2A - 2B, and may be divided into consecutive blocks of varied size, in a typical range of 10k - 100 kByte. The timing which defines the end of each block and the start of the following one may be managed by the Main Control unit. Each block may contain the same number of UMSEs, e.g. as defined herein, which equals the rate of the sensorimotor perturbation as defined herein.
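Merely to illustrate the partitioning rule above (a block is closed once it contains a fixed number of UMSEs), a sketch follows; the `is_umse` predicate is a hypothetical hook, e.g. a mouse-button-click detector, and the function name is an assumption.

```python
def partition_by_umse(samples, is_umse, umses_per_block):
    """Split a raw-data stream into consecutive blocks, closing a block once it
    contains `umses_per_block` user motor start/stop events (UMSEs)."""
    block, count, blocks = [], 0, []
    for sample in samples:
        block.append(sample)
        if is_umse(sample):
            count += 1
            if count >= umses_per_block:
                blocks.append(block)   # block complete: same UMSE count as its peers
                block, count = [], 0
    if block:
        blocks.append(block)           # trailing, possibly incomplete block
    return blocks
```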
MAP timing control: Following the storage of each raw data block, it may be read and analyzed by the MAP unit while simultaneously the subsequent data block may be acquired and stored in a Memory unit. Ahead of the attentiveness analysis, the MAP produces out of each raw data block a first stage analysis data block e.g. by performing steps 430, 440 and 450 of Fig. 4A, described in detail herein. The first stage analysis data block may be written into the Memory unit as a condensed replacement for the original raw data block.
In addition, the MAP computes the user-related parameters periodically at a rate defined by Rurl. In the example illustrated in Fig. 3, which is an example of a simplified timeline of tasks according to certain embodiments of the present invention, Rurl = 2; therefore the user-related parameters are computed for every two consecutive acquired data blocks.
Data Transfer Control: The Main Control unit controls various types of data transfers, such as but not limited to some or all of the following:
a. Storing some or all of the dynamic raw data monitored by the MSDG routine, into the Memory unit as consecutive data blocks with varied size.
b. Transferring some or all of the data registered through the administrator GUI, to the Main Control unit and to the Memory unit.
c. Transferring blocks of raw data from the Memory unit to the MAP, for sequential analysis.
d. Exchanging each stored raw data block with a first stage analysis data block of the same data, in order to reduce the accumulative size of the total data stored inside the Memory unit.
e. Storing the attentiveness evaluation results of each data block from the MAP into the Memory, and passing these results to the MSDG routine for graphic/numeric real-time display.
f. Updating parameter values of the user characteristic behavior stored in the Memory unit.
Sensorimotor Redirecting Instructions: The Main Control unit manages the sensorimotor redirecting instructions, applied by the MSDG routine for each of the
DMMIDs to be analyzed. These instructions, tailored differently for each affected element, typically include some or all of the following guidelines for the sensorimotor redirection functionality:
a. The targeted elements/actions to be affected (e.g., the on-screen cursor), which are the sensory elements controlled by the DMMIDs specified through the administrator GUI as described herein.
b. Timing of activation and deactivation, e.g. the rate of the perturbation activation as described herein.
During real-time operation, if one of the user-related parameters Nto_ma_pert_on or Nto_ma_pert_off is found lower than 4, then Npairs_pert_on or Npairs_pert_off, respectively, is increased by one.
c. Intensity. The maximal intensity level is normally defined so as not to arouse user awareness of the sensorimotor redirecting perturbation.
d. Time-space relation between the redirecting perturbation and the element/action to be affected.
e. The contents of the instructions are defined based on predefined variables, updated setup data received through the administrator GUI before startup, and real-time analysis of acquired data received from the MAP units during operation.
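As a non-authoritative illustration of how the per-element redirecting instructions listed above (guidelines a-d, with contents per item e) might be gathered into a single record, the following Python sketch uses a simple dataclass; all field names and example values are assumptions made for the illustration.

# Minimal sketch (not the patented implementation): one possible container for
# the per-element sensorimotor redirecting instructions listed above.
from dataclasses import dataclass

@dataclass
class RedirectInstruction:
    target_element: str          # e.g. "on_screen_cursor" (guideline a)
    activation_rate_hz: float    # perturbation on/off timing (guideline b)
    max_intensity: float         # kept below the awareness threshold (guideline c)
    time_space_relation: str     # e.g. "angular_offset" (guideline d)

instr = RedirectInstruction("on_screen_cursor", 0.5, 0.03, "angular_offset")
print(instr)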
Periodic Restoring of User Dynamic Behavior Characteristic Parameters is now described in detail, according to certain embodiments of the present invention. Each user has different characteristics of dynamic behavior (e.g. hand movements) while controlling the DMMIDs of the host computer with or without sensorimotor redirecting effects. These characteristic parameters may assist in more accurate analysis of the user's attentiveness, by identifying the individual ranges of the user's dynamic behavior.
All the user-related parameters of Figs. 9A and 9B may be computed by the MAP units, generating structured tables e.g. as illustrated in Figs. 10A, 10B, 11A and 11B, for each of the dynamic sources analyzed. The tables are stored in the Memory unit and updated periodically based on the value of Rurl, as defined through the administrator GUI of the MSDG routine 40 of Fig. 2A, under the control of the Main Control unit.
The MAP units execute the main motor analysis generating real-time evaluation of the user attentiveness and other user related parameters which are utilized to achieve better fine tuning of the attentiveness estimation.
If two or more dynamic data sources are employed for simultaneous analysis, then one MAP unit may be assigned for each data source analysis, and the final attentiveness evaluation may be determined as the weighted averaging of each MAP result.
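A minimal sketch of the weighted averaging just described follows, assuming per-source weights supplied by the administrator; the function name and the example weights are illustrative assumptions only.

# Illustrative sketch: combine per-source MAP attentiveness results into a final
# evaluation by weighted averaging. The weights are administrator-defined values
# assumed here for the example.
def combine_map_results(evaluations, weights):
    assert len(evaluations) == len(weights) and sum(weights) > 0
    return sum(e * w for e, w in zip(evaluations, weights)) / sum(weights)

# Two dynamic data sources, e.g. mouse and joystick, weighted 2:1:
print(combine_map_results([7.0, 4.0], [2.0, 1.0]))   # -> 6.0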
Fig. 4A is a simplified flowchart illustration of an MAP method provided in accordance with certain embodiments of the present invention. The method typically includes some or all of the steps shown, suitably ordered e.g. as shown. According to certain embodiments, the first stage block analysis and storage of Fig. 3 corresponds to steps 430, 440 and 450 in Fig. 4A, and the data generated by these steps may be stored in the Memory unit of Figs. 2A - 2B under the control of the Main Control unit of Figs. 2A - 2B. The user-related parameters analysis of Fig. 3 corresponds to step 460 in Fig. 4A. The attentiveness analysis of Fig. 3 corresponds to step 450 in Fig. 4A.
The MAP typically receives two inputs from the memory unit, and produces three outputs as described below:
MAP inputs may include some or all of:
a. A block of raw data, received immediately after its acquisition and storage are completed, including the times of the UMSEs relevant to the data source; and/or
b. Predefined attributes and constants, and the parameters of the sensorimotor perturbation applied, as defined through the administrator GUI.
MAP outputs may include some or all of:
a. Attentiveness evaluation result, referring to the processed data block.
b. First stage analysis data block, to be stored as a condensed replacement for the original raw data block. It comprises the UMEs, the adaptation computation results and the attentiveness evaluation results of each identified group of filtered motor actions of the analyzed raw data block.
c. Updated user-related parameters.
Motor Actions Filtering step 410 is now described in detail, according to certain embodiments of the present invention, e.g. in accordance with the flowchart illustration of Fig. 4B. All the dynamic data of one raw data block monitored by the MSDG (e.g. raw data comprising the PC mouse behavior and the screen cursor movements) may be filtered in order to screen out non-target-oriented motor actions. This process is typically based on some or all of:
a. Predefined attributes of the UMSE definition (e.g. mouse click), as received through the GUI of the MSDG unit 510 in Fig. 4B - required to define the start/stop time of each motor action, Tstart(i) and Tstop(i).
b. Predefined characteristic time constants.
c. User related time constants.
d. Correlation between the examined motor action velocity and a template time-based velocity function of the specific user under the same condition, with or without the presence of the sensorimotor perturbation, e.g. Vma_pert_on(t) or Vma_pert_off(t) respectively.
The input to units 520, 530 and 550 typically comprises a single data block which may include:
a. INi - Block of data acquired through the Host's DMMID, e.g., [x(t),y(t)] 2D coordinates of the mouse movements, and the times of the mouse buttons push/release events; and
b. IN2 - Block of data acquired through the host's controller of the element/action controlled by the same DMMID (e.g., [x(t),y(t)] 2D coordinates and the velocity v(t) of the on-screen cursor movements).
Unit 530 is typically operative to find the times of UMSEs as received through IN2: Tstart(i) and Tstop(i) (e.g., when the cursor velocity exceeds Vta_pre_ma_max after being slower for at least a Tta_pre_ma_min time period).
Unit 520 is typically operative to find the times of UMSEs as received through IN1: Tstart(i) and Tstop(i) (e.g., the times of mouse right/left button clicks).
In order to filter only the valid goal-oriented motor actions, unit 570 typically verifies that each possible motor action k follows a set of suitable rules such as but not limited to some or all of the following:
i. Following the motor action's start event, the motor velocity/force exceeds the Vta_pre_ma_max value, or more generally, the intensity/power of the motor action exceeds a predefined value suggesting it is not merely 'noise'.
ii. The time gap following the ending of the k-1 motor action prior to the beginning of the k motor action, Tstart(k)-Tstop(k-1), is longer than the period Tta_pre_ma_min, during which the motor velocity/force does not exceed the Vta_pre_ma_max value. More generally, a considerable time gap exists between each two consecutive motor actions.
iii. The total time period of the k motor action, Tstop(k)-Tstart(k), is shorter than Tma_max_lim but longer than Tma_min_lim, or more generally, each motor action duration does not deviate from a predefined expected time range for the motor action in question.
iv. The time period between the start of the k motor action and the time it reaches its peak velocity/force is shorter than T_pre_vp_max.
v. The mathematical correlation between the motor velocity/force during the [Tstart(k), Tstop(k)] period, and Vma_pert_off(t) or Vma_pert_on(t) depending on the current perturbation status (off or on, respectively) e.g. as received through MSDG unit 540 in Fig. 4B, is higher than Cavg/2.
Rules iv and v may be provided in combination or in isolation and may be replaced or augmented by any rule which ensures that the nature/form of the motor-action intensity/power function over time is similar to the form of an exemplary intensity/power function of a target oriented motor action of the individual user.
Typically, rules ii - v may be replaced by any suitable method or set of rules useful in identifying a specific pattern within continuously sampled raw data, where the 'pattern' is the target-oriented motor action.
Typically, user-related parameters e.g. as received through Memory unit 560 in Fig. 4B are used to identify the motor-action pattern of the individual user by adapting the filtering rules according to the behavior of the individual user.
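The following Python sketch is a hedged illustration of filtering rules i-v above, assuming uniformly sampled speed values for each candidate motor action; all threshold values and the template-resampling step are example assumptions, not values taken from the specification.

# Hedged sketch of filtering rules i-v; thresholds and the velocity
# representation (uniformly sampled speed values) are assumptions for the example.
import numpy as np

def is_target_oriented(v, t_start, t_stop, prev_t_stop, template,
                       v_pre_max=0.2, t_pre_min=0.3,
                       t_min_lim=0.2, t_max_lim=3.0, t_pre_vp_max=1.0, c_avg=0.8):
    duration = t_stop - t_start
    gap = t_start - prev_t_stop
    peak_delay = np.argmax(v) * duration / max(len(v) - 1, 1)
    # rule i: the action is not mere noise
    if v.max() <= v_pre_max:
        return False
    # rule ii: a considerable quiet gap precedes the action
    if gap <= t_pre_min:
        return False
    # rule iii: duration inside the expected range
    if not (t_min_lim < duration < t_max_lim):
        return False
    # rule iv: the velocity peak is reached early enough
    if peak_delay >= t_pre_vp_max:
        return False
    # rule v: correlation with the user's template velocity profile
    tpl = np.interp(np.linspace(0, 1, len(v)), np.linspace(0, 1, len(template)), template)
    if np.corrcoef(v, tpl)[0, 1] <= c_avg / 2:
        return False
    return True

v = np.sin(np.linspace(0, np.pi, 50))            # bell-shaped candidate movement
print(is_target_oriented(v, 1.0, 1.8, 0.2, v))   # -> True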
Fig. 13 is a graph of an example motor-action of a screen cursor movement.
Generally, motor action graphs depend both on the specifics of the motor action and on the user's individual behavior. The start time and the stop time of the motor action are identified based on the UMSE definitions described herein. Referring again to Fig. 4B, the output of step 570 may comprise the dynamic data per target-oriented motor action m, during the period Tstart(m) < t < Tstop(m), e.g., the [xm(t), ym(t)] 2D coordinates and the velocity vm(t) of the m-th target-oriented cursor movement.
The computation of the 'Combined Sorting Coefficient' step 420 is now described in detail e.g. in accordance with the flowchart illustration in Fig. 4C. Some or all of the filtered motor actions of the same processed data block may be sorted by investigating their total time length, and by their spatial distance, as described below in detail, according to certain embodiments of the present invention.
Time Length Sorting: The optimal time length, Tma_opt, of each motor action may be based on a predefined time constant defined through the administrator GUI (e.g., Tma_typ, as described herein) and on the user-related updated variable Tma_avg, e.g. Tma_opt = (Tma_typ + Tma_avg)/2, as per step 630 of Fig. 4C. Too fast or too slow motor actions are treated with lower weights (within the range of 0 to 1) through the further analysis process, thus minimizing their outcome. The TLW (i.e., time length weight) of each motor action whose time length is Tma is computed e.g. in step 640 of Fig. 4C, per the equation of that step (the equation image, imgf000030_0001, is not reproduced in this text).
Spatial Distance Sorting (step 650 in Fig. 4C): The spatial distance of a motor action may be defined as the shortest motor distance for arriving from the original starting point A to the original end point B, i.e. min(∫ V dt) or min(∫ F dt) taken over the interval from tA to tB, where V stands for the velocity vector of the element controlled by the DMMID and F stands for the force operated on the element controlled by the DMMID. For example, referring to the on-screen cursor motor movements, the spatial distance of the on-screen movement may be the direct 2D distance (in cm or pixels) between the movement starting point and the movement end point.
Motor actions which have longer spatial distances are treated with higher weights (within the range of 0 to 1) through the further analysis process, thus maximizing their outcome. If Lmax is the maximal spatial distance of all the motor actions of the currently analyzed data block, then the SDW (i.e., spatial distance weight) of each motor action whose spatial distance is Lma is computed as: SDW = Lma / Lmax.
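As a hedged illustration of the sorting weights, the sketch below computes SDW exactly as defined above (Lma / Lmax) and combines it with a time length weight into the combined sorting coefficient of step 660 described next; since the exact TLW equation is given only in the referenced figure, the decaying weight used here is a placeholder assumption.

# Illustrative sketch of the sorting weights. The SDW and CSC expressions follow
# the text (SDW = Lma / Lmax, CSC = TLW * SDW); the exact TLW formula is given in
# the referenced figure, so the decaying weight used here is only a placeholder
# assumption that penalizes durations far from Tma_opt.
def time_length_weight(t_ma, t_ma_opt):
    return 1.0 / (1.0 + abs(t_ma - t_ma_opt) / t_ma_opt)   # placeholder, in (0, 1]

def spatial_distance_weight(l_ma, l_max):
    return l_ma / l_max                                     # SDW as defined above

def combined_sorting_coefficient(t_ma, t_ma_opt, l_ma, l_max):
    return time_length_weight(t_ma, t_ma_opt) * spatial_distance_weight(l_ma, l_max)

print(round(combined_sorting_coefficient(t_ma=0.9, t_ma_opt=0.6, l_ma=300, l_max=400), 3))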
The 'combined sorting coefficient' of each motor action may be computed as in step 660 of Fig. 4C: CSC = TLW · SDW.
User Motor Error Computation step 430 is now described in detail, according to certain embodiments of the present invention, e.g. in accordance with the flowchart illustration of Fig. 4D (steps 710 and 730). Each motor action may be analyzed in order to compute its related UME, i.e. user motor error. The UME indicates the size of the user error in directing the motion of the element/action controlled by the DMMID effectively towards the intended target, e.g., moving the on-screen cursor to point over a specific icon on the PC monitor. Therefore, the time-space paths of all the filtered motor actions may be identified, and mainly the spatial coordinates of the starting and ending points.
TANma of each motor action path may typically be defined and computed as the tangent of the spatial angle θ between two straight lines, Line1 and Line2, or may be computed by any other suitable function which identifies the deviation between the following Line1 and Line2.
Line1: The imaginary spatial distance line connecting the starting point A to the ending point B, i.e. the line whose length equals min(∫ V dt) or min(∫ F dt) taken from t_start to t_end.
Line2: The imaginary spatial distance line connecting the starting point A to the motor action path position at a ΔT period following the timing of the starting point A, e.g. the line whose length equals min(∫ V dt) or min(∫ F dt) taken from t_start to t_start+ΔT, where 40 mSec ≤ ΔT ≤ 200 mSec.
The UME of each motor action may be defined as (TANma).
Fig. 4E displays an example of the location of the spatial angle θ of the on-screen cursor motor action path between its starting point A and its ending point B.
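A hedged sketch of the UME computation follows: TANma is taken as the tangent of the angle between Line1 (A to B) and Line2 (A to the path position a ΔT after the start), matching the geometric description above; the uniform sampling of the path and the chosen ΔT offset in samples are assumptions made for the example.

# Hedged sketch of the UME computation: TANma is taken here as the tangent of the
# angle between Line1 (start point A to end point B) and Line2 (A to the path
# position a dT after the start). The uniform sampling interval is an assumption.
import math

def user_motor_error(xs, ys, dt_samples):
    ax, ay = xs[0], ys[0]
    v1 = (xs[-1] - ax, ys[-1] - ay)                     # Line1: A -> B
    v2 = (xs[dt_samples] - ax, ys[dt_samples] - ay)     # Line2: A -> path(A + dT)
    angle = math.atan2(v1[0] * v2[1] - v1[1] * v2[0],   # signed angle between lines
                       v1[0] * v2[0] + v1[1] * v2[1])
    return abs(math.tan(angle))                         # UME = |TANma|

# A slightly curved cursor path from (0, 0) toward (10, 0):
xs = [0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10]
ys = [0, 0.6, 1.0, 1.2, 1.2, 1.0, 0.8, 0.5, 0.3, 0.1, 0]
print(round(user_motor_error(xs, ys, dt_samples=2), 3))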
Adaptation Computation step 440 is now described in detail, according to certain embodiments of the present invention, e.g. in accordance with the flowchart illustration of Fig. 4F. The user motor adaptation to the control of each DMMID may be expressed by two time-based functions: a. the 'minimum adaptation error', or the 'asymptotic minimum error', E0(t); and b. the 'adaptation rate', or the 'rate of error reduction', λ(t).
These functions are obtained (e.g., through step 920 in Fig. 4F) by analyzing the UMEs of each group n of consecutive motor actions which had included no sensorimotor redirection perturbation or, on the contrary, included the same sensorimotor redirection perturbation. The time variable t may be a discrete variable Tn holding the values of the averaged times of the target oriented motor actions of group n (step 910 in Fig. 4F), i.e. t ∈ {T1, T2, T3, ...}.
Fig. 5 displays exemplary timings of motor actions obtained by the user (also termed herein "motor action events"), such as cursor movements or loudspeaker's volume reduction/increase actions, and the Tn time of each group. Typically each data block includes two groups: one which includes no sensorimotor perturbation and one which includes sensorimotor perturbation.
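The grouping just described can be illustrated with the following Python sketch, which partitions consecutive filtered motor actions by perturbation state and takes Tn as the average of each group's action times; the (time, perturbation_on) tuples are synthetic example data.

# Illustrative sketch: group consecutive filtered motor actions by perturbation
# state and take Tn as the average of the actions' times, as described above.
from itertools import groupby

def group_by_perturbation(actions):
    groups = []
    for state, items in groupby(actions, key=lambda a: a[1]):
        times = [t for t, _ in items]
        groups.append({"perturbation_on": state,
                       "times": times,
                       "Tn": sum(times) / len(times)})
    return groups

actions = [(1.0, False), (2.2, False), (3.1, False), (5.0, True), (6.4, True), (7.9, True)]
for g in group_by_perturbation(actions):
    print(g["perturbation_on"], round(g["Tn"], 2))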
Assuming that group n includes m target oriented motor actions, the analysis technique may be based on two feasible methods, averaging and exponential regression. Through the administrator GUI, it can be predefined which method is to be used according to the value of AAMS, e.g. either of the following two methods ("Averaging" and "Exponential Regression") may be used:
a. Averaging Method:
E0(n) may be obtained by averaging the UMEs of the m target oriented motor actions of group n, considering the combined sorting coefficient of each motor action:
E0(n) = [UMEn(1)·CSCn(1) + UMEn(2)·CSCn(2) + ... + UMEn(m)·CSCn(m)] / [CSCn(1) + CSCn(2) + ... + CSCn(m)]
λ(n) computation may be unavailable according to this method.
b. Exponential Regression Method: E0(n) and λ(n) are obtained by fitting a simple exponential regression curve for the theoretical function UMEn(m) = E0(n)·e^(-m/λ(n)). The final adaptation results include a fine-tuning correction based on the individual user's range of adaptation values μ1, σ1, μ2 and σ2, as in step 950 of Fig. 4F:
E0'(n) = E0(n) · [0.9 + 0.1·(E0(n)/μ1)^σ1]
λ'(n) = λ(n) · [0.9 + 0.1·(λ(n)/μ2)^σ2]
If group n of motor actions relates to the active period of the sensorimotor perturbation, then μ1 ≡ μE, σ1 ≡ σE, μ2 ≡ μλ and σ2 ≡ σλ; otherwise μ1 ≡ μE_ref, σ1 ≡ σE_ref, μ2 ≡ μλ_ref and σ2 ≡ σλ_ref.
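The two adaptation-analysis options can be illustrated as follows; the averaging branch follows the weighted-average expression of the Averaging Method above, while the regression branch assumes a simple decaying-exponential model fitted on the logarithm of the UMEs, which is only one possible reading of the Exponential Regression Method, and the fine-tuning correction of step 950 is omitted here.

# Hedged sketch of the two adaptation-analysis options. The averaging branch
# follows the weighted-average expression in the text; the regression branch
# assumes a decaying-exponential model fitted on the log of the UMEs.
import numpy as np

def adaptation_averaging(umes, cscs):
    umes, cscs = np.asarray(umes, float), np.asarray(cscs, float)
    e0 = float(np.sum(umes * cscs) / np.sum(cscs))   # E0(n); lambda unavailable here
    return e0, None

def adaptation_exponential(umes):
    m = np.arange(1, len(umes) + 1)
    slope, intercept = np.polyfit(m, np.log(umes), 1)  # log(UME) ~ intercept + slope*m
    e0 = float(np.exp(intercept))                      # amplitude term of the fit
    lam = float(-1.0 / slope) if slope < 0 else float("inf")  # error-reduction rate
    return e0, lam

umes = [0.80, 0.55, 0.40, 0.30, 0.24]      # synthetic, gradually shrinking errors
cscs = [0.9, 0.8, 1.0, 0.7, 0.6]
print(adaptation_averaging(umes, cscs))
print(adaptation_exponential(umes))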
Attentiveness Evaluation step 450 is now described in detail, according to certain embodiments of the present invention, e.g. in accordance with the flowchart illustration of Fig. 4G. This step receives as input the E0'(n) and λ'(n) values computed in the previous step. The real-time evaluation of user attentiveness may be accomplished by computing the ratio between the real-time adaptation values with and without the presence of the sensorimotor redirection perturbation.
Each group of consecutive filtered motor actions relates to a period of the same state (e.g., on/off) of the sensorimotor perturbation, and each group holds its own motor adaptation analysis results E0'(n), λ'(n) and Tn. Based on historical data stored in the memory unit, the averaged motor adaptation results of all the groups which relate to the same perturbation state (e.g. E_ST0, E_ST1, λ_ST0 and λ_ST1), and the averaged adaptation results of all the groups (e.g. E_Total and λ_Total), are obtained:
E_ST0 = mean[E0'(h)] over all the groups which relate to the ST0 perturbation state.
E_ST1 = mean[E0'(h)] over all the groups which relate to the ST1 perturbation state.
E_Total = mean[E0'(h)] over all the groups.
λ_ST0 = mean[λ(h)] over all the groups which relate to the ST0 perturbation state.
λ_ST1 = mean[λ(h)] over all the groups which relate to the ST1 perturbation state.
λ_Total = mean[λ(h)] over all the groups.
α(n) = a1·[E0'(n) - E_ST0]/E0'(n) + a2·[E0'(n) - E_ST1]/E0'(n) + a3·[E0'(n) - E_Total]/E0'(n) + a4·[E0'(n) - E0'(n-1)]/E0'(n)
β(n) = b1·[λ(n) - λ_ST0]/λ(n) + b2·[λ(n) - λ_ST1]/λ(n) + b3·[λ(n) - λ_Total]/λ(n) + b4·[λ(n) - λ(n-1)]/λ(n)
The a_i and b_i factors indicated above may satisfy Σ(i=1..4) a_i = 1 and Σ(i=1..4) b_i = 1.
For example, if the sensorimotor perturbation has only two possible states, e.g., on/off, then the following values are computed:
E_off = mean[E0'(h)] over all the groups which relate to the perturbation 'off' state.
E_on = mean[E0'(h)] over all the groups which relate to the perturbation 'on' state.
E_Total = mean[E0'(h)] over all the groups.
λ_off = mean[λ(h)] over all the groups which relate to the perturbation 'off' state.
λ_on = mean[λ(h)] over all the groups which relate to the perturbation 'on' state.
λ_Total = mean[λ(h)] over all the groups.
α(n) = a1·[E0'(n) - E_off]/E0'(n) + a2·[E0'(n) - E_on]/E0'(n) + a3·[E0'(n) - E_Total]/E0'(n) + a4·[E0'(n) - E0'(n-1)]/E0'(n)
β(n) = b1·[λ(n) - λ_off]/λ(n) + b2·[λ(n) - λ_on]/λ(n) + b3·[λ(n) - λ_Total]/λ(n) + b4·[λ(n) - λ(n-1)]/λ(n)
The α(n) and β(n) values are outputs of step 1020 in Fig. 4G. The UAE (i.e., user attentiveness evaluation) at time Tn of a single dynamic data source analysis may be computed (step 1040 in Fig. 4G), e.g. in accordance with the following equation, if AEMS=0:
(The UAE(Tn) expression for AEMS=0 is given by the equation of step 1040 in Fig. 4G; the corresponding equation image, imgf000034_0002, is not reproduced in this text.)
S1 may be equal to 0 or 1, predefined e.g. in accordance with the administrator GUI.
K1 = 8 or can be set to a different value e.g. in accordance with the administrator GUI. The UAE can be computed (step 1040 in Fig. 4G) by any other function fitting the α(n) value and the β(n) value within the desired scale (e.g. 0-10), such as the following function, if AEMS=1:
UAE(Tn) = round{10·[1 - (α(n)/K1)]^5} if S1 = 0
UAE(Tn) = round{10·[1 - (α(n)/K1)]^5·[1 - (β(n)/K1)]^5} if S1 = 1
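A hedged sketch of the two-state (on/off) evaluation follows. It mirrors the α(n) and β(n) expressions above; for the UAE it uses a simplified scaling function, which the text explicitly permits ("any other function fitting the α(n) value and the β(n) value within the desired scale"). The a_i/b_i coefficients, K1 and the example inputs are illustrative assumptions.

# Hedged sketch of the two-state attentiveness evaluation; coefficients, K1 and
# the example inputs are assumptions made for the illustration.
def alpha_beta(e0_n, e0_prev, e_off, e_on, e_total, lam_n, lam_prev,
               lam_off, lam_on, lam_total,
               a=(0.25, 0.25, 0.25, 0.25), b=(0.25, 0.25, 0.25, 0.25)):
    alpha = (a[0] * (e0_n - e_off) + a[1] * (e0_n - e_on) +
             a[2] * (e0_n - e_total) + a[3] * (e0_n - e0_prev)) / e0_n
    beta = (b[0] * (lam_n - lam_off) + b[1] * (lam_n - lam_on) +
            b[2] * (lam_n - lam_total) + b[3] * (lam_n - lam_prev)) / lam_n
    return alpha, beta

def uae(alpha, beta, k1=8, s1=1):
    value = 10 * (1 - alpha / k1)          # simplified scaling onto the 0-10 range
    if s1 == 1:
        value *= (1 - beta / k1)
    return max(0, min(10, round(value)))

a_n, b_n = alpha_beta(0.5, 0.6, 0.45, 0.62, 0.54, 3.0, 2.6, 3.2, 2.4, 2.8)
print(round(a_n, 3), round(b_n, 3), uae(a_n, b_n))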
User-Related parameters Computation step 460 in Fig. 4A is now described in detail, according to certain embodiments of the present invention. Each individual user has some typical values related to his/her own characteristic movement parameters and his/her own quality of adaptation. These time-dependent parameters are initially defined to hold specific values, but later they are updated periodically (based on the Rurl value) e.g. in accordance with the specific user performance, referring to the same dynamic data source.
Figs. 9A - 9B, taken together, define user-related parameters for each data source, some or all of which may be computed or otherwise provided in the context of the user-related provisional step 460 of Fig. 4A, all in accordance with certain embodiments of the present invention.
Figs. 10A - 10B, taken together, form a table describing computation or determination of user-related parameters, some or all of which may be performed in the user-related parameter computation step 460 of Fig. 4A, in accordance with certain embodiments of the present invention.
The table of Fig. 10A lists user-related parameters of each data source according to certain embodiments of the present invention. As illustrated in Fig. 10B, some of these parameters may be defined as constants, some predefined through the administrator GUI, and some statistically computed in real-time based on the MAP unit analysis results of the latest data blocks acquired.
Figs. 11A - 11C, taken together, form a table suggesting example values or example computations useful in providing user-related parameters in the context of the user-related provisional step 460 of Fig. 4A, all in accordance with certain embodiments of the present invention.
The Memory unit 80 of Fig. 2A is now described in detail, according to certain embodiments of the present invention. This unit represents a real or virtual memory space.
In the embodiment of Fig. 2A, this memory space can be implemented on a RAM/FLASH element. In the embodiment of Fig. 2B, on the other hand, this memory space may actually be a virtual space allocated by the host operating system.
This memory space may hold some or all of the following data, as illustrated in Fig. 6, which is a diagram of an example of the memory contents; different values may be held for each dynamic data source. The term dynamic data source refers to the dynamic data acquired through any of the host's DMMIDs in addition to the dynamic data acquired through the host's controllers of the element/action controlled by the same DMMID.
The memory contents typically include some or all of: the raw data block most recently stored (including the raw data acquired through the host's DMMID and through the host's controller of the element/action controlled by the same DMMID), and all the 1st stage analysis data blocks (e.g. as generated by steps 430, 440 and 450 of Fig. 4A) corresponding respectively to the previous raw data block recordings of dynamic real-time data acquired by the MSDG unit; operational setup parameters defined by the administrator GUI; all the user-related parameters computed by the MAP unit; and the latest attentiveness evaluation result, obtained by the MAP unit.
Raw Data Stream Read/ Write Control for the internal/virtual memory 80 of Fig. 2A is now described in detail, according to certain embodiments of the present invention. Memory read/write transactions are controlled by the Main Control unit.
Blocks of real-time raw data monitored through the host DMMIDs are constantly written into the memory. Subsequently, each written block is read by the MAP unit and analyzed in order to produce its first stage analysis data block, the attentiveness results and the computed user-related parameters. The user-related parameters may be computed only if required according to the Rurl time rate, and not otherwise. Thus, in order to reduce the overall data storage, each block of raw data may be exchanged for the first stage analysis data block of the same original data block.
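As a non-authoritative illustration of the storage-saving exchange just described, the sketch below replaces an analyzed raw data block in memory with its smaller first stage analysis block; the dict-based memory layout and the field names are assumptions made for the example.

# Illustrative sketch: once a raw data block has been analyzed, replace it in
# memory by its (smaller) first stage analysis block.
def exchange_block(memory, block_id, first_stage_block):
    memory["raw_blocks"].pop(block_id, None)          # discard the bulky raw data
    memory["first_stage_blocks"][block_id] = first_stage_block
    return memory

memory = {"raw_blocks": {0: list(range(10000))}, "first_stage_blocks": {}}
memory = exchange_block(memory, 0, {"UMEs": [0.4, 0.3], "attentiveness": 7})
print(len(memory["raw_blocks"]), memory["first_stage_blocks"][0]["attentiveness"])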
Example environments in which certain embodiments of the invention may be useful include but are not limited to the following:
a. Training Simulators: Use of simulation for training has been proven effective as a means for developing human competencies by military organizations, over the years. Real time evaluation of a trainee's attentiveness may be utilized during real time operation of the simulator and post simulation.
For example, referring to flight simulators, a pilot's decision-making is affected by his attentiveness to key events occurring prior to the time a challenging episode takes place.
If a pilot is facing challenging weather conditions (say), his maneuvering activities should take these conditions into account. However, if the pilot's attentiveness was low when these conditions were detected, his decision-making later on may be weakened, leading to less appropriate maneuvers. In this case, if the pilot's attentiveness during detection is measured to be low, a simulation system constructed according to certain embodiments of the present invention can:
put more emphasis on the warning message issued to herald the dangerous condition (e.g., by increasing the size or the sound level of the warning message) in order to engage the trainee's attention in real time; and/or
perform post simulation analysis in order to estimate the threshold level of the size/sound of the warning message issued when the challenging condition was detected, which is able to engage the specific trainee's attention.
Similarly, in case there is more than one key event towards which the trainee's attentiveness is necessary for specific decision-making, additional post simulation analysis can be obtained. By examining the correlation between the trainee's attention levels to each one of the key events and his decision making quality, it may be possible to set a priority list suggesting which one of the key events is probably most valuable for better decision making.
b. Control Systems and Command & Control Systems: Large control centers such as in airports and telecommunication companies involve human interaction with multiple monitors and control systems. In order to assure high quality control, employees must remain attentive throughout their shifts, monitoring the various parameters and interacting with the systems. Lack of attention in control centers could lead to catastrophic results, so the need for attention level identification is crucial. Certain embodiments provide real-time detection of low attention levels of the control personnel. The system may alert these personnel when their attention has wandered, creating a higher level of security with regard to human control.
In airports, for example, correct and precise management of arrivals and departures over airstrips holds major importance. The attentiveness of the human operator is crucial, especially when a scenario of equipment malfunction occurs and people's safety is at risk. If, while an operator is interacting with the control system, a phone conversation takes place, and his real-time attentiveness to the control system is measured below a predefined threshold value for more than a specific time period, then an alarming sound or a pop-up warning message window can be initiated. This action may be operative to notify the operator that he should pay more attention to the operation of the control system. In addition, if the operator's attentiveness is still measured low later on, then an alarming notification may be sent to his supervisor. In this case the supervisor may contact the operator and ensure that his attention to the control system is not disturbed by an irrelevant factor, and consider the need for additional human resources to deal with the present circumstances.
c. Online Gaming Platforms
The ability to identify a gamer's attention level during a game, in real time, may be provided according to certain embodiments. If a gamer is losing interest in the game, enhancement of gaming conditions, e.g. introducing competition or other computerized community features, may help regain the gamer's interest and lengthen the time spent on the game.
d. E-learning and computerized Courseware and Cognitive Training systems:
Courseware, eLearning or brain training programs are used by the user to either learn a subject or exercise brain function. In programs geared towards ADD, ADHD or autism, it is important to hold and correlate with the users' attention levels. These systems can shift strategies when the user's concentration level is sub-par.
Certain embodiments of the present invention may be incorporated in courseware, e-learning and brain training programs. The system may "signal" when the user's attention level has dropped allowing automatic adaption of the teaching or training strategy used. For example, new language educational courseware applications are based on different activity screens and tasks. Individually, each student may prefer certain activities over others, and may avoid learning significant materials if presented in one format or scheme rather than another. The system shown and described herein may provide online, real time assessment of the users' attention level, ensuring that they are actually engaged in the application, enabling real time shifting of formats and schemes, e.g. activity screens, to enhance learning of each educational topic.
e. Websites: Websites employ various schemes for retaining visitors and keeping them engaged. Website usability is a key factor in increasing retention. Usability encompasses a number of things such as ease of use, and user attraction or attentiveness. One of the current methods of assessing a website's usability is to conduct custom user experience research during the design process. This method requires a number of iterations where a website design is tested by the custom users; their user experience is evaluated by researchers; and the design is altered accordingly. An example of a firm providing such a consulting service is AnswerLab. Alternate methods of improving retention rates are by employing customer experience analytics that analyze the usability of a website based on data gathered on visitors' surfing habits.
The ability to determine when a user is losing attention and apply a retention geared scheme could significantly improve retention rates and ultimately conversion rates. Similarly, the ability to analyze a user's attention level as it pertains to disengagement, abandonment or conversion, would allow designers to improve the website thereby minimizing abandonment and improving its usability.
For example, it may happen that during a particular user-site interaction, there is a recurring phenomenon of users' abandonment. Prior to the abandonment occurrence there may be a specific recurring time point from which a significant decrease in the user's attentiveness is identified. This may suggest that most of the users lost interest and later on decided to abandon participation in the interaction process. As a result, the website designer may try to identify the specific cause for the users' attention drop (e.g., due to a cumbersome interaction scheme), and may improve the usability of this process.
Another example is the effect of pop-up displays which may appear at different screen spots, on the user's interaction with the website. If the time of banner pop-up has high correlation with a long period of elevated attentiveness measured, it may suggest that the pop-up banners may have grabbed the users' attention for longer periods than desired. In this case, the website designer may redesign the characteristics of the pop-up banners, minimizing their negative side effects.
Fig. 8A is a simplified semi-pictorial semi-functional block diagram illustration of an attentiveness-responsive computerized system constructed and operative in accordance with certain embodiments of the present invention. Fig. 8B is a simplified flowchart illustration of an example method of operation for the apparatus of Fig. 8A. Typically, IN1, IN2 and OUT1 in Fig. 8A and in Fig. 8B are one and the same.
As shown, each interactive application which includes a motor-controlled HMI device such as but not limited to PC mouse, keyboard or joystick, is characterized by a transfer function relating the control directives of this device to a specific sensory element/action such as but not limited to an on-screen cursor. According to certain embodiments, the interactive application, in conjunction with one or more functional units e.g. as described herein, performs some or all of the following actions:
Acquires real-time data characterizing the behavior of the DMMID as controlled by the human user.
- Acquires real-time data characterizing the behavior of the sensory element/action (such as, in the illustrated example, an on-screen cursor) controlled by the DMMID.
- Takes partial control over the sensory element/action controlled by the
DMMID, by slightly perturbing its normal behavior; and
- Analyzes the human user behavior as expressed by his/her control over the behavior of the DMMID, in response to the subliminal sensory perturbations activated (a high-level sketch of this loop follows below).
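A high-level, non-authoritative sketch of that loop is given below: it reads an input-device delta, applies a small imperceptible redirection to the on-screen element, and logs both data streams for later analysis. The device-read and perturbation functions are stand-ins invented for the example, not a real driver or operating-system API.

# High-level sketch only: read the input-device delta, apply a small
# imperceptible perturbation to the on-screen element, and log both streams
# for later attentiveness analysis. The device read is a stand-in, not a driver.
import math, random

def read_mouse_delta():                 # stand-in for the real DMMID acquisition
    return random.uniform(-2, 2), random.uniform(-2, 2)

def run_interaction(steps=200, perturb=True, gain=0.03):
    cursor_x = cursor_y = 0.0
    log = []
    for k in range(steps):
        dx, dy = read_mouse_delta()
        if perturb:                     # slight angular redirection of the cursor
            angle = gain * math.sin(2 * math.pi * k / 50)
            dx, dy = (dx * math.cos(angle) - dy * math.sin(angle),
                      dx * math.sin(angle) + dy * math.cos(angle))
        cursor_x += dx
        cursor_y += dy
        log.append((k, dx, dy, cursor_x, cursor_y))   # both data streams, per step
    return log

print(len(run_interaction()))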
Two experiments performed to validate real-time evaluation of user's attentiveness on interactive applications, performed in accordance with certain embodiments of the present invention, are now described.
First Validation Experiment: Subjects were instructed to cope with an interactive task on a laptop PC which involved holding and moving a computer mouse for 8 minutes. The task included 200 movements of the screen cursor from side to side between circles positioned at random locations over the screen. Simultaneously, only during minutes 3-4 and 7-8 (or only during minutes 1-2 and 5-6), the subjects were asked to verbally answer auditory questions, in order to divide their attention between the visual-motor task and the auditory task. Subjects' attentiveness was evaluated in accordance with certain embodiments of the present invention, for the whole experimental period per subject. Eleven subjects participated in this study, 5/6 male/female, 14-42 years of age, all right-handed. The evaluated attentiveness of the subjects to the visual-motor task based on the system shown and described herein was validated by comparing the results associated with the periods during which the subjects handled two simultaneous tasks, to the results associated with the periods during which the subjects handled the visual-motor task only. The attentiveness results of the first type were consistently lower than the results of the second type. The difference was statistically significant (p<0.01). These results suggest correct functionality of embodiments of the present invention in distinguishing between times when subjects' attention should or should not have been divided.
Second Validation Experiment: The experimental trials were conducted at an EEG lab equipped for diagnosis of ADD/ADHD patients. The EEG device was a Deymed - TruScan acquisition version 6.4.2.2914. Subjects were asked to cope with interactive tasks on a laptop PC holding a computer mouse, for variable time lengths: 5, 10 or 15 minutes. Simultaneously, the subjects' EEGs were measured and recorded, and their frontal β/θ wave ratios were obtained for each elapsed minute. Since the subjects were instructed to focus only on the computer tasks, their frontal β/θ wave ratio may be regarded as a rough measure of their attention level referring to the active PC application. Six subjects participated in this study, 5/1 male/female, 16-42 years of age, all right-handed. The evaluated attentiveness of the subjects was compared to their frontal β/θ wave ratios once per minute for each individual subject. Results showed high correlation between the attentiveness evaluation of each subject found by the methods shown and described herein and his/her frontal β/θ wave ratios, suggesting correct functionality of the methods shown and described herein, relative to an unrelated method for attentiveness evaluation based on an EEG device.
It is appreciated that terminology such as "mandatory", "required", "need" and "must" refer to implementation choices made within the context of a particular implementation or application described herewithin for clarity and are not intended to be limiting, since, in an alternative implementation, the same elements might be defined as not mandatory, or may not be required, or might even be eliminated altogether.
It is appreciated that software components of the present invention including programs and data may, if desired, be implemented in ROM (read only memory) form including CD-ROMs, EPROMs and EEPROMs, or may be stored in any other suitable computer-readable medium such as but not limited to disks of various kinds, cards of various kinds and RAMs. Components described herein as software may, alternatively, be implemented wholly or partly in hardware, if desired, using conventional techniques. Conversely, components described herein as hardware may, alternatively, be implemented wholly or partly in software, if desired, using conventional techniques.
Included in the scope of the present invention, inter alia, are electromagnetic signals carrying computer-readable instructions for performing any or all of the steps of any of the methods shown and described herein, in any suitable order; machine- readable instructions for performing any or all of the steps of any of the methods shown and described herein, in any suitable order; program storage devices readable by machine, tangibly embodying a program of instructions executable by the machine to perform any or all of the steps of any of the methods shown and described herein, in any suitable order; a computer program product comprising a computer useable medium having computer readable program code, such as executable code, having embodied therein, and/or including computer readable program code for performing, any or all of the steps of any of the methods shown and described herein, in any suitable order; any technical effects brought about by any or all of the steps of any of the methods shown and described herein, when performed in any suitable order; any suitable apparatus or device or combination of such, programmed to perform, alone or in combination, any or all of the steps of any of the methods shown and described herein, in any suitable order; electronic devices each including a processor and a cooperating input device and/or output device and operative to perform in software any steps shown and described herein; information storage devices or physical records, such as disks or hard drives, causing a computer or other device to be configured so as to carry out any or all of the steps of any of the methods shown and described herein, in any suitable order; a program pre-stored e.g. in memory or on an information network such as the Internet, before or after being downloaded, which embodies any or all of the steps of any of the methods shown and described herein, in any suitable order, and the method of uploading or downloading such, and a system including server/s and/or client/s for using such; and hardware which performs any or all of the steps of any of the methods shown and described herein, in any suitable order, either alone or in conjunction with software. Any computer-readable or machine-readable media described herein is intended to include non-transitory computer- or machine-readable media.
Any computations or other forms of analysis described herein may be performed by a suitable computerized method. Any step described herein may be computer-implemented. The invention shown and described herein may include (a) using a computerized method to identify a solution to any of the problems or for any of the objectives described herein, the solution optionally include at least one of a decision, an action, a product, a service or any other information described herein that impacts, in a positive manner, a problem or objectives described herein; and (b) outputting the solution.
Features of the present invention which are described in the context of separate embodiments may also be provided in combination in a single embodiment. Conversely, features of the invention, including method steps, which are described for brevity in the context of a single embodiment or in a certain order may be provided separately or in any suitable subcombination or in a different order, "e.g." is used herein in the sense of a specific example which is not intended to be limiting. Devices, apparatus or systems shown coupled in any of the drawings may in fact be integrated into a single platform in certain embodiments or may be coupled via any appropriate wired or wireless coupling such as but not limited to optical fiber, Ethernet, Wireless LAN, HomePNA, power line communication, cell phone, PDA, Blackberry GPRS, Satellite including GPS, or other mobile delivery. It is appreciated that in the description and drawings shown and described herein, functionalities described or illustrated as systems and sub-units thereof can also be provided as methods and steps therewithin, and functionalities described or illustrated as methods and steps therewithin can also be provided as systems and sub-units thereof. The scale used to illustrate various elements in the drawings is merely exemplary and/or appropriate for clarity of presentation and is not intended to be limiting.

Claims

CLAIMS
1. A computer-implemented method for evaluating attentiveness of a user of an interactive application, the application employing a user input device operative to accept a stream of user inputs and responsively, to generate a stream of user-perceptible application behaviors which have a pre-defined initial correspondence with user inputs in said stream of user inputs, the method comprising using at least one processor for:
a. imperceptibly modifying said pre-defined initial correspondence, thereby to define a modified correspondence between user-perceptible application behaviors generated responsive to said user inputs, and said user inputs;
b. generating an ongoing indication of user attentiveness including measuring at least one aspect of the user's compensation for differences between said modified and initial correspondences; and
c. accommodating for said indication of user attentiveness by adapting at least one aspect of subsequent operation of the interactive application.
2. A method according to claim 1 wherein said at least one aspect of the user's compensation includes an aspect of quality of compensation.
3. A method according to claim 1 wherein said at least one aspect of the user's compensation includes an aspect of speed of compensation.
4. A method according to claim 1 wherein said user input device comprises a user input generator controlled by the user.
5. A method according to claim 1 wherein said imperceptibly modifying step is differentially activated at a differentiation rate which prevents a user from reaching full adaptation.
6. A computerized system for evaluating attentiveness of a user of an interactive computerized application, the application employing a user input device operative to accept a stream of user inputs and responsively, to generate a stream of user- perceptible application behaviors which have a pre-defined initial correspondence with user inputs in said stream of user inputs, the system comprising:
an input-application behavior correspondence manipulator operative for imperceptibly modifying said pre-defined initial correspondence based on which said user input device is operative to generate said user-perceptible behaviors, thereby to define a modified correspondence between user-perceptible application behaviors generated responsive to said user inputs, and said user inputs;
a user compensation monitor operative for generating an ongoing indication of user attentiveness including measuring at least one aspect of the user's compensation for differences between said modified and initial correspondences and using a processor to compute an indication of user attentiveness; and
an attentiveness accommodator operative to accommodate for said indication of user attentiveness by adapting at least one aspect of subsequent operation of the interactive application.
7. A system according to claim 6 wherein said application comprises a training simulator and said attentiveness accommodator is operative to change characteristics of a warning indication provided to a user to alert the user to potential dangers.
8. A system according to claim 7 wherein said attentiveness accommodator is operative to compute threshold warning indication characteristics which are sufficient to cause the user to be attentive to potential dangers.
9. A system according to claim 6 wherein said application comprises a control center controlling a population of human operators and wherein said attentiveness accommodator is operative to alert each individual human operator each time the individual operator is suffering from low attentiveness.
10. A system according to claim 9 wherein once said attentiveness accommodator has sent a first alert to an individual human operator that the individual operator is suffering from low attentiveness, the attentiveness accommodator is also operative to send a second alert to a supervisor that the individual operator is suffering from low attentiveness, if the first alert does not sufficiently improve the individual operator's attentiveness.
11. A system according to claim 6 wherein said application comprises a computerized game environment and wherein said attentiveness accommodator is operative to enhance attractiveness of at least one characteristic of the computerized game environment if a user of the computerized game environment is found by said user compensation monitor to have a low attentiveness level.
12. A system according to claim 6 wherein said application comprises an e- learning/courseware environment and wherein said attentiveness accommodator is operative to classify learning paradigms presented to a learning user as a function of attentiveness which they engender in the user and to replace, in a learning session with an individual user, learning paradigms which have engendered low attentiveness for the individual user, with learning paradigms which engender high attentiveness for the individual user.
13. A system according to claim 6 wherein said application comprises a website and wherein said attentiveness accommodator is employed to identify a user's attraction to specific website locations during the engagement session between the individual user and the said website.
14. A system according to claim 6 wherein said application comprises a website developers' environment and wherein said input-application behavior correspondence manipulator and said user compensation monitor are utilized during pre-testing of a website under development and wherein said attentiveness accommodator is operative to alert an individual developer of problematic locations within the website which are characterized by low attentiveness so as to allow the website developer to modify said problematic locations.
15. A method according to claim 1 wherein said imperceptibly modifying comprises:
acquiring real-time data characterizing at least one characteristic of the input device;
acquiring real-time data characterizing a behavior of a sensory component controlled by the input device; and
breaking the direct control of the interactive application over the sensory component controlled by the input device by imperceptibly perturbing the sensory component's normal behavior.
16. A method according to claim 1 wherein said imperceptibly modifying also comprises quantifying the human user attentiveness related to the interactive application as a function of his/her control over the input device in response to the subliminal perturbations activated.
17. A method according to claim 15 wherein said at least one characteristic of the input device comprises at least one of the following: motion affecting the input device, force affecting the input device, user's voice affecting the input device, indication of input device button clicks, and indication of keyboard keys' tapping.
18. A method according to claim 5 wherein the user's up-to-date adaptation characteristics are computed continuously in real-time.
19. A method according to claim 1 wherein said user-perceptible application behavior comprises at least one of: cursor position, characteristic such as size and/or position of a displayed element which may not include a cursor, tone volume, tone pitch, and color.
20. A method according to claim 1 wherein said adapting comprises modifying at least one aspect of subsequent operation of the interactive application vis a vis the user.
21. A computer program product, comprising a computer usable medium having a computer readable program code embodied therein, said computer readable program code adapted to be executed to implement a method according to claim 1.
PCT/IL2010/000586 2009-07-28 2010-07-22 System and methods which monitor for attentiveness by altering transfer function governing user-perceptible behaviors vs. user input WO2011013117A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US22897509P 2009-07-28 2009-07-28
US61/228,975 2009-07-28

Publications (1)

Publication Number Publication Date
WO2011013117A1 true WO2011013117A1 (en) 2011-02-03

Family

ID=43242511

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IL2010/000586 WO2011013117A1 (en) 2009-07-28 2010-07-22 System and methods which monitor for attentiveness by altering transfer function governing user-perceptible behaviors vs. user input

Country Status (1)

Country Link
WO (1) WO2011013117A1 (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103514339A (en) * 2012-06-15 2014-01-15 上海蓝卓教育信息科技有限公司 Courseware evaluation system
US10552183B2 (en) 2016-05-27 2020-02-04 Microsoft Technology Licensing, Llc Tailoring user interface presentations based on user state
CN114452506A (en) * 2022-01-28 2022-05-10 清华大学深圳国际研究生院 Method for guiding user to be minded and meditation

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2001067214A2 (en) 2000-03-03 2001-09-13 Merinta, Inc. System and method for tracking user interaction with a graphical user interface
US20070139362A1 (en) 2005-12-15 2007-06-21 James Colton Health management system for personal computer users
US7310609B2 (en) 1997-09-11 2007-12-18 Unicast Communications Corporation Tracking user micro-interactions with web page advertising

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7310609B2 (en) 1997-09-11 2007-12-18 Unicast Communications Corporation Tracking user micro-interactions with web page advertising
WO2001067214A2 (en) 2000-03-03 2001-09-13 Merinta, Inc. System and method for tracking user interaction with a graphical user interface
US20070139362A1 (en) 2005-12-15 2007-06-21 James Colton Health management system for personal computer users

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
A generic computer with a graphical user interface and user input peripherals such as a mouse or a keyboard *

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103514339A (en) * 2012-06-15 2014-01-15 上海蓝卓教育信息科技有限公司 Courseware evaluation system
US10552183B2 (en) 2016-05-27 2020-02-04 Microsoft Technology Licensing, Llc Tailoring user interface presentations based on user state
CN114452506A (en) * 2022-01-28 2022-05-10 清华大学深圳国际研究生院 Method for guiding user to be minded and meditation
CN114452506B (en) * 2022-01-28 2023-10-13 清华大学深圳国际研究生院 Method for guiding user to positively think about meditation

Similar Documents

Publication Publication Date Title
Järvenoja et al. Supporting groups’ emotion and motivation regulation during collaborative learning
Ahuja et al. Development of a comprehensive model of social entrepreneurial intention formation using a quality tool
Payne et al. Adaptive interaction: A utility maximization approach to understanding human interaction with technology
Pascual et al. Evidence of naturalistic decision making in military command and control
Van Leeuwen et al. Comparing teachers' use of mirroring and advising dashboards
Chiang et al. You’d better stop! Understanding human reliance on machine learning models under covariate shift
Miller et al. Automated detection of proactive remediation by teachers in Reasoning Mind classrooms
Cannon-Bowers et al. Improving tactical decision making under stress: Research directions and applied implications
Mansoor et al. Data visualization literacy and visualization biases: Cases for merging parallel threads
Chun Doing autoethnography of social robots: Ethnographic reflexivity in HRI
Ley et al. Which user interactions predict levels of expertise in work-integrated learning?
Trapp et al. App icon similarity and its impact on visual search efficiency on mobile touch devices
Schmitz et al. When height carries weight: Communicating hidden object properties for joint action
Renawi et al. A simplified real-time camera-based attention assessment system for classrooms: pilot study
WO2011013117A1 (en) System and methods which monitor for attentiveness by altering transfer function governing user-perceptible behaviors vs. user input
Morris et al. Studies of dynamic task allocation in an aerial search environment
McCusker et al. Intelligent assessment and content personalisation in adaptive educational systems
Capiola et al. “Is something amiss?” Investigating individuals’ competence in estimating swarm degradation
US20220198952A1 (en) Assessment and training system
Vangsness et al. More isn’t always better: when metacognitive prompts are misleading
US20230215288A1 (en) Haptic feedback for influencing user engagement level with remote educational content
Shum et al. Personalised learning through context-based adaptation in the serious games with gating mechanism
Bremgartner et al. Using agents and open learner model ontology for providing constructive adaptive techniques in virtual learning environments
Gruenwald et al. Augmenting human performance in remotely piloted aircraft
Bell et al. Helping instructor pilots detect and respond to engagement lapses in simulations

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 10760442

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 10760442

Country of ref document: EP

Kind code of ref document: A1