WO2015050545A1 - Improved multi-screen management - Google Patents

Improved multi-screen management

Info

Publication number
WO2015050545A1
Authority
WO
WIPO (PCT)
Prior art keywords
user
cursor
monitor
display device
computer
Prior art date
Application number
PCT/US2013/063181
Other languages
French (fr)
Inventor
William Gibbens Redmann
Original Assignee
Thomson Licensing
Priority date
Filing date
Publication date
Application filed by Thomson Licensing filed Critical Thomson Licensing
Priority to PCT/US2013/063181 priority Critical patent/WO2015050545A1/en
Publication of WO2015050545A1 publication Critical patent/WO2015050545A1/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013Eye tracking input arrangements

Definitions

  • This invention relates to a technique for managing multiple screens of a computer workstation.
  • Computer workstations can support multiple monitors.
  • Most computer workstations of this type include graphics circuitry, either as part of the processor itself or as a separate circuit board or "card," which has the ability to support two monitors. More advanced graphics circuitry can support three or more monitors.
  • Having multiple monitors allows the user of the computer workstation to have multiple computer applications or programs open at the same time, with each program displayed on a separate monitor. For example, using multiple monitors, a user running a single instance of a word processing program, for example Microsoft Word®, can have one or more different word processing documents open on each of the monitors. In this way, the user can easily interact with multiple documents without the document windows being too small, or having to pick which document will be the one currently viewable. Using a computer workstation with multiple monitors, a user can have different programs all open at the same time, with enough room for each to be clearly visible.
  • a user can have a word processing program open on at least one monitor and a spreadsheet program (e.g., Microsoft Excel®) open on at least one other monitor.
  • a user can readily copy information from one document (e.g., a spreadsheet) and "paste" that information into another document (e.g. a report) displayed on another monitor.
  • a method for controlling a computer workstation operating a plurality of monitors, each display device displaying at least a portion of the user interface for a corresponding computer application, commences by rendering active at least a first portion of a user interface for a first computer application displayed by a first one of the monitors upon detecting that the user observes that first monitor.
  • a first portion of the user interface for the first computer application is then rendered inactive.
  • At least a second portion of a user interface is rendered active for a second computer application displayed by the second monitor.
  • the first and second application could represent different programs or different instances of the same program or even different instances of the same window.
  • FIG. 1 depicts a block diagram of a first embodiment of a computer workstation for managing multiple monitors in accordance with the present principles;
  • FIG. 2 depicts in flow chart form the steps of a process executed by the computer workstation of FIG. 1 to manage the multiple monitors;
  • FIGS. 3-12 show consecutive images displayed by the monitors of the computer workstation of FIG. 1 in connection with the multiple screen management technique of the present principles;
  • FIG. 13 depicts a block diagram of a second embodiment of a computer workstation for managing multiple monitors in accordance with the present principles;
  • FIG. 14 depicts a state transition diagram that describes the behavior of the computer workstation of FIG. 13 with respect to each monitor;
  • FIG. 15 depicts in flow chart form the steps of a process for tracking and restoring the cursor position on each monitor of the workstation of FIG. 13;
  • FIG. 16 depicts in flow chart form the steps of a first exemplary process for tracking and restoring the cursor position and computer application on each monitor of the workstation of FIG. 13 based on the user focusing on that monitor;
  • FIG. 17 depicts in flow chart form the steps of a second exemplary process for tracking and restoring the computer application on each monitor of the workstation of FIG. 13 based on the user focusing on that monitor.
  • FIG. 1 depicts a block schematic diagram of a first embodiment of a computer workstation 100 for practicing the multi-screen management technique according to the present principles.
  • the workstation 100 of FIG. 1 comprises a computer 130 running an operating system represented by the operating system module 190 and a cursor control module 143 for controlling the multiple monitors, illustratively depicted by monitors 110 and 120, although the workstation 100 could include three or more monitors (not shown).
  • the operating system embodied by the operating system module 190 provides a graphical user interface (GUI), most widely typified by "Mac OS X" provided by Apple Inc.
  • the computer 130 includes various well-known elements, such as a processor, a memory, and power supply, all omitted from FIG. 1 for the sake of clarity.
  • Other well-known elements within the computer 130 include a monitor module 160, a cursor control interface 142, and a camera interface 152, which warrant further discussion in connection with the multi-screen management technique of the present principles.
  • monitor module 160 further allows the operating system, embodied by the operating system module 190 associated with the computer 130, to support a Graphical User Interface (GUI) on the monitors 110 and 120 connected to the monitor module 160 by connections 111 and 121, respectively.
  • the cursor control 140 typically comprises a mouse, touch pad, joystick, trackball, touchscreen, or other device, all well known in the art, for enabling a user to control movement of a cursor on the screen of each monitor.
  • Hereinafter, "mouse" will collectively refer to any variety of cursor control devices.
  • the mouse 140 communicates with the cursor control interface 142 via a connection 141.
  • the cursor control state module 143, typically comprising part of the computer operating system embodied by the module 190, includes a basic input/output system (BIOS) for receiving information from the cursor control interface 142 indicative of manipulation by the user of the mouse 140.
  • when the user moves the mouse 140 a particular distance to the left, the cursor control interface 142 will register a corresponding number of counts reported to the cursor control state module 143 for processing.
  • the cursor control state module 143 provides the cursor movement details to the operating system module 190.
  • the computer workstation 100 includes at least one television camera 150, which sits in this embodiment on top of the monitor 110 for observing the image of the user of the workstation 100.
  • the computer 130 includes a facing detection module 153 connected to the camera interface 152, which in turn is connected to television camera 150 by a connection 151.
  • the facing detection module 153 processes images obtained from the television camera 150 to determine whether a user directly faces the monitor 110.
  • the connection 151, along with the connections 111, 121, and 141, can comprise either wired or wireless links.
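Although the patent does not specify how the facing detection module 153 classifies the camera images, the two-monitor interpretation described later (treating "not facing monitor 110" as "facing monitor 120") can be sketched as a threshold test on an estimated head orientation. Everything below is an assumption for illustration only: the yaw estimate is presumed to come from a separate face-tracking component processing the images from camera 150, and the monitor names and threshold are hypothetical.

```python
# Illustrative sketch (not the patent's implementation): mapping an estimated
# head yaw angle, as a facing detector like module 153 might report it, onto
# one of two monitors.

MONITOR_110 = "monitor_110"   # left monitor, with camera 150 mounted on top
MONITOR_120 = "monitor_120"   # right monitor

def facing_from_yaw(yaw_degrees: float) -> str:
    """Classify the user's facing from head yaw (0 = straight at the camera).

    As in the first embodiment, any facing that is not 'directly at
    monitor 110' is over-generalized as 'facing monitor 120'.
    """
    FACING_THRESHOLD = 15.0   # hypothetical tolerance, in degrees
    if abs(yaw_degrees) <= FACING_THRESHOLD:
        return MONITOR_110
    return MONITOR_120
```

The second embodiment's more sophisticated detector would replace this over-generalization with a per-monitor classification.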
  • the computer 130 of the workstation 100 includes a user control state processor 180.
  • the operating system module 190 has the capability of performing several tasks at the same time (and thus possesses the capability of "multi-tasking").
  • the operating system module 190 can manage multiple applications, illustratively depicted in FIG. 1 as applications 191 and 192.
  • the operating system module 190 can not only manage the applications 191 and 192 but can also manage the tasks related to, or implemented by, the modules 143 and 180.
  • a single application might have different windows (e.g., one for each of several different documents) on each of the two monitors, or a single application may have two portions of its user interface situated on each of the two monitors.
  • the system merely manages the cursor position, and the status and behaviors of the applications running thereon, however many, merely react according to the cursor state.
  • the illustration discussed here has been selected for its particular clarity, but those skilled in the art will recognize, in light of this comment, that the present invention applies with equal effectiveness to such other systems.
  • the monitor 110 displays an image 112 associated with execution of the first application 191.
  • the monitor 120 displays an image 122 associated with execution of the second application 192.
  • the portion of the images 112 and 122 not occupied by the applications 191 and 192 represents the "desktop" provided by the operating system associated with the operating system module 190.
  • the user control state processor 180 performs several functions associated with managing multiple monitors in accordance with the present principles. To that end, the user control state processor 180 receives a signal from the facing detection module 153 and processes that signal to determine whether the user faces the monitor 110. In addition, the user control state processor 180 reads the current cursor position from either the cursor control state module 143 or the operating system module 190 and, whenever the user is facing a particular monitor, stores at least one most recent cursor position for the corresponding faced one of the monitors 110 and 120 in a cursor state store 181. Typically, the cursor store 181 comprises a part of the memory (not otherwise shown) of the computer 130, but could comprise a separate memory.
  • the user control state processor 180 could communicate with the operating system module 190 to determine which application currently has the focus of a user, as determined by the facing detection module 153.
  • the computer application currently having the focus of a user corresponds to the application currently receiving events representing mouse clicks, keystrokes, or other user interface gestures.
  • the application having the focus is further recorded in cursor state store 181.
  • the user control state processor 180 can recall a cursor position, if available, for a particular monitor from the control state store 181 when the user turns to face that monitor. Following recall of the saved cursor position, the user control state processor 180 will set the current cursor position through either the cursor control state module 143 or the operating system module 190 to correspond to the saved cursor position when the user returns to observing the monitor associated with that stored cursor position. In embodiments where the state store 181 records the focus, the user control state processor 180 can communicate with the operating system module 190 to restore focus to the application so noted. Based on the user focus, the operating system module will render the application having the user focus active and inactivate the application(s) that no longer have the user's focus.
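The per-monitor save-and-recall behavior described above can be sketched as follows. This is an illustrative sketch only, not code from the patent; the class and method names are hypothetical, and the optional focused-application field corresponds to the embodiments in which the store 181 also records the focus.

```python
# A minimal sketch of the cursor state store 181: saving and recalling the most
# recent cursor position (and, optionally, the focused application) per monitor.

class CursorStateStore:
    def __init__(self):
        self._saved = {}   # monitor id -> (cursor position, focused app or None)

    def save(self, monitor, position, focused_app=None):
        """Record the most recent cursor position for a monitor."""
        self._saved[monitor] = (position, focused_app)

    def recall(self, monitor):
        """Return (position, focused_app), or None if nothing is saved yet."""
        return self._saved.get(monitor)

    def clear(self):
        """Discard all saved positions, as in the initialization process 200."""
        self._saved.clear()
```

In a complete implementation, save() would be driven by the facing transitions reported by the facing detector module 153, and the recalled position would be imposed on the GUI through the cursor control state module 143 or the operating system module 190.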
  • an application becomes active whenever the cursor points to one of that application's windows. In such a case, the cursor position as detected, stored, recalled, and set as discussed above will fully determine which application the operating system deems active and has the focus, and no distinct communication with the operating system becomes necessary to activate that application.
  • Another exemplary mode of interaction in the GUIs provided by other operating systems includes a process of separately maintaining a hierarchy of which applications become activated or become "foremost." Only the foremost application will receive user input through the GUI until a different application becomes explicitly activated.
  • the Mac OS X operating system from Apple Computer of Cupertino, California represents an example of an operating system that functions in this manner. A user can activate an application currently not foremost by explicitly entering a command or message directing an application to become foremost. Often, the user can activate an inactive application by moving the cursor over a window of that inactive application and clicking on it.
  • the operating system module 190 detects a click on an application made by a user and notifies the target application to become active. Thus, the application now becomes the foremost application. However, the operating system module 190 does not otherwise deliver the click event to this application. This previously inactive application, which now becomes the foremost application, can now receive subsequent user interactions, but the activating click becomes lost. In some special cases, applications may designate that the operating system should resubmit the click to the application once the application has become foremost. However, far more often, the operating system will discard such activating clicks so as not to inadvertently trigger an accidental action. The discarding of activating clicks remains the usual case, and will be discussed hereinafter with regard to several examples wherein large portions of an application's user interface appear hidden while the application remains inactive.
  • the activating click merely constitutes a signal from the user intended to activate the application and not initiate any other action. Indeed, the location of the click might correspond to a portion of the screen that would become covered by sub-menus or control panels once such items become restored by the application as that application becomes foremost. In such cases, saving and recalling an active application corresponding to each of the plurality of monitors 110 and 120 can prove useful, as discussed hereinafter.
  • This status can include which window or windows belonging to the application carry special designations. For example, in Mac OS X, an application can have a main window (the application's foremost document). In certain situations, e.g., when saving a document to a file, the application can have a distinct key window, that is, an application window that will receive keystrokes from a keyboard whether or not the cursor lies over that window.
  • the main and key window could comprise the same window.
  • the main window and key window of an application running under OS X, whether they represent two distinct windows or represent a single window carrying both designations, provide an illustrative example of the value of the user control state processor 180 querying the operating system module 190 about the current application and determining which windows carry which designations.
  • the computer 130 can restore these settings as described further below. Other settings corresponding to an active/foremost application may warrant saving and recalling, for example when the application currently operates in "full screen" mode (if supported by the operating system).
  • FIG. 2 depicts, in flow chart form, the steps of a first embodiment of a cursor management process 200 performed by the user control state processor module 180 of FIG. 1.
  • the cursor management initialization process 200 begins upon execution of step 201 once the operating system module 190, already running, receives a user input from the mouse 140 through the cursor control interface 142 for receipt by the cursor control state module 143.
  • the operating system module 190 will receive a stream of images from the camera 150 via the camera interface 152 for processing by the facing detector module 153.
  • Upon execution of step 202, the facing detector module 153 detects the current user facing, that is, with which monitor the user is currently or most recently engaged. The facing detector module 153 supplies that information to the user control state processor 180, which then records the current user facing in a memory (not shown, but represented in FIG. 2 by a register named "Current_Facing"). The current user facing must correspond to one and only one of the monitors 110 and 120. (In other words, the user only focuses on one monitor at a time.) During step 203, the user control state processor module 180 obtains information to determine on which one of the monitors 110 and 120 the cursor currently resides.
  • the user control state processor module 180 then saves this information in a memory (not shown, but represented in FIG. 2 by a register named "Current_Cursor").
  • the user control state processor module 180 clears any saved cursor positions 210 stored in the control state store 181. (These positions are distinct from the "Current_Cursor" register and correspond to specific monitors.)
  • the process described with respect to FIG. 2 determines and records the cursor position
  • the computer 130 of FIG. 1 could also obtain and save the application and window status (e.g., including designations such as main window, key window, foremost window, active window, and full screen mode) during step 203 for later use.
  • a portion of the control state store 181, like the saved cursor positions memory 210, can retain this application and window-related information.
  • the multi-screen management technique of the present principles can operate with just the cursor position information and still offer significant advantages.
  • the cursor management process 220, which comprises a loop, begins during step 221, whereupon the initialization process 200 ends. Following step 221, step 222 undergoes execution, at which time the computer 130 of FIG. 1 copies the contents of the memory, represented by the registers named Current_Facing and Current_Cursor, to another memory location (not shown) represented by the registers named Last_Facing and Last_Cursor, respectively.
  • This copying operation enables detection of the user facing or cursor location transitions from one of the monitors 110 and 120 to the other.
  • the computer 130 determines whether the values in the registers named Last_Facing and Last_Cursor correspond to the same, single monitor. If so, the computer 130 sets a single-bit flag, represented in FIG. 2 by the register named Same_Screen, to a "true" condition during step 224. Otherwise, the computer 130 will set the flag to a "false" condition during step 225. Regardless of the condition indicated by the Same_Screen flag, during step 226, the process 220 awaits a signal to indicate the current user facing as detected by the facing detector module 153 of FIG. 1.
  • Step 226 performs the same operation as during step 202, setting the register Current_Facing appropriately.
  • Step 227 performs the same function as during step 203, setting the register Current_Cursor appropriately.
  • If the computer 130 determined and stored the application and window status during step 203, then the computer likewise performs these functions during step 227 too.
  • During step 228, the computer 130 checks whether the user facing has changed. If the Last_Facing and Current_Facing values correspond to the same one of the monitors 110 and 120, then no facing change has occurred and the process 220 restarts the loop at step 222.
  • If, during step 228, the registers Last_Facing and Current_Facing show that the user now faces a different screen, then process 220 proceeds to step 229. During step 229, the computer 130 determines whether the cursor has moved from one monitor to the other. If the Last_Cursor corresponds to the same monitor as the Current_Cursor, then the user has left the cursor on the monitor where it had appeared previously. Under such circumstances, the process 220 proceeds to step 230.
  • During step 230, the computer 130 tests the Same_Screen flag. A true condition for the Same_Screen flag indicates that a moment ago, the user faced the same monitor on which the cursor appeared, but now faces elsewhere, presumably a different monitor, and that the user has turned and left the cursor behind. In response to a true condition for the Same_Screen flag, the process 220 proceeds to step 231.
  • the computer 130 obtains the current position of the cursor from either the operating system module 190 or the cursor control state module 143 (both of FIG. 1), which one depending on the specific implementations of the computer 130, the cursor control interface 142, the device drivers (e.g., cursor control state module 143), and the details of the operating system.
  • the cursor position can exist relative to the corresponding monitor, in other words, as a position local to that monitor.
  • the cursor position can exist as a value in a global coordinate system that encompasses the images 112 and 122 on the monitors 110 and 120, respectively.
  • the form of the cursor position depends on the operating system and other elements of the implementation.
  • The cursor position, following the determination during step 229, is saved in a cursor positions memory 210 in correspondence with the monitor on which the cursor appears, which in FIG. 2 is called Screen[Current_Cursor].
  • Screen[Current_Cursor] remains identical to Current_Cursor when Current_Cursor comprises only a monitor identifier, but if Current_Cursor includes information other than an identifier corresponding to a single one of monitors 110 and 120, then the operation Screen[x] returns such an identifier.
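The Screen[x] operation can be sketched as follows, assuming the more elaborate case in which the cursor value carries a global (x, y) coordinate rather than a bare monitor identifier. The monitor geometry below is hypothetical (two 1920x1080 displays side by side) and does not come from the patent's figures.

```python
# Sketch of the Screen[x] operation: reducing a cursor value to the identifier
# of the monitor containing it. Geometry and names are illustrative only.

MONITORS = {
    "monitor_110": (0, 0, 1920, 1080),      # x, y, width, height in global coords
    "monitor_120": (1920, 0, 1920, 1080),
}

def screen(cursor):
    """Return the monitor identifier for a cursor value.

    If the cursor is already just an identifier, return it unchanged
    (Screen[x] remains identical to x); if it is a global (x, y)
    position, find the containing monitor.
    """
    if isinstance(cursor, str):
        return cursor
    x, y = cursor
    for name, (mx, my, w, h) in MONITORS.items():
        if mx <= x < mx + w and my <= y < my + h:
            return name
    return None   # cursor outside every known monitor
```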
  • the computer 130 makes a determination whether there exists a saved cursor position for the monitor the user now faces. (Note that during step 228, the computer 130 of FIG. 1 determined that the facing of the user had changed because Current_Facing and Last_Facing now have different values.) If the user now faces a monitor represented by the operator Screen[Current_Facing] for which the computer 130 previously stored the cursor position in memory 210, then during step 233, the computer 130 will recall this cursor position. The computer 130 will impose this cursor position on the GUI by causing the user control state processor module 180 to direct either the cursor control state module 143 or the operating system module 190 to make the change. Generally speaking, the computer 130 will direct the same module (i.e., the cursor control state module or the operating system module) to move the cursor as was previously used to obtain the cursor position during step 231.
  • Following step 233, process 220 loops back to step 222, but now the operating system module 190 will react to the newly recalled cursor position. In some embodiments, during step 233, the computer 130 can momentarily embellish the cursor, e.g., by drawing the cursor at the newly recalled position in a large size or drawing the cursor with a color sure to attract attention, and then quickly shrinking the cursor to normal size, or fading to a normal color, to make the jump to the new position more clearly visible.
  • If, during step 229, the computer 130 has determined that the cursor has changed monitors, that is, the value of Last_Cursor no longer corresponds to the same monitor as Current_Cursor, then a race condition exists: the user has looked to a different monitor at substantially the same time the user manually moved the cursor to the different monitor. In this situation, no valid current cursor position exists for saving during step 231, so process 220 skips to step 232.
  • If, during step 230, the computer 130 determines that the cursor and facing corresponded to two different monitors (i.e., the user has not looked away from the cursor), then the current cursor position does not warrant saving, and again process 220 skips to step 232.
  • The determinations made during steps 229 and 230 that lead to step 232 while skipping step 231 allow for situations wherein the computer 130 can recall a cursor position even when there is no cursor position suitable for saving. Examples include the user turning from a first monitor toward a second monitor at the same time as the cursor transitions from the first monitor to the second monitor, in which case the cursor will move under manual control to the second monitor, but immediately jump to a previously saved position on that second monitor. This circumstance offers the advantage of allowing a user to become familiar with the experience afforded by the process 220, but still allows the user to exercise his or her old reflexes to move the cursor manually with or ahead of the user's gaze.
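Steps 222 through 233 of the loop 220 can be sketched as a single function per iteration. This is an interpretive sketch, not the patent's flow chart: the register names are carried over from FIG. 2, while the function signature, the dictionary standing in for the saved cursor positions memory 210, and the monitor identifiers are illustrative assumptions.

```python
# One iteration of the cursor-management loop 220, sketched in Python. The
# facing and cursor arguments stand in for readings from the facing detector
# module 153 and the cursor control state module 143.

def step_220(state, saved_positions, facing, cursor_monitor, cursor_pos):
    """Run one pass of loop 220; return a recalled position, or None."""
    # Step 222: copy Current_Facing / Current_Cursor into Last_Facing / Last_Cursor.
    state["last_facing"] = state.get("current_facing")
    state["last_cursor"] = state.get("current_cursor")
    # Steps 223-225: Same_Screen is true when facing and cursor shared one monitor.
    state["same_screen"] = state["last_facing"] == state["last_cursor"]
    # Steps 226-227: latch the newly detected facing and cursor monitor.
    state["current_facing"] = facing
    state["current_cursor"] = cursor_monitor

    recalled = None
    if state["current_facing"] != state["last_facing"]:        # step 228: facing changed
        if (state["current_cursor"] == state["last_cursor"]    # step 229: cursor stayed put
                and state["same_screen"]):                     # step 230: user turned away from it
            # Step 231: save the abandoned position for the cursor's monitor.
            saved_positions[state["last_cursor"]] = cursor_pos
        # Steps 232-233: recall any saved position for the newly faced monitor.
        recalled = saved_positions.get(state["current_facing"])
    return recalled
```

Two successive calls reproduce the scenario of FIGS. 3 through 12: when the user turns from the first monitor to the second, the abandoned position is saved; when the user turns back, that position is recalled and reimposed.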
  • the loop 220 needs to repeat no more often than once each time the facing detector module 153 updates the facing data (i.e., processes a new camera image from camera interface 152 to determine the facing of the user).
  • the overall process still provides substantial benefits even if the process loop 220 only undergoes execution when the facing detection signal from the facing detector module 153 changes.
  • Although FIG. 2 depicts the process 220 as a continuous loop, the process can spend a substantial amount of time idling during step 226, waiting for the results of the next facing detection from the facing detector module 153, or waiting for a change in the state of the facing detection signal the facing detector module 153 produces. Of the two, the former remains preferable, since the balance of process 220 imposes little computational burden.
  • those steps that result in collection of the cursor position can also collect and store the application and window status.
  • the computer 130 recalls and sets the cursor position, the computer can reassert the application and window status too, such as during step 233.
  • determining the monitor on which the cursor appears may require obtaining the current cursor position, and then applying a subsequent process (e.g., Screen[Current_Cursor]).
  • Embodiments that save this more elaborate value might obtain the current cursor position during step 232 by applying an appropriate operator to the value in the Current_Cursor register.
  • FIGS. 3-12 collectively illustrate a user 310 interacting with the workstation 100 while the user control state processor module 180 performs the cursor management process 220 of FIG. 2.
  • the user 310 (designated here as 310L) faces the left monitor 110, and thus substantially faces the camera 150. The facing detection module 153 will detect this facing and indicate that the user faces monitor 110.
  • the user 310 makes use of the computer workstation 100 to create an educational slide, using an image editing application 191 (shown in FIG. 1 ) on the left monitor 1 10 and a spreadsheet application 192 (also illustrated in FIG. 1 ) on the right monitor 120.
  • the main window 314 for the image editing application 191 appears as currently active.
  • the computer 130 will configure the menu bar 313 for that application, and corresponding auxiliary menu panels 315 now appear. In the screen image 322 displayed on the monitor 120, the main window 324 for the spreadsheet application 192 appears as inactive, represented in these figures by the top of the window 324 being dotted.
  • the user 310 uses his or her hand 311 to move the mouse 140 to direct the cursor 330 to a position in the main image editing window 314 where the user wants to insert some text. To accomplish text insertion, the user 310 will select the text insertion tool from one of the auxiliary menu panels 315, with the result that the cursor 330 now appears as an I-beam.
  • the user 310 can now make use of a keyboard (not shown) connected to the computer 130 to type the first portion of the slide's heading, "Since e and pi are irrational numbers, decimal representations of them, like", with the current text insertion point 331 being distinct from the position of cursor 330.
  • the keyboard keystrokes correspond to the first application 191 because its window 314 became designated as the key window, e.g., when the user entered a mouse click at cursor 330 to select the initial text insertion point.
  • the facing of the user has changed (as compared to FIG. 3) since the user now faces to the right, hence use of the designation 310R.
  • the camera 150 now provides an image of user 310, who no longer looks at the monitor 110, but now looks elsewhere.
  • the facing detector module 153 will indicate that the user no longer looks at the monitor 110.
  • the computer 130 will first detect this facing change during step 226 and will now consider the user as facing the second monitor 120. While the interpretation of "not facing monitor 110" as "facing monitor 120" constitutes an over-generalization, the interpretation remains workable.
  • An alternate embodiment, discussed hereinafter, makes use of a more sophisticated version of facing detector module 153 to provide results with greater precision.
  • the cursor still remains at the position 330 and the image editing application 1 1 still remains active (as shown in FIG. 3).
  • the value of the Current. . Cursor still corresponds to the first monitor i 10.
  • the computer ⁇ 30 detects the change in the user's facing because the value of Last gripFacing corresponds to the first monitor 1 10, but the value of CurrentJFacing corresponds to the second monitor 120, and the process 220 proceeds to the step 229 of FIG. 2.
  • step 229 the value Last_Cursor corresponds to the same monitor as the value of the Current._Cursor, i.e., first monitor 1 10, and the process proceeds to step 230, wherein the SarneJScreen flag is recalled as set to the "true" condition.
  • step 231 the computer 130 saves the position of cursor 330 in correspondence with the first monitor 1 10 in the saved cursor position memory 210.
  • step 232 there does not yet exist a saved cursor position corresponding to the second monitor 120, so process 220 branches back to the top of the loop, leaving the cursor 330 at its current position and the current application 191 (the image editor) remains active.
  • the values of the last and current facing will correspond to the monitor 120 and the last and current cursor will correspond to the first monitor 1 10.
  • the value of Last_Cursor will corresponds to the first monitor, which differs from the value of LasL Facing, so during step 225, the computer 130 will clear the Samejscreen flag (i.e., set the flag to a "false" condition.)
  • the user 310 seeks to copy some values from the spreadsheet application 192 and paste them into the text in fee slide image being edited with application 1 1. Since the cursor 330 does not currently lie in proximity to the spreadsheet window 424, the user 310 will begin making a gesture 440 with the mouse 140, causing the cursor 330 to traverse the screen image 412 along path 441 until the cursor reaches the edge of that screen image, and then continue onto the screen image 422.
  • the gesture 440 consists of several generally left-to-right strokes with the mouse pressed down (shown as solid lines), interspersed with right-to-left movements with the mouse lifted (shown as dotted lines), and terminating with a mouse click, depicted by the circle at the end of the gesture.
  • the values of Current_Facing and Last_Facing correspond to the same monitor 120.
  • the computer 130 will set the Same_Screen flag to a "true" condition, which then remains set for subsequent iterations. Movement of the cursor continues along the path 442 as the gesture 440 completes, so the cursor reaches a final position 443, with the cursor now appearing in the shape of an arrow.
  • Many modern applications manipulate the cursor representation.
  • the arrow shape is the most common representation, but in the image editing application 191, with the text insertion tool selected, the cursor appears I-beam shaped while the cursor position remains within the active window 314. The cursor shape would typically revert to the arrow shape as soon as the cursor leaves the active window 314 (from FIG. 3) along path 441.
  • When the user clicks the mouse at the end of gesture 440, with the position of the cursor 443 as shown, the operating system module 190 notifies the application 192 of the event, because application 192 is the "owner" of the window 424, resulting in the image editing application 191 becoming inactive and the spreadsheet application 192 becoming active. As a result, application 191 hides the auxiliary menus 315 and the active window 314 (in FIG. 3) becomes the inactive window 414 (in FIG. 4). Immediately thereafter, the spreadsheet application 192 activates the window 424 and repopulates menu bar 313 (of FIG. 3) with its own menu items, producing menu bar 413.
  • the spreadsheet cell immediately under the cursor may or may not become selected.
  • the user must further click the mouse to select the cell as shown in FIG. 5, which depicts the user as still facing the second monitor 120.
  • the cursor 543 now appears cross-shaped, and the spreadsheet cell B3 containing a decimal representation of e becomes selected.
  • the main window 514 for the image editing application 191 remains inactive in screen image 512, while the main window 524 for the spreadsheet application 192 remains active.
  • user 310R can use the keyboard (not shown) to issue a copy command, which operating system module 190 dispatches to the spreadsheet application 192, thus resulting in copying the decimal representation of e into a clipboard memory (well known, not shown).
  • the Same_Screen flag becomes set to a "true" condition (at step 224).
  • FIG. 6 shows the next instant in this example.
  • the user 310L has turned to face the first monitor 110.
  • the change in the user's position, as depicted from FIG. 5 to the new position depicted in FIG. 6, will now trigger the following actions.
  • the next iteration of the process 220 will result in detecting that the user now faces toward the camera 150, corresponding to the first monitor 110, during step 226. Thereafter, no change will occur in the position of cursor 543 (in FIG. 5) during step 227, as there is no gesture with mouse 140.
  • the values of Last_Facing and Current_Facing will correspond to different screens, causing the process 220 to branch toward step 229.
  • during step 229, the computer 130 will determine that the cursor has not changed screens, so the process 220 will branch toward step 230.
  • the computer 130 detects that the Same_Screen flag is in the "true" condition (from step 224). Thereafter, during step 231, the computer 130 will save the current position of cursor 543 (from FIG. 5), in correspondence with the second monitor 120, in the saved cursor positions memory 210.
  • the user 310 will experience the benefits of the multi-screen management technique of the present principles and will be relieved from having to make an elaborate gesture (such as gesture 440) in order to proceed with his work.
  • the computer 130 determines that there is a saved cursor position that corresponds to the first monitor 110, leading the computer in step 233 to recall that position from the saved cursor positions memory 210 and then return the cursor to that position, resulting in the cursor moving to the position of cursor 330 in FIG. 6.
  • the cursor may take on a default arrow shape, since the earlier cross-shape would not apply when outside the bounds of the active spreadsheet window 524 depicted in FIG. 5.
  • the cursor would quickly change to I-beam shape as the spreadsheet application 192 loses focus and the main window 624 becomes inactive in screen image 622, while in screen image 612, image editing application 191 gains focus.
  • the submenu panels 315 return along with the menu bar 313.
  • the text insertion point 331 once again appears and the window 614 now becomes the key window (i.e., the window that now becomes the target of keyboard input). Note that this time, the user does not need to make use of any complex, time-consuming, and fatigue-inducing gestures like the gesture 440:
  • the computer 130 has now automatically returned the cursor to its last location on the screen of the first monitor 110. In some instances, depending upon the precise implementation of the operating system, the user may need to enter a mouse click following the automatic cursor movement.
  • Some embodiments, discussed further below, can obviate the need for this additional mouse click by noting and reestablishing the status of the applications and windows as the computer 130 saves and restores the cursor position.
  • the presumption exists that no additional mouse click remains necessary.
  • second screen image 722 remains the same with the spreadsheet window 724 inactive.
  • the active image editing window 714 accepts a paste command via the keyboard (not shown), resulting in the value previously copied to the clipboard memory now being inserted into the slide image at the text insertion point, after which the user types "and", before leaving the text insertion point at 333 (in FIG. 7).
  • the computer 130 again relieves user 310 from having to make an elaborate gesture, like gesture 440, as the user 310R faces right, toward monitor 120.
  • the facing detector module 153 of FIG. 1 will detect the change in the user facing, whereupon during the next iteration of process 220, the computer 130 will record the position of cursor 330 (as depicted in FIG. 7) in conjunction with the first monitor 110 and restore the position of cursor 543 on the second monitor 120.
  • the computer 130 will deactivate the image editing application 191 and activate the spreadsheet application 192.
  • the main window 814 in the screen image 812 becomes inactive.
  • the displayed menu 413 will now correspond to the spreadsheet application, and the main window 824 in the screen image 822 now becomes active, with the cursor 543 now appearing cross-shaped.
  • in FIG. 9, the user will make a tiny mouse gesture 940, resulting in the cursor 543 moving in the screen image 922 along the path 942 to become cursor 943.
  • the mouse click at the end of gesture 940 selects cell "B2", which contains a value for pi, and the user issues a copy command with the keyboard. Since the spreadsheet window 924 has now become the active key window, the command copies the value for pi to the clipboard memory.
  • the inactive window 914 remains substantially unchanged, though the "Edit" item in menu bar 413 may flash briefly (not shown) in response to the copy command.
  • FIG. 10 depicts the circumstance when the user 310L turns back to the first monitor 110.
  • the facing detector module 153 of FIG. 1 will detect the change in the facing of the user and alert the computer 130 of FIG. 1 accordingly.
  • the computer 130 will save the position of cursor 943 in conjunction with the monitor 120, replacing the previous position of cursor 543 recorded during the transition from FIG. 5 to FIG. 6. Then, the computer 130 will restore the position of cursor 330 (which, in this example, has not changed) to the monitor 110.
  • the spreadsheet application 192 has lost the user focus and the image editing application 191 has now gained the user focus.
  • the spreadsheet window 1024 becomes inactive so the cursor 943 (from FIG.
  • the window 1114 is again the key window (i.e., the target of keyboard commands) and the user pastes the value for pi with a keyboard command along with the text message, "are only approximations."
  • the "Edit” item in the menu bar 13 may briefly flash in response to the paste command.
  • the screen image 1122 and the inactive window 1124 remain unchanged.
  • FIG. 12 depicts the circumstance when the user 310R again turns toward second monitor 120.
  • the screen image 1222 shows the restoration of the cursor 943 to its previous position (illustrating that the earlier cursor 543 is no longer recalled).
  • the spreadsheet application 192 and its main window 1224 now become active and the screen image 1212 shows the image editing application 191 and its main window 1214 as inactive, while the menu bar 413 now responds to the spreadsheet application 192.
  • One possible variation within the scope of the invention relates to the cursor position stored during step 231 being selected from a recent history of cursor positions (extending back, for example, two seconds).
  • the computer 130 could make an entry in the cursor storage position memory every time the user moves the mouse, which may require a time stamp for each entry.
  • the computer 130 of FIG. 1 could record the mouse position at regular intervals, either by making use of an independent clock, or based on each pass through the loop (e.g., during step 227). As described previously with respect to FIG. 4, the user 310R, using caution, did not move cursor 330 until after he or she had looked toward monitor 120.
  • the computer 130 can take the cursor position from a predetermined interval back in history, e.g., 500 ms to one second earlier. Still another alternative would be for the computer 130 to take the last position of the cursor that had changed less than a predetermined distance over a predetermined amount of time, e.g., wherever the cursor had remained relatively still (having not moved more than an inch on the screen, or 72 pixels) over the course of one second.
  • Empirical testing can yield a refined choice of these values for the particular monitors and specific cursor control device 140 (whether or not a mouse) that exist within the computer workstation 100 of FIG. 1. This approach allows the computer 130 to ignore those moments during the gesture 440 when the user briefly lifts the mouse (the dotted line portions) during which the cursor does not move.
  • the computer 130 can use a cursor position selected from the history to determine the appropriate value for Current_Cursor during step 227, in which case the value of Current_Cursor will lag behind the actual current position of the cursor.
  • that lag can vary depending on the approach used to select the historical cursor position. For example, if the cursor sits in a particular spot for a particular interval, the computer 130 would select that position from history, and the lag remains effectively zero, but as the user moves the cursor away from that location while the computer 130 still selects that position from the history, the lag grows, potentially up to the maximum size of the history.
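The dwell heuristic described above (take the last position where the cursor stayed within roughly 72 pixels for roughly one second) can be sketched as follows. This is only an illustration of one possible implementation, not the patent's: the class and method names are invented, and the thresholds are the example values from the text.

```python
import time


class CursorHistory:
    """Keeps a short, time-stamped history of cursor positions and selects
    the most recent position where the cursor dwelled (stayed within a
    threshold distance over a threshold time)."""

    def __init__(self, max_age=2.0):
        self.max_age = max_age  # seconds of history retained
        self.samples = []       # list of (timestamp, (x, y)) tuples

    def record(self, pos, now=None):
        now = time.time() if now is None else now
        self.samples.append((now, pos))
        # prune entries older than max_age
        self.samples = [(t, p) for (t, p) in self.samples
                        if now - t <= self.max_age]

    def dwell_position(self, max_move=72, dwell_time=1.0, now=None):
        """Return the newest position whose preceding dwell_time window of
        samples all lie within max_move pixels of it; fall back to the
        newest sample, or None if the history is empty."""
        if not self.samples:
            return None
        for t, p in reversed(self.samples):
            window = [q for (u, q) in self.samples
                      if t - dwell_time <= u <= t]
            if window and all(abs(q[0] - p[0]) <= max_move and
                              abs(q[1] - p[1]) <= max_move
                              for q in window):
                return p
        return self.samples[-1][1]
```

Feeding `record()` on each pass through the loop (e.g., during step 227) and reading `dwell_position()` when a facing change occurs would ignore a brief final flick of the mouse, at the cost of the lag discussed above.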
  • the workstation 100 relied on the facing signal to differentiate between the user substantially facing the camera 150 versus not facing the camera.
  • the facing detector module 153 could have greater capability to provide more precise information.
  • the facing detector module 153 could have the ability to indicate direction, e.g., the user substantially facing the camera 150 versus facing
  • the facing detector module 153 could estimate head pose, resolved in angles of pitch and/or yaw. Imbuing the facing detector module 153 with even greater sophistication would allow it to identify and track the user's gaze, which would allow the facing detector to determine not only which screen the user faces but also the particular region of the screen observed by the user.
  • the workstation 100 of FIG. 1 could employ an eye-tracking sensor, which may comprise one or more cameras as well as one or more infrared emitters (not shown) to better discern the user's eyes and the direction they face.
  • the user could wear an eye-tracking sensor on his or her head.
  • Tobii Technology AB of Sweden markets eye-tracking sensors of the type described above.
  • Tobii IS20 and IS30 eye-tracking sensors can attach to monitors 110 and 120 and connect to the computer 130, while Tobii Glasses illustrate a head-worn mobile eye tracker (likewise requiring an interface to computer 130 to replace camera 150 and camera interface module 152).
  • the facing detection module 153 has the capability of identifying an artifact worn by user 310, for example something with a predetermined geometry such as a particular pair of glasses.
  • the facing detection module 153 can more easily detect the user's facing.
  • a pair of glasses frames (no lens required) can bear retroreflective markers (not shown).
  • the camera 150 could have a sensitivity to infrared (IR) light, and be co-located with one or more IR emitters (not shown).
  • the retroreflective markers on the glasses frames would return light from the IR emitters and the apparent relative positions of the retroreflective markers can serve to determine the user's facing, a process well known in the art which has the ability to produce a good, repeatable result for any user.
  • Wearing a special, machine-recognizable artifact, such as special glasses for example, for easy detection and recognition by the workstation 100 can afford a benefit when the faces of other individuals besides the user appear in the field of view of camera 150. For example, consider the situation (not shown) when a teacher looks over the shoulder of a student. Under such circumstances, the teacher's face should not serve as the controlling object. In one configuration, the workstation could ignore individuals wearing the artifact, or in an alternate configuration could ignore everyone except those wearing the artifact.
  • some image-based face detector modules can provide not only the location of the user's face (typically lying in a region often described as a bounding box), but also the locations of specific facial features such as the eyes and mouth, where the ratios formed from those locations (eye-spacing in comparison to eye-mouth spacing) can provide a finer measure of the user's head orientation.
  • the user can train the facing detector module 153. For example, when a user makes cursor movements, e.g., clicking a button or selecting a menu item, the user will look at the cursor very precisely. Under such circumstances, the computer 130 can take account of information from the facing detector module 153, including the extracted facial features, in correspondence with the region in which the cursor action occurs. The computer 130 can presume that similar configurations of information subsequently obtained from the face detection, and the facial features extracted from such face detection, relate to the same general region and, importantly, to the monitor corresponding to that region.
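The self-training idea in this passage, associating face measurements captured at click time with the monitor where the click occurred, can be sketched as a nearest-neighbor classifier. All names here are invented for illustration, and the two-element feature vector merely stands in for the eye-spacing and eye-mouth-spacing ratios mentioned above.

```python
class FacingTrainer:
    """Sketch: learn which monitor a face measurement implies by
    remembering measurements taken at the moment of each mouse click,
    then classifying new measurements by the nearest stored example."""

    def __init__(self):
        self.examples = []  # list of (feature_vector, monitor_id)

    def record_click(self, features, monitor_id):
        # The user is presumed to look precisely at the cursor when clicking,
        # so this face measurement is labeled with the click's monitor.
        self.examples.append((tuple(features), monitor_id))

    def classify(self, features):
        # Return the monitor of the nearest recorded example, or None
        # if no training clicks have occurred yet.
        if not self.examples:
            return None

        def sq_dist(a, b):
            return sum((x - y) ** 2 for x, y in zip(a, b))

        return min(self.examples,
                   key=lambda e: sq_dist(e[0], features))[1]
```

A production facing detector would use a more robust model, but this captures the presumption stated above: similar feature configurations map to the same monitor.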
  • the two-monitor, one-camera configuration of workstation 100 need only provide a binary indication of whether or not the user looks at the first monitor 110.
  • a more sophisticated embodiment of the workstation 100 of FIG. 1 could include a facing detector module 153 having the capability of recognizing when the user looks at the first monitor 110, or in the direction of the second monitor 120, or in some other direction (i.e., above or below monitor 110, or to the left side of monitor 110, or away from monitor 120).
  • the computer 130 could execute the process 220 to pause during step 226 until detecting the user as likely facing one monitor or the other.
  • the computer 130, when executing step 226, could forego changing the value of Current_Facing unless the facing detection module 153 detects the user as facing toward one of the monitors 110 and 120.
  • FIG. 13, where the workstation 1300 depicted in that figure has two cameras 150 and 1315 mounted on monitors 110 and 1320, respectively, with the two cameras having connections 151 and 1352, respectively, to a multi-camera interface 1353.
  • the monitors 110 and 1320 have corresponding connections 111 and 1321, respectively, to the monitor module 160.
  • a facing detector module 1354 analyzes images from the two cameras 150, 1315 to determine which of the two cameras (and by extension, which of the two corresponding monitors) a user likely faces.
  • a further extension of the workstation 1300 could include multiple cameras and a facing detection module having even greater sophistication. With such a workstation, a user could face any one of a bank of monitors, several monitors wide and several monitors high (not shown). The facing detection module of such a workstation could manage determination of the user's facing by detecting whether a user faces a higher or lower rank of monitors, and whether the user faces a center, right or left monitor column.
  • FIG. 14 shows a state transition diagram 1400 indicating the manner in which the computer 130 determines when to save or recall cursor positions associated with a particular monitor, S[n], for a workstation having M monitors, 1 ≤ n ≤ M.
  • the workstation may have one or more sensors to detect facing, e.g., camera 150 and/or camera 1315.
  • the state transition diagram 1400 depicts eight states 1411-1414 and 1421-1424, with a "current" state that corresponds to exactly one of these states at a given time.
  • An independent "current" state can correspond to each of the monitors 1...M, with the state transition diagram effectively being replicated for each for each of the M current states.
  • When the workstation is first in use, the computer 130 will initialize the "current" state to one of the four bold-circle states 1411-1414, forming meta-state 1410, which constitutes a state where there exists no previous cursor position stored for the corresponding monitor S[n].
  • the four different states correspond to the possible combinations in which the current facing and current cursor position do or do not correspond to the particular monitor S[n]. As one, or the other, or both of the current facing and current cursor position changes, a transition between states occurs, according to the labeled transitions.
  • a second meta-state 1420 consists of the four non-bold-circle states 1421-1424.
  • the top two circles represent the situation where the cursor appears on the monitor S[n]
  • the bottom two circles represent the situation where the cursor appears on another monitor.
  • the left two circles represent the situation where the user faces monitor S[n]
  • the right two circles represent the situation where the user faces another monitor (or in some embodiments, as previously discussed, merely facing somewhere other than the screen of the monitor S[n]).
  • a transition from one of the top circles to one of the bottom circles occurs whenever the cursor appeared on the screen of the monitor S[n], but moves to the screen of a different monitor.
  • a transition from one of the bottom circles to one of the top circles occurs whenever the cursor moves to the screen of the monitor S[n] from the screen of any other monitor. In the state diagram corresponding to monitor S[n], no transition occurs merely because the cursor moves from some monitor different from monitor S[n] to some other monitor different from the monitor S[n].
  • a transition occurs between meta-states 1410 and 1420, and more specifically from the state 1411 to the state 1423.
  • the computer 130 saves the current cursor position in correspondence with the monitor S[n] (or, as described previously, a cursor position selected from a recent history of cursor positions).
  • This transition between meta-states occurs in a single direction: once a saved cursor position exists for the monitor S[n], all subsequent states lie within the meta-state 1420.
  • a similar transition and action occurs between states 1421 and 1423, but since a cursor position already exists for the monitor S[n], the computer 130 merely updates that position to the current cursor position (or, as above, a recent cursor position).
  • While in meta-state 1420, that is, while a cursor position is saved in correspondence with monitor S[n], any change that results in the user's facing becoming associated with the monitor S[n], where the facing did not already correspond to the monitor S[n], produces the collateral action of the cursor position associated with monitor S[n] being recalled.
  • the meta-state 1420 now transitions to the state 1421. In particular, regardless of whether or not the cursor appeared on the monitor S[n], if the user turns to re-face the monitor S[n] and the stored cursor position corresponds to the monitor S[n], then the computer 130 will recall and then apply that cursor position. As a result, the cursor returns to the monitor S[n], in the previously saved position. Even if the cursor already appeared on monitor S[n], application of the recalled cursor position will cause the cursor to jump to the saved position.
  • the component states 1411-1414 of meta-state 1410 have similarities to the corresponding component states 1421-1424 of meta-state 1420, which for the most part, leads to the same transitions connecting them, with the following four exceptions:
  • Such a transition represents the first transition that causes saving of the cursor position for that particular monitor (S[n]) and constitutes the only transition between meta-states 1410 and 1420.
  • the facing changes may reflect that the user now faces toward a different monitor, i.e., S[j] where 1 ≤ j ≤ M and j ≠ n. In another embodiment,
  • “the facing changes” may mean that the user no longer faces toward the monitor S[n], in which case, turning away from the monitor altogether counts as a transition.
  • This transition from state 1411 to state 1423 bears the designation as being triggered by a "facing change" but also identifies a particular action the transition triggers, namely, that the computer 130 now stores the current cursor position as corresponding to the monitor S[n].
  • The second exception is analogous to the 1411-to-1423 transition: the 1421-to-1423 transition likewise results from the user turning away from monitor S[n] while the cursor resides on S[n], except that here the cursor position already associated with monitor S[n] is merely updated.
  • the third exception, the transition from 1413 to 1412, and the fourth exception, the transition from 1414 to 1412, are exceptions for the same reason. Both relate to situations where the facing returns to monitor S[n] while the cursor is on or moves to a different monitor, but in meta-state 1420, whenever facing returns to monitor S[n], the transition is to state 1421, because when facing returns to monitor S[n] the saved cursor position is recalled and applied.
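The save-on-turn-away and recall-on-turn-back behavior that diagram 1400 specifies per monitor can be condensed into a small sketch. The names are hypothetical, and the eight explicit states are collapsed here into the presence or absence of a saved position combined with the two inputs (facing, cursor location); this is an illustration of the described behavior, not the patent's implementation.

```python
class MonitorState:
    """Tracks, for one monitor S[n], whether a cursor position has been
    saved, and saves or recalls it on facing transitions, roughly as in
    state transition diagram 1400."""

    def __init__(self):
        self.saved_pos = None  # None corresponds to meta-state 1410

    def on_facing_change(self, now_facing_this, cursor_on_this, cursor_pos):
        """Called when the user's facing changes. Returns a position to
        apply to the cursor (a recall), or None if nothing is recalled."""
        if not now_facing_this and cursor_on_this:
            # User turned away while the cursor was on this monitor:
            # save (or update) the cursor position (1411->1423, 1421->1423).
            self.saved_pos = cursor_pos
            return None
        if now_facing_this and self.saved_pos is not None:
            # User turned back toward this monitor and a position is saved:
            # recall it (transition into state 1421).
            return self.saved_pos
        return None
```

One such object per monitor, driven by the facing detector, reproduces the one-way move from meta-state 1410 to 1420: once `saved_pos` is set, it is only ever updated, never cleared.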
  • FIG. 15 shows a flowchart for a simplified multi-screen management process 1500 that saves and recalls cursor positions, beginning during step 1501 wherein the user already faces a first monitor on which the cursor resides (corresponding to either of states 1411 and 1421 of FIG. 14).
  • during step 1502 of FIG. 15, the facing detector module 153 (or a similar such device) determines that the user has turned away from the first monitor on which the cursor resides.
  • during step 1503, the computer 130 saves the first position of the cursor on the first monitor in a saved cursor position memory 1510.
  • Steps 1502 and 1503 together correspond to the transitions from either of states 1411 or 1421 to the state 1423.
  • during step 1504, the facing detector module 153 determines that the user again faces the first monitor, regardless of where the cursor appears.
  • during step 1505, the computer 130 recalls the first position of the cursor from the memory 1510 and applies that position so that the cursor returns to the first position on the first monitor.
  • Steps 1504 and 1505 together correspond to the transitions from either of states 1423 and 1424 to the state 1421.
  • Process 1500 concludes during step 1506, but the process typically gets repeated many times.
  • FIG. 16 shows a flowchart for a similar process 1600 that further tracks and restores the user interface status of the various applications, such as applications 191 and 192.
  • Steps 1601-1603 and the memory 1610 of FIG. 16 correspond to the steps 1501-1503 and memory 1510, respectively, of FIG. 15.
  • during step 1603, the computer 130 also stores, in the memory 1610, a first status of a first application having the focus of the GUI.
  • The first status may comprise any of: a window of the first application comprising the main window (e.g., the window containing the active document for the application), a window of the first application comprising the key window (i.e., the window of the first application designated to receive keyboard events), and whether or not the first application, or one of its windows, runs in a full-screen mode.
  • the status of the first application may also comprise the state of certain controls, for example, whether media plays or not.
  • the computer 130 saves the first status only if at least one of the active windows or the menu bar for the first application appears on the first monitor.
  • the facing detector module 153 of FIG. 1 determines that the user again faces the first monitor, as during step 1504.
  • during step 1606, the computer 130 recalls the first cursor position from the memory 1610 and applies that information, thereby restoring the cursor to the saved position, as during step 1505.
  • during step 1607, the computer recalls the first status of the first application and applies that information, thereby returning the first application to a similar status.
  • Process 1600 concludes during step 1608, but gets repeated many times.
  • Process 1600 has particular usefulness in conjunction with operating systems that do not typically dispatch mouse events (e.g., mouse move events) to a non-foremost application, unless the user clicks the mouse there.
  • the status saved during step 1603 may merely reflect the foremost application.
  • restoring the focus may merely require that the application become foremost again, which, in some embodiments, can occur by issuing a mouse-click event (as if the user had clicked the mouse 140).
  • the computer 130 can generate another signal that induces the application owning the window or control under the cursor's current position (restored during step 1606) to become active.
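Process 1600's pairing of a saved cursor position with a saved application status might be sketched as below. The `AppStatus` fields mirror the status items listed above (key window, full-screen mode); all names are invented for illustration and this is a simplification, not the patent's implementation.

```python
from dataclasses import dataclass


@dataclass
class AppStatus:
    """A snapshot of the focused application's UI state, per step 1603."""
    app_name: str
    key_window: str      # window designated to receive keyboard events
    full_screen: bool    # whether the app (or a window) runs full screen


class FocusManager:
    """Sketch of process 1600: when the user looks away from a monitor,
    save both the cursor position and the focused application's status;
    when the user looks back, recall both."""

    def __init__(self):
        self.saved = {}  # monitor id -> (cursor_pos, AppStatus)

    def user_turned_away(self, monitor, cursor_pos, status):
        # steps 1602-1603: save cursor position and application status
        self.saved[monitor] = (cursor_pos, status)

    def user_turned_back(self, monitor):
        # steps 1606-1607: recall both, or None if nothing was saved
        return self.saved.get(monitor)
```

Restoring the recalled `AppStatus` (e.g., by bringing the application foremost, possibly via a synthetic mouse-click event as described above) is what obviates the extra click mentioned earlier.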
  • FIG. 17 shows a flowchart for a simplified multi-screen interface management process 1700, beginning during step 1701, wherein the facing detector module 153 determines that the user has turned away from the first monitor at least partially presenting a first application having the focus of the GUI.
  • during step 1703, the computer 130 stores a first status corresponding to the first monitor in a status memory 1710.
  • during step 1704, the facing detector module determines that the user again faces the first monitor.
  • during step 1705, the computer 130 recalls the first status from the memory 1710 and applies that information to restore focus to the first application.
  • the process 1700 concludes during step 1706, but typically gets repeated many times.
  • The foregoing describes a technique for managing multiple screens of a computer workstation. While the technique has been described above in connection with switching between two different applications, based on the user's focus on a particular monitor, the screen management technique is equally applicable where the first and second application could represent different instances of the same program or even different instances of the same window.


Abstract

A method for controlling a computer workstation operating a plurality of monitors, each displaying at least a portion of a user interface for a corresponding computer application, commences by rendering active at least a first portion of a user interface for a first computer application displayed by a first one of the monitors upon detecting that the user observes that first monitor. In response to detecting that the user has switched from observing the first monitor, the first portion of the user interface for the first computer application is then rendered inactive. At least a second portion of a user interface is rendered active for a second computer application displayed by the second monitor.

Description

IMPROVED MULTI-SCREEN MANAGEMENT
TECHNICAL FIELD
This invention relates to a technique for managing multiple screens of a computer workstation.
BACKGROUND ART
Many desktop personal computers or "docked" laptop computers (hereinafter generically referred to as "computer work stations"), can support multiple monitors. Most computer workstations of this type include graphics circuitry, either as part of the processor itself, or a separate circuit board or "card," which has the ability to support two monitors. More advanced graphics circuitry can support three or more monitors.
Having multiple monitors allows the user of the computer workstation to have multiple computer applications or programs open at the same time, with each program displayed on a separate monitor. For example, using multiple monitors, a user running a single instance of a word processing program, for example Microsoft Word®, can have one or more different word processing documents open on each of the monitors. In this way, the user can easily interact with multiple documents without the document windows being too small, or having to pick which document will be the one currently viewable. Using a computer workstation with multiple monitors, a user can have different programs all open at the same time, with enough room for each to be clearly visible. For example, a user can have a word processing program open on at least one monitor and a spreadsheet program (e.g., Microsoft Excel®) open on at least one other monitor. In this way, a user can readily copy information from one document (e.g., a spreadsheet) and "paste" that information into another document (e.g., a report) displayed on another monitor.
When copying and pasting information from one document to another, the user will often spend most of his or her time moving the cursor from the copy point on one monitor to the pasting point on the other monitor and back again. The distance the user moves the cursor can constitute a substantial span, as much as two monitors or more if the workstation supports more than two monitors. Even just switching between applications could require that the user move the cursor to another screen to select the application of interest if the window(s) for such task appear on a separate monitor. This can impede user responsiveness for infrequent task changes (e.g., calling up a blank note whenever the phone rings), or reduce productivity if the user must frequently change between tasks positioned on different monitors.
Presently, most computer workstations employ a single monitor, so relatively little effort has been directed to addressing the problem of minimizing cursor movement with multiple monitors. Computer workstations with a single monitor readily support moving the cursor to various locations on the monitor responsive to user commands. Some workstations even offer "cursor acceleration," whereby a slow mouse or trackpad movement by the user results in a commensurate, small movement of the cursor (1x), but a faster mouse or trackpad movement by the user results in a disproportionately larger movement by the cursor (e.g., 5x) on the monitor. However, this technique does not necessarily address all the problems associated with the use of multiple monitors.
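As a concrete illustration of the cursor acceleration described here, a simple two-gain transfer function suffices. The gains and threshold below are illustrative values only (chosen to match the 1x/5x example above), not taken from any particular operating system.

```python
def accelerated_delta(raw_delta, slow_gain=1.0, fast_gain=5.0, threshold=10):
    """Map a raw mouse movement (counts per polling interval) to cursor
    pixels: slow movements map roughly 1:1, while fast movements are
    amplified (e.g., 5x), so sweeping across several monitors needs
    less physical mouse travel."""
    magnitude = abs(raw_delta)
    gain = slow_gain if magnitude < threshold else fast_gain
    return raw_delta * gain
```

Real pointer drivers typically use a smooth gain curve rather than a hard threshold, but the effect is the same: small movements stay precise while large movements cover more screen distance.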
Thus, a need exists for a technique for managing multiple monitors operated by a computer workstation that overcomes the aforementioned disadvantages.
BRIEF SUMMARY OF THE PRESENT PRINCIPLES
Briefly, in accordance with a preferred embodiment of the present principles, there is provided a method for controlling a computer workstation operating a plurality of monitors, each display device displaying at least a portion of the user interface for a corresponding computer application. The method commences by rendering active at least a first portion of a user interface for a first computer application displayed by a first one of the monitors upon detecting that the user observes that first monitor. In response to detecting that the user has switched from observing the first monitor, the first portion of the user interface for the first computer application is then rendered inactive. (Note that the step of observing can include gazing at or facing the first monitor.) At least a second portion of a user interface is rendered active for a second computer application displayed by the second monitor. The first and second applications could represent different programs, different instances of the same program, or even different instances of the same window.

BRIEF SUMMARY OF THE DRAWINGS
FIG. 1 depicts a block diagram of a first embodiment of a computer workstation for managing multiple monitors in accordance with the present principles;
FIG. 2 depicts in flow chart form the steps of a process executed by the computer workstation of FIG. 1 to manage the multiple monitors;
FIGS. 3-12 show consecutive images displayed by the monitors of the computer workstation of FIG. 1 in connection with the multiple screen management technique of the present principles;
FIG. 13 depicts a block diagram of a second embodiment of a computer workstation for managing multiple monitors in accordance with the present principles;
FIG. 14 depicts a state transition diagram that describes the behavior of the computer workstation of FIG. 13 with respect to each monitor;
FIG. 15 depicts in flow chart form the steps of a process for tracking and restoring the cursor position on each monitor of the workstation of FIG. 13;
FIG. 16 depicts in flow chart form the steps of a first exemplary process for tracking and restoring the cursor position and computer application on each monitor of the workstation of FIG. 13 based on the user focusing on that monitor; and,
FIG. 17 depicts in flow chart form the steps of a second exemplary process for tracking and restoring the computer application on each monitor of the workstation of FIG. 13 based on the user focusing on that monitor.
DETAILED DESCRIPTION

FIG. 1 depicts a block schematic diagram of a first embodiment of a computer workstation 100 for practicing the multi-screen management technique according to the present principles. The workstation 100 of FIG. 1 comprises a computer 130 running an operating system represented by the operating system module 190 and a cursor control state module 143 for controlling the multiple monitors, illustratively depicted by monitors 110 and 120, although the workstation 100 could include three or more monitors (not shown). Like most modern operating systems for personal and professional general purpose computers, the operating system embodied by the operating system module 190 provides a graphical user interface (GUI), most widely typified by "Mac OS X" provided by Apple, Inc. of Cupertino, CA and the "Windows" family of operating systems provided by Microsoft, Inc. of Redmond, WA, which support more than one monitor. As will become better understood hereinafter, the multi-screen management technique of the present principles can easily operate with virtually any operating system, provided such a system supports multiple monitors.
The computer 130 includes various well-known elements, such as a processor, a memory, and a power supply, all omitted from FIG. 1 for the sake of clarity. Other well-known elements within the computer 130 include a monitor module 160, a cursor control interface 142, and a camera interface 152, which warrant further discussion in connection with the multi-screen management technique of the present principles. The monitor module 160, operating in conjunction with the operating system module 190, enables one or more computer applications (computer programs), such as applications 191 and 192, to display information generated by each application to the user. Further, the monitor module 160 allows the operating system, embodied by the operating system module 190 associated with the computer 130, to support a Graphical User Interface (GUI) on the monitors 110 and 120 connected to the monitor module 160 by connections 111 and 121, respectively.
The cursor control 140 typically comprises a mouse, touch pad, joystick, trackball, touchscreen, or other device, all well known in the art, for enabling a user to control movement of a cursor on the screen of each monitor. For the purposes of convenience, hereinafter, the term "mouse" will collectively refer to any variety of cursor control devices. The mouse 140 communicates with the cursor control interface 142 via a connection 141. The cursor control state module 143, typically comprising part of the computer operating system embodied by the module 190, includes a basic input/output system (BIOS) for receiving information from the cursor control interface 142 indicative of manipulation by the user of the mouse 140.
In a typical embodiment, when the user moves the mouse 140 a particular distance to the left, the cursor control interface 142 will register a corresponding number of counts reported to the cursor control state module 143 for processing. The cursor control state module 143 provides the cursor movement details to the operating system module 190, which correspondingly modifies the cursor position on the appropriate one of the monitors 110 and 120. In other embodiments, e.g., those making use of a touch pad, a user's finger movement may produce a different kind of signal, but ultimately, the operating system responds to the movement of the mouse 140 by making the corresponding cursor movement on the appropriate one of the monitors 110 and 120. The computer workstation 100 includes at least one television camera 150, which sits in this embodiment on top of the monitor 110 for observing the image of the user of the workstation 100. The computer 130 includes a facing detection module 153 connected to the camera interface 152, which in turn is connected to the television camera 150 by a connection 151. The facing detection module 153 processes images obtained from the television camera 150 to determine whether a user directly faces the monitor 110. The connection 151, along with the connections 111, 121, and 141, can comprise either wired or wireless links.
In addition to the facing detection module 153, the computer 130 of the workstation 100 includes a user control state processor 180. In the exemplary embodiment of FIG. 1, the operating system module 190 has the capability of performing several tasks at the same time (and thus possesses the capability of "multi-tasking"). In this regard, the operating system module 190 can manage multiple applications, illustratively depicted in FIG. 1 as applications 191 and 192. In some embodiments, the operating system module 190 can not only manage the applications 191 and 192 but can also manage the tasks related to, or implemented by, the modules 153 and 180.
By way of illustration and not of limitation, the following discussion considers a system in which two applications each have a significant portion of their respective user interfaces present on a different one of the two monitors 110 and 120. In other embodiments, a single application might have different windows (e.g., one for each of several different documents) on each of the two monitors, or a single application may have two portions of its user interface situated on each of the two monitors. In some embodiments, the system merely manages the cursor position, and the status and behaviors of the applications running thereon, however many, merely react according to the cursor state. Thus, the illustration discussed here has been selected for its particular clarity, but those skilled in the art will recognize, in light of this comment, that the present invention applies with equal effectiveness to such other systems.
In the illustrative embodiment of FIG. 1, the monitor 110 displays an image 112 associated with execution of the first application 191, whereas the monitor 120 displays an image 122 associated with execution of the second application 192. The portions of the images 112 and 122 not occupied by the applications 191 and 192 represent the "desktop" provided by the operating system associated with the operating system module 190.
The user control state processor 180 performs several functions associated with managing multiple monitors in accordance with the present principles. To that end, the user control state processor 180 receives a signal from the facing detection module 153 and processes that signal to determine whether the user faces the monitor 110. In addition, the user control state processor 180 reads the current cursor position from either the cursor control state module 143 or the operating system module 190 and, whenever the user is facing a particular monitor, stores at least one most recent cursor position for the corresponding faced one of the monitors 110 and 120 in a cursor state store 181. Typically, the cursor state store 181 comprises a part of the memory (not otherwise shown) of the computer 130, but could comprise a separate memory. Additionally, or in the alternative, the user control state processor 180 could communicate with the operating system module 190 to determine which application currently has the focus of a user, as determined by the facing detection module 153. The computer application currently having the focus of a user corresponds to the application currently receiving events representing mouse clicks, keystrokes, or other user interface gestures. The application having the focus, including in some embodiments which window of which application has the focus, is further recorded in the cursor state store 181.
In accordance with the present principles, the user control state processor 180 can recall a cursor position, if available, for a particular monitor from the cursor state store 181 when the user turns to face that monitor. Following recall of the saved cursor position, the user control state processor 180 will set the current cursor position through either the cursor control state module 143 or the operating system module 190 to correspond to the saved cursor position when the user returns to observing the monitor associated with that stored cursor position. In embodiments where the state store 181 records the focus, the user control state processor 180 can communicate with the operating system module 190 to restore focus to the application so noted. Based on the user focus, the operating system module will render the application having the user focus active and inactivate the application(s) that no longer have the user's focus.
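The save-and-recall behavior just described can be sketched as a small per-monitor store. The class and method names below are illustrative assumptions, not terms used by the present principles:

```python
class CursorStateStore:
    """Sketch of the cursor state store 181: remembers the most recent
    cursor position per monitor and, optionally, the focused application."""

    def __init__(self):
        self._positions = {}  # monitor identifier -> (x, y) cursor position
        self._focus = {}      # monitor identifier -> focused application

    def save(self, monitor, position, focused_app=None):
        # Called when the user turns away from `monitor`.
        self._positions[monitor] = position
        if focused_app is not None:
            self._focus[monitor] = focused_app

    def recall(self, monitor):
        # Called when the user turns to face `monitor`; returns None if no
        # position was ever saved for that monitor.
        return self._positions.get(monitor)

    def recall_focus(self, monitor):
        # The application (and, in some embodiments, window) to reactivate.
        return self._focus.get(monitor)
```

Keeping the store keyed by monitor identifier is what lets the workstation hold independent "last seen" cursor and focus states for monitors 110 and 120.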
There exist different variations of user interface gestures, which depend on both the operating system and Graphical User Interface (GUI). For example, a mode of interaction exists in the GUIs provided by some operating systems, notably the "X Window System" (commonly known as "X11" or just "X") as well as the UNIX and Linux operating systems, wherein the cursor position determines the active application. In other words, an application becomes active whenever the cursor points to one of that application's windows. In such a case, the cursor position as detected, stored, recalled, and set as discussed above will fully determine which application the operating system deems active and has the focus, and no distinct communication with the operating system becomes necessary to activate that application. Another exemplary mode of interaction in the GUIs provided by other operating systems includes a process of separately maintaining a hierarchy of which applications become activated or become "foremost." Only the foremost application will receive user input through the GUI until a different application becomes explicitly activated. The Mac OS X operating system from Apple Computer of Cupertino, California represents an example of an operating system that functions in this manner. A user can activate an application currently not foremost by explicitly entering a command or message directing an application to become foremost. Often, the user can activate an inactive application by moving the cursor over a window of that inactive application and clicking on it.
In some embodiments, the operating system module 190 detects a click on an application made by a user and notifies the target application to become active. Thus, the application now becomes the foremost application. However, the operating system module 190 does not otherwise deliver the click event to this application. This previously inactive application, which now becomes the foremost application, can now receive subsequent user interactions, but the activating click becomes lost. In some special cases, applications may designate that the operating system should resubmit the click to the application once the application has become foremost. However, far more often, the operating system will discard such activating clicks so as not to inadvertently trigger an accidental action. The discarding of activating clicks remains the usual case, and will be discussed hereinafter with regard to several examples wherein large portions of an application's user interface appear hidden while the application remains inactive.
The activating click merely constitutes a signal from the user intended to activate the application and not initiate any other action. Indeed, the location of the click might correspond to a portion of the screen that would become covered by sub-menus or control panels once such items become restored by the application as that application becomes foremost. In such cases, saving and recalling an active application corresponding to each of the plurality of monitors 110 and 120 can prove useful, as discussed hereinafter. This status can include which window or windows belonging to the application carry special designations. For example, in Mac OS X, an application can have a main window (the application's foremost document). In certain situations, e.g., when saving a document to a file, the application can have a distinct key window, that is, an application window that will receive keystrokes from a keyboard whether or not the cursor lies over that window. At other times, the main and key window could comprise the same window. The main window and key window of an application running under OS X, whether they represent two distinct windows or a single window carrying both designations, provide an illustrative example of the value of the user control state processor 180 querying the operating system module 190 about the current application and determining which windows carry which designations. When pertinent, the computer 130 can restore these settings as described further below. Other settings corresponding to an active/foremost application may warrant saving and recalling, for example when the application currently operates in "full screen" mode (if supported by the operating system).
FIG. 2 depicts, in flow chart form, the steps of a first embodiment of a cursor management process 200 performed by the user control state processor module 180 of FIG. 1. The cursor management initialization process 200 begins upon execution of step 201 once the operating system module 190, already running, receives a user input from the mouse 140 through the cursor control interface 142 for receipt by the cursor control state module 143. Concurrently, the operating system module 190 will receive a stream of images from the camera 150 via the camera interface 152 for processing by the facing detector module 153.
Upon execution of step 202, the facing detector module 153 detects the current user facing, that is, with which monitor the user is currently or most recently engaged. The facing detector module 153 supplies that information to the user control state processor 180, which then records the current user facing in a memory (not shown, but represented in FIG. 2 by a register named "Current_Facing"). The current user facing must correspond to one and only one of the monitors 110 and 120. (In other words, the user only focuses on one monitor at a time.) During step 203, the user control state processor module 180 obtains information to determine on which one of the monitors 110 and 120 the cursor currently resides. Note that the cursor need not appear visible, as applications often hide the cursor when not currently being manipulated by the user, e.g., when watching a video in an application window. The user control state processor module 180 then saves this information in a memory (not shown, but represented in FIG. 2 by a register named "Current_Cursor"). During step 204, the user control state processor module 180 clears any saved cursor positions 210 stored in the control state store 181. (These positions are distinct from the "Current_Cursor" register and correspond to specific monitors.)
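The initialization just described (steps 202 through 204) can be sketched as follows. The helper callables `get_current_facing` and `get_cursor_monitor` are hypothetical stand-ins for the facing detector module 153 and the cursor control state module 143; they are assumptions for this sketch, not elements of FIG. 1.

```python
def initialize(get_current_facing, get_cursor_monitor):
    """Sketch of initialization process 200: capture the initial facing and
    cursor monitor, and clear the saved cursor positions memory 210."""
    current_facing = get_current_facing()   # step 202: which monitor the user faces
    current_cursor = get_cursor_monitor()   # step 203: which monitor holds the cursor
    saved_positions = {}                    # step 204: clear saved positions 210
    return {"current_facing": current_facing,
            "current_cursor": current_cursor,
            "saved": saved_positions}
```

The returned dictionary plays the role of the registers and the saved-positions memory that the loop 220 subsequently reads and updates.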
For the sake of clarity, the process described with respect to FIG. 2 determines and records the cursor position. However, the computer 130 of FIG. 1 could also obtain and save the application and window status (e.g., including designations such as main window, key window, foremost window, active window, and full screen mode) during step 203 for later use. A portion of the control state store 181, like the saved cursor positions memory 210, can retain this application and window-related information. However, as described in detail hereinafter, the multi-screen management technique of the present principles can operate with just the cursor position information and still offer significant advantages.
The cursor management process 220, which comprises a loop, begins during step 221 whereupon the initialization process 200 ends. Following step 221, step 222 undergoes execution, at which time the computer 130 of FIG. 1 copies the contents of the memory, represented by the registers named Current_Facing and Current_Cursor, to another memory location (not shown) represented by the registers named Last_Facing and Last_Cursor, respectively. This copying operation enables detection of user facing or cursor location transitions from one of the monitors 110 and 120 to the other.
During step 223, the computer 130 determines whether the values in the registers named Last_Facing and Last_Cursor correspond to the same, single monitor. If so, the computer 130 sets a single-bit flag, represented in FIG. 2 by the register named Same_Screen, to a "true" condition during step 224. Otherwise, the computer 130 will set the flag to a "false" condition during step 225. Regardless of the condition indicated by the Same_Screen flag, during step 226, the process 220 awaits a signal to indicate the current user facing as detected by the facing detector module 153 of FIG. 1.
Step 226 performs the same operation as step 202, setting the register Current_Facing to the determined value. Likewise, step 227 performs the same function as step 203, setting the register Current_Cursor appropriately. As above, if the computer 130 determines and then stores the application and window status during step 203, then the computer likewise performs these functions during step 227 too.
During step 228, the computer 130 checks whether the user facing has changed. If the Last_Facing and Current_Facing values correspond to the same one of the monitors 110 and 120, then no facing change has taken place and the process 220 restarts the loop at step 222.
If during step 228, the registers Last_Facing and Current_Facing show that the user now faces a different screen, then the process 220 proceeds to step 229. During step 229, the computer 130 determines whether the cursor has moved from one monitor to the other. If the Last_Cursor corresponds to the same monitor as the Current_Cursor, then the user has left the cursor on the monitor where it had appeared previously. Under such circumstances, the process 220 proceeds to step 230.
During step 230, the computer 130 tests the Same_Screen flag. A true condition for the Same_Screen flag indicates that a moment ago, the user faced the same monitor on which the cursor appeared, but now faces elsewhere, presumably a different monitor, and that the user has turned and left the cursor behind. In response to a true condition for the Same_Screen flag, the process 220 proceeds to step 231. During step 231, the computer 130 obtains the current position of the cursor from either the operating system module 190 or the cursor control state module 143 (both of FIG. 1), which one depending on the specific implementations of the computer 130, the cursor control interface 142, the device drivers (e.g., cursor control state module 143), and the details of the operating system.
The cursor position can exist relative to the corresponding monitor. In other embodiments, the cursor position can exist as a value in a global coordinate system that encompasses the images 112 and 122 on the monitors 110 and 120, respectively. The form of the cursor position depends on the operating system and other elements. When obtained, the current cursor position, if it has not moved between monitors according to the determination during step 229, is saved in a cursor positions memory 210 in correspondence with the monitor on which the cursor appears, which in FIG. 2 is called Screen[Current_Cursor]. In some embodiments, Screen[Current_Cursor] remains identical to Current_Cursor, but if Current_Cursor includes information other than an identifier corresponding to a single one of the monitors 110 and 120, then the operation Screen[x] returns such an identifier.
During step 232, the computer 130 makes a determination whether there exists a saved cursor position for the monitor the user now faces. (Note that during step 228, the computer 130 of FIG. 1 determined that the facing of the user had changed because Current_Facing and Last_Facing now have different values.) If the user now faces a monitor represented by the operator Screen[Current_Facing] for which the computer 130 previously stored the cursor position in memory 210, then during step 233, the computer 130 will recall this cursor position. The computer 130 will impose this cursor position on the GUI by causing the user control state processor module 180 to direct either the cursor control state module 143 or the operating system module 190 to make the change. Generally speaking, the computer 130 will direct the same module (i.e., the cursor control state module or the operating system module) to move the cursor during step 233 as was previously interrogated during step 227.
After step 233, the process 220 loops back to step 222, but now the operating system module 190 will react to the newly recalled cursor position. In some embodiments, during step 233, the computer 130 can momentarily embellish the cursor, e.g., by drawing the cursor at the newly recalled position in a large size or drawing the cursor with a color sure to attract attention, and then quickly shrinking the cursor to normal size, or fading to a normal color, to make the jump to the new position more clearly visible.
However, if during step 229 the computer 130 has determined that the cursor has changed monitors, that is, the value of Last_Cursor no longer corresponds to the same monitor as Current_Cursor, then a race condition exists: the user has looked to a different monitor at substantially the same time the user manually moved the cursor to the different monitor. In this situation, no valid current cursor position exists for saving during step 231, so the process 220 skips to step 232. Similarly, if during step 230 the computer 130 determines that the cursor and facing corresponded to two different monitors (i.e., the user has not looked away from the cursor), then the current cursor position does not warrant saving, and again the process 220 skips to step 232.
The determinations made during steps 229 and 230 that lead to step 232 while skipping step 231 allow for situations wherein the computer 130 can recall a cursor position even when there is no cursor position suitable for saving. Examples include the user turning from a first monitor toward a second monitor at the same time as the cursor transitions from the first monitor to the second monitor, in which case the cursor will move under manual control to the second monitor, but immediately jump to a previously saved position on that second monitor. This circumstance offers the advantage of allowing a user to become familiar with the experience afforded by the process 220, but still allows the user to exercise his or her old reflexes to move the cursor manually with or ahead of the user's gaze.
Note that the loop 220 needs to repeat no more often than once each time the facing detector module 153 updates the facing data (i.e., processes a new camera image from the camera interface 152 to determine the facing of the user). The overall process still provides substantial benefits even if the process loop 220 only undergoes execution when the facing detection signal from the facing detector module 153 changes. Thus, even though FIG. 2 depicts the process 220 as a continuous loop, the process can spend a substantial amount of time idling during step 226, waiting for the results of the next facing detection from the facing detector module 153, or waiting for a change in the state of the facing detection signal the facing detector module 153 produces. Of the two, the former remains preferable, since the balance of the process 220 imposes little computational burden.
For embodiments that manage the application and window status as the user's facing changes, those steps that result in collection of the cursor position, e.g., step 231, can also collect and store the application and window status. When the computer 130 recalls and sets the cursor position, the computer can reassert the application and window status too, such as during step 233. In some embodiments, determining the monitor on which the cursor appears may require obtaining the current cursor position and then applying a subsequent operation (e.g., Screen[Current_Cursor]). Embodiments that save this more elaborate value (rather than just the monitor identifier) might obtain the current cursor position during step 232 by applying an appropriate operator to the value in the Current_Cursor register.
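One iteration of the loop 220 (steps 222 through 233) can be sketched as below. The `state` dictionary and the helper callables are illustrative assumptions: `get_facing` and `get_cursor_monitor` return monitor identifiers (standing in for the facing detector module 153 and step 227), while `get_cursor_position` and `set_cursor_position` stand in for the cursor control state module 143 or the operating system module 190.

```python
def management_step(state, get_facing, get_cursor_monitor,
                    get_cursor_position, set_cursor_position):
    """Sketch of one pass through cursor management loop 220."""
    # Step 222: remember the previous facing and cursor monitor.
    last_facing = state["current_facing"]
    last_cursor = state["current_cursor"]
    # Steps 223-225: did facing and cursor share a monitor a moment ago?
    same_screen = (last_facing == last_cursor)
    # Steps 226-227: obtain the new facing and cursor monitor.
    state["current_facing"] = get_facing()
    state["current_cursor"] = get_cursor_monitor()
    # Step 228: if the facing has not changed, restart the loop.
    if state["current_facing"] == last_facing:
        return
    # Steps 229-231: if the cursor stayed on its monitor and the user just
    # looked away from it, save its position for that monitor.
    if state["current_cursor"] == last_cursor and same_screen:
        state["saved"][state["current_cursor"]] = get_cursor_position()
    # Steps 232-233: if a position was saved for the newly faced monitor,
    # jump the cursor there.
    recalled = state["saved"].get(state["current_facing"])
    if recalled is not None:
        set_cursor_position(recalled)
        state["current_cursor"] = state["current_facing"]
```

Running this step when the user turns from a first monitor to a second (cursor left behind on the first) saves the position for the first monitor; a later turn back restores it, which is the behavior walked through with FIGS. 3-12 below.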
FIGS. 3-12 collectively illustrate a user 310 interacting with the workstation 100 while the user control state processor module 180 performs the cursor management process 220 of FIG. 2. In FIG. 3, the user 310 (designated here as 310L) faces the left monitor 110, and thus substantially faces the camera 150. The facing detection module 153 will detect this facing and indicate that the user faces the monitor 110.
For purposes of illustration, assume the user 310 makes use of the computer workstation 100 to create an educational slide, using an image editing application 191 (shown in FIG. 1) on the left monitor 110 and a spreadsheet application 192 (also illustrated in FIG. 1) on the right monitor 120. In the screen image 312 displayed on the monitor 110, the main window 314 for the image editing application 191 appears as currently active. Whenever the image editing application 191 appears foremost, such as now, the computer 130 will configure the menu bar 313 for that application, and the corresponding auxiliary menu panels 315 now appear. In the screen image 322 displayed on the monitor 120, the main window 324 for the spreadsheet application 192 appears as inactive, represented in these figures by the top of the window 324 being dotted.
The user 310 uses his or her hand 311 to move the mouse 140 to direct the cursor 330 to a position in the main image editing window 314 where the user wants to insert some text. To accomplish text insertion, the user 310 will select the text insertion tool from one of the auxiliary menu panels 315, with the result that the cursor 330 now appears as an I-beam. The user 310 can now make use of a keyboard (not shown) connected to the computer 130 to type the first portion of the slide's heading, "Since e and pi are irrational numbers, decimal representations of them, like", with the current text insertion point 331 being distinct from the position of the cursor 330. The keyboard keystrokes correspond to the first application 191 because its window 314 became designated as the key window, e.g., when the user entered a mouse click at cursor 330 to select the initial text insertion point.
In the situation described above, after one or two executions of the process 220 of FIG. 2, the values Last_Facing and Current_Facing will both correspond to the first monitor 110, as will the values of Last_Cursor and Current_Cursor, since the facing detector module 153 detected the user 310L as substantially facing the camera 150 and, by extension, the monitor 110, and the cursor 330 now appears in the monitor image 312 on the monitor 110. During step 223, since both Last_Facing and Last_Cursor correspond to the same monitor (i.e., monitor 110), the computer 130 of FIG. 1 will set the Same_Screen flag during step 224 of FIG. 2 to a "true" condition. Subsequent iterations of the process 220 during the situation depicted in FIG. 3 will result in repeated execution of the steps 222, 223, 224, 226, 227, and 228, after which the loop repeats so long as the user's facing remains the same.
As depicted in FIG. 4, the facing of the user has changed (as compared to FIG. 3) since the user now faces to the right, hence the use of the designation 310R. The camera 150 now provides an image of the user 310, who no longer looks at the monitor 110, but now looks elsewhere. After processing this image, the facing detector module 153 will indicate that the user no longer looks at the monitor 110. As a result, during an iteration of the process 220, the computer 130 will first detect this facing change during step 226 and will now consider the user as facing the second monitor 120. While the interpretation of "not facing monitor 110" as "facing monitor 120" constitutes an over-generalization, the interpretation remains workable. An alternate embodiment, discussed hereinafter, makes use of a more sophisticated version of the facing detector module 153 to provide results with greater precision.
In this phase of the example, the cursor still remains at the position 330 and the image editing application 191 still remains active (as shown in FIG. 3). During step 227, the value of the Current_Cursor still corresponds to the first monitor 110. During step 228, the computer 130 detects the change in the user's facing because the value of Last_Facing corresponds to the first monitor 110, but the value of Current_Facing corresponds to the second monitor 120, and the process 220 proceeds to step 229 of FIG. 2. Since the cursor has not yet moved, during step 229 the value Last_Cursor corresponds to the same monitor as the value of the Current_Cursor, i.e., the first monitor 110, and the process proceeds to step 230, wherein the Same_Screen flag is recalled as set to the "true" condition. This leads to execution of step 231, wherein the computer 130 saves the position of the cursor 330 in correspondence with the first monitor 110 in the saved cursor position memory 210.
During step 232, there does not yet exist a saved cursor position corresponding to the second monitor 120, so process 220 branches back to the top of the loop, leaving the cursor 330 at its current position while the current application 191 (the image editor) remains active. In subsequent iterations, the values of the last and current facing will correspond to the monitor 120, and the last and current cursor will correspond to the first monitor 110. During these iterations of step 223, the value of Last_Cursor will correspond to the first monitor, which differs from the value of Last_Facing, so during step 225, the computer 130 will clear the Same_Screen flag (i.e., set the flag to a "false" condition).
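The loop walked through above can be summarized in code. The following Python sketch is illustrative only (the dictionary-based state, the monitor IDs 110/120, and the function name are assumptions, not the patent's implementation); it mirrors steps 223 through 233 as described:

```python
def process_220_iteration(state, facing, cursor_monitor, cursor_pos):
    """One pass through the loop of process 220 (illustrative sketch).
    Returns a position to which the cursor should be restored, or None."""
    # Steps 223-225: set or clear the Same_Screen flag from the *last* values.
    state["same_screen"] = (state["last_facing"] == state["last_cursor"])
    restore_to = None
    # Steps 226-227 note the current facing and cursor monitor (the
    # arguments); step 228 branches only when the facing changed screens...
    if facing != state["last_facing"]:
        # Step 229: ...while the cursor did not change screens...
        if cursor_monitor == state["last_cursor"]:
            # Step 230: ...and facing and cursor had been on the same screen.
            if state["same_screen"]:
                # Step 231: save the cursor position for the old monitor.
                state["saved"][state["last_facing"]] = cursor_pos
                # Steps 232-233: restore a position saved for the new monitor.
                restore_to = state["saved"].get(facing)
    state["last_facing"], state["last_cursor"] = facing, cursor_monitor
    return restore_to
```

Replaying the example of FIGS. 3 through 6 with this sketch produces the same behavior: the first facing change saves the cursor position for monitor 110 without moving the cursor, and the turn back restores it.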
In this example depicted in FIG. 4, the user 310 seeks to copy some values from the spreadsheet application 192 and paste them into the text in the slide image being edited with application 191. Since the cursor 330 does not currently lie in proximity to the spreadsheet window 424, the user 310 will begin making a gesture 440 with the mouse 140, causing the cursor 330 to traverse the screen image 412 along path 441 until the cursor reaches the edge of that screen image, and then continue onto the screen image 422. As illustrated, the gesture 440 consists of several generally left-to-right strokes with the mouse pressed down (shown as solid lines), interspersed with right-to-left movements with the mouse lifted (shown as dotted lines), and terminating with a mouse click, depicted by the circle at the end of the gesture.
At this point, an iteration of process 220 will detect that the values of Last_Cursor and Last_Facing correspond to the same monitor 120. Thus, during step 224, the computer 130 will set the Same_Screen flag to a "true" condition, which then remains set for subsequent iterations. Movement of the cursor continues along the path 442 as the gesture 440 completes, so the cursor reaches a final position 443, with the cursor now appearing in the shape of an arrow. Many modern applications manipulate the cursor representation. The arrow shape is the most common representation, but in the image editing application 191, with the text insertion tool selected, the cursor appears I-beam shaped while the cursor position remains within the active window 314. The cursor shape would typically revert to the arrow shape as soon as the cursor leaves the active window 314 (from FIG. 3) along path 441.
When the user clicks the mouse at the end of gesture 440, with the position of the cursor 443 as shown, the operating system module 190 notifies the application 192 of the event, because application 192 is the "owner" of the window 424, resulting in the image editing application 191 becoming inactive, and the spreadsheet application 192 becoming active. As a result, application 191 hides the auxiliary menus 315 and the active window 314 (in FIG. 3) becomes the inactive window 414 (in FIG. 4). Immediately thereafter, the spreadsheet application 192 activates the window 424 and repopulates menu bar 313 (of FIG. 3) with its own menu items, producing menu bar 413.
Depending on the exact implementation details of the applications 191 and 192, the spreadsheet cell immediately under the cursor may or may not become selected. In this example, the user must further click the mouse to select the cell as shown in FIG. 5, which depicts the user as still facing the second monitor 120. The cursor 543 now appears cross-shaped, and the spreadsheet cell B3 containing a decimal representation of e becomes selected. The main window 514 for the image editing application 191 remains inactive in screen image 512, while the main window 524 for the spreadsheet application 192 remains active. At this point, user 310R can use the keyboard (not shown) to issue a copy command, which operating system module 190 dispatches to the spreadsheet application 192, thus resulting in copying the decimal representation of e into a clipboard memory (well known, not shown). During this time, repeated iterations of the loop in process 220 will detect that the last and current facings and cursor position all correspond to the second monitor 120, and the Same_Screen flag becomes set to a "true" condition (at step 224).
FIG. 6 shows the next instant in this example. At this time, the user 310L has turned to face the first monitor 110. The change in the user's position, as depicted from FIG. 5 to the new position depicted in FIG. 6, will now trigger the following actions. The next iteration of the process 220 will result in detecting that the user now faces toward the camera 150, corresponding to the first monitor 110, during step 226. Thereafter, no change will occur in the position of cursor 543 (in FIG. 5) during step 227, as there is no gesture with mouse 140. Upon the next execution of step 228, the values of Last_Facing and Current_Facing will correspond to different screens, causing the process 220 to branch toward step 229. During step 229, the computer 130 will determine that the cursor has not changed screens, so the process 220 will branch toward step 230. During step 230, the computer 130 detects that the Same_Screen flag is in the "true" condition (from step 224). Thereafter, during step 231, the computer 130 will save the current position of cursor 543 (from FIG. 5), in correspondence with the second monitor 120, in the saved cursor positions memory 210.
At this time, the user 310 will experience the benefits of the multi-screen management technique of the present principles and will be relieved from having to make an elaborate gesture (such as gesture 440) in order to proceed with his work. During step 232, the computer 130 determines that there is a saved cursor position that corresponds to the first monitor 110, leading the computer in step 233 to recall that position from the saved cursor positions memory 210 and then return the cursor to that position, resulting in the cursor moving to the position of cursor 330 in FIG. 6. Momentarily, the cursor may take on a default arrow shape, since the earlier cross-shape would not apply outside the bounds of the active spreadsheet window 524 depicted in FIG. 5. The cursor would quickly change to I-beam shape as the spreadsheet application 192 loses focus and the main window 624 becomes inactive in screen image 622, while in screen image 612, image editing application 191 gains focus. As the corresponding window 614 becomes active, the submenu panels 315 return along with the menu bar 313. The text insertion point 331 once again appears and the window 614 now becomes the key window (i.e., the window that now becomes the target of keyboard input). Note that this time, the user does not need to make use of any complex, time-consuming, and fatigue-inducing gestures like the gesture 440: The computer 130 has now automatically returned the cursor to its last location on the screen of the first monitor 110.
In some instances, depending upon the precise implementation of the operating system 190 and/or the image editing application 191, the user may need to enter a mouse click following the automatic cursor movement. Some embodiments, discussed further below, can obviate the need for this additional mouse click by noting and reestablishing the status of the applications and windows as the computer 130 saves and restores the cursor position. For the present discussion, as would be typical while operating under the operating system mentioned above, the presumption exists that no additional mouse click remains necessary.
Referring to FIG. 7, the second screen image 722 remains the same, with the spreadsheet window 724 inactive. As depicted in the first screen image 712, the active image editing window 714 accepts a paste command via the keyboard (not shown), resulting in the value previously copied to the clipboard memory now being inserted into the slide image at the text insertion point, after which the user types "and", before leaving the text insertion point at 333 (in FIG. 7).
Referring to FIG. 8, the computer 130 again relieves user 310 from having to make an elaborate gesture, like gesture 440, as the user 310R faces right, toward monitor 120. The facing detector module 153 of FIG. 1 will detect the change in the user's facing, whereupon during the next iteration of process 220, the computer 130 will record the position of cursor 330 (as depicted in FIG. 7) in conjunction with the first monitor 110 and restore the position of cursor 543 on the second monitor 120. At the same time, the computer 130 will deactivate the image editing application 191 and activate the spreadsheet application 192. As a result, the main window 814 in the screen image 812 becomes inactive. The displayed menu bar 413 will now correspond to the spreadsheet application, and the main window 824 in the screen image 822 now becomes active, with the cursor 543 now appearing cross-shaped.
Now, assume the user 310R wants to copy the approximate value for pi. Referring to FIG. 9, the user will make a tiny mouse gesture 940, resulting in the cursor 543 moving in the screen image 922 along the path 942 to become cursor 943. The mouse click at the end of gesture 940 selects cell "B2", which contains a value for pi, and the user issues a copy command with the keyboard. Since the spreadsheet window 924 has now become the active key window, the command copies the value for pi to the clipboard memory. In the screen image 912, the inactive window 914 remains substantially unchanged, though the "Edit" item in menu bar 413 may flash briefly (not shown) in response to the copy command.
FIG. 10 depicts the circumstance when the user 310L turns back to the first monitor 110. The facing detector module 153 of FIG. 1 will detect the change in the facing of the user and alert the computer 130 of FIG. 1 accordingly. Now, during step 231 of FIG. 2, the computer 130 will save the position of cursor 943 in conjunction with the monitor 120, replacing the previous position of cursor 543 recorded during the transition from FIG. 5 to FIG. 6. Then, the computer 130 will restore the position of cursor 330 (which, in this example, has not changed) to the monitor 110. Again, the spreadsheet application 192 has lost the user focus and the image editing application 191 has now gained the user focus. In the screen image 1022, the spreadsheet window 1024 becomes inactive, so the cursor 943 (from FIG. 9) disappears. In the screen image 1012, the image-editing window 1014 becomes active, so cursor 330 becomes restored. The menu bar 313, representing the image editing application 191, as well as the submenus 315, all return, and the text insertion point once again appears at the location 331.
Referring now to FIG. 11, in the screen image 1112, the window 1114 is again the key window (i.e., the target of keyboard commands) and the user pastes the value for pi with a keyboard command, along with the text message, "are only approximations." The "Edit" item in the menu bar 313 may briefly flash in response to the paste command. The screen image 1122 and the inactive window 1124 remain unchanged.
FIG. 12 depicts the circumstance when the user 310R again turns toward the second monitor 120. The screen image 1222 shows the restoration of the cursor 943 to its previous position (illustrating that the earlier cursor 543 is no longer recalled). The spreadsheet application 192 and its main window 1224 now become active, and the screen image 1212 shows the image editing application 191 and its main window 1214 as inactive, while the menu bar 413 now responds to the spreadsheet application 192.
Many variants exist for the multi-screen management technique of the present principles described above. One possible variation within the scope of the invention relates to the storage of the cursor position during step 231 as selected from a recent history of cursor positions (extending back, for example, for two seconds). The computer 130 could make an entry in the cursor position storage memory every time the user moves the mouse, which may require a time stamp for each entry. Alternatively, the computer 130 of FIG. 1 could record the mouse position at regular intervals, either by making use of an independent clock, or based on each pass through the loop (e.g., during step 227). As described previously with respect to FIG. 4, the user 310R, using caution, did not move cursor 330 until after he or she had looked toward monitor 120. This deliberation by the user makes very clear the division between a change in the user's facing from the first monitor 110 to the second monitor 120, and the cursor transition between such monitors. Further, such deliberation will ensure that the computer 130 will restore the position of cursor 330 in FIG. 4 later, in FIG. 6, when the user 310L looks back.
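One way to realize the time-stamped history variant just described is a small rolling buffer. The Python sketch below is illustrative only (the class name, the two-second window, and the monotonic-clock choice are assumptions for the example):

```python
import time
from collections import deque

class CursorHistory:
    """Rolling history of cursor positions, as in the variant above."""

    def __init__(self, window_s=2.0):
        self.window_s = window_s
        self.entries = deque()          # (timestamp, (x, y)) pairs

    def record(self, pos, now=None):
        """Add a time-stamped entry; drop entries older than the window."""
        now = time.monotonic() if now is None else now
        self.entries.append((now, pos))
        while self.entries and now - self.entries[0][0] > self.window_s:
            self.entries.popleft()

    def position_at(self, age_s, now=None):
        """Most recent position at least age_s seconds old, falling back
        to the oldest entry kept (or None if the history is empty)."""
        now = time.monotonic() if now is None else now
        for ts, pos in reversed(self.entries):
            if now - ts >= age_s:
                return pos
        return self.entries[0][1] if self.entries else None
```

The `now` parameter simply makes the sketch testable without a real clock; a deployment would let it default to `time.monotonic()`.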
Users who first employ the multi-screen management technique of the present principles will likely react differently than experienced users. As the user's attention changes from the first monitor 110 to the second monitor 120, the user will likely initiate the mouse-controlled movement of the cursor (as along paths 441 and 442) in parallel. Generally, the facing change will precede the cursor transition, but the cursor may leave its position before the facing detector module 153 of FIG. 1 has detected the facing change. In such a case, remembering the location where the cursor was most-recently static, or relatively so, may represent a better choice for saving the cursor position during step 231.
Alternatively, the computer 130 can take the cursor position from a predetermined interval back in history, e.g., 500 ms to one second earlier. Still another alternative would be for the computer 130 to take the last position of the cursor that had changed less than a predetermined distance over a predetermined amount of time, e.g., wherever the cursor had remained relatively still (having not moved more than an inch on the screen, or 72 pixels) over the course of one second. Empirical testing can yield a refined choice of these values for the particular monitors and specific cursor control device 140 (whether or not a mouse) that exist within the computer workstation 100 of FIG. 1. This approach allows the computer 130 to ignore those moments during the gesture 440 when the user briefly lifts the mouse (the dotted line portions), during which the cursor does not move.
In accordance with the technique described above, the computer 130 can use a cursor position selected from the history to determine the appropriate value for Current_Cursor during step 227, in which case the value of Current_Cursor will lag behind the actual current position of the cursor. Note that, depending on the approach used to select the historical cursor position, that lag can vary. For example, if the cursor sits in a particular spot for a particular interval, the computer 130 would select that position from history, and the lag remains effectively zero; but as the user moves the cursor away from that location while the computer 130 still selects that position from the history, the lag grows, potentially up to the maximum size of the history. As soon as the cursor slows sufficiently, or stops for an adequate amount of time such that the computer 130 will select that cursor position as the current position (or nearly so), the lag suddenly becomes short again. Note that this behavior has no adverse consequences. The value of returning the cursor to a previously saved cursor position depends on how intentional the saved cursor position was. A precise cursor placement generally corresponds to a slower cursor placement, and may remain static for significant intervals of time (as cursor 330 remained static throughout the editing of text during execution of the application 191, which would typically take one or more seconds at each round).
Other embodiments can rely on enhancements to the facing detector module 153 of FIG. 1. As discussed above, the workstation 100 relied on the facing signal to differentiate between the user substantially facing the camera 150 versus not facing the camera. However, the facing detector module 153 could have greater capability to provide more precise information. For example, the facing detector module 153 could have the ability to indicate direction, e.g., the user substantially facing the camera 150 versus facing above, below, or to the left/right of the camera. Further, the facing detector module 153 could estimate head pose, resolved in angles of pitch and/or yaw. Imbuing the facing detector module 153 with even greater sophistication would allow it to identify and track the user's gaze, which would allow the facing detector to determine not only which screen the user faces but also the particular region of the screen observed by the user. In some embodiments, the workstation 100 of FIG. 1 could employ an eye-tracking sensor, which may comprise one or more cameras as well as one or more infrared emitters (not shown) to better discern the user's eyes and the direction they face. Alternatively, the user could wear an eye-tracking sensor on his or her head. Tobii Technology AB of Sweden markets eye-tracking sensors of the type described above. The Tobii IS20 and IS30 eye-tracking sensors can attach to monitors 110 and 120 and connect to the computer 130, while Tobii Glasses illustrate a head-worn mobile eye tracker (likewise requiring an interface to computer 130 to replace camera 150 and camera interface module 152).
In some embodiments, the facing detection module 153 has the capability of identifying an artifact worn by user 310, for example something with a predetermined geometry, such as a particular pair of glasses. By establishing an easily recognized head-mounted object on the user, the facing detection module 153 can more easily detect the user's facing. For example, a pair of glasses frames (no lenses required) could include retroreflective markers (not shown) at geometrically significant positions on the glasses frames. The camera 150 could have a sensitivity to infrared (IR) light and be co-located with one or more IR emitters (not shown). The retroreflective markers on the glasses frames would return light from the IR emitters, and the apparent relative positions of the retroreflective markers can serve to determine the user's facing, a process well known in the art which has the ability to produce a good, repeatable result for any user. The 6D (six degree-of-freedom) Head Tracking package by SensoMotoric Instruments GmbH of Germany affords such a capability.
Wearing a special, machine-recognizable artifact, such as special glasses, for example, for easy detection and recognition by the workstation 100 can afford a benefit when the faces of other individuals besides the user appear in the field of view of camera 150. For example, consider the situation (not shown) when a teacher looks over the shoulder of a student. Under such circumstances, the teacher's face should not serve as the controlling object. In one configuration, the workstation could ignore individuals wearing the artifact, or in an alternate configuration could ignore everyone except those wearing the artifact.
Typically, some image-based face detector modules (for example, those face detection modules built into certain Apple Computer company products embodying versions of Mac OS X later than version 10.7) can provide not only the location of the user's face (typically lying in a region often described as a bounding box), but also the locations of specific facial features such as the eyes and mouth, where the ratios formed from those locations (eye-spacing in comparison to eye-mouth spacing) can provide a finer measure of the user's head orientation.
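The ratio idea can be illustrated with a few lines of Python. This is a sketch only: the points are assumed (x, y) pixel locations from a face detector, and any thresholds for interpreting the ratio would need per-user calibration:

```python
def eye_to_mouth_ratio(left_eye, right_eye, mouth):
    """Coarse head-orientation cue from detected feature locations."""
    eye_spacing = right_eye[0] - left_eye[0]
    eye_mid_y = (left_eye[1] + right_eye[1]) / 2.0
    eye_mouth_spacing = mouth[1] - eye_mid_y
    # Eye spacing foreshortens as the head yaws left or right, while
    # eye-mouth spacing foreshortens as the head pitches up or down,
    # so the ratio shrinks with yaw and grows with pitch.
    return eye_spacing / eye_mouth_spacing
```

A frontal face yields the user's baseline ratio; a smaller ratio for the same face then suggests a yawed (left- or right-turned) head.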
In some embodiments, the user can train the facing detector module 153. For example, when a user makes cursor movements, e.g., clicking a button or selecting a menu item, the user will look at the cursor very precisely. Under such circumstances, the computer 130 can take account of information from the facing detector module 153, including the extracted facial features, in correspondence with the region in which the cursor action occurs. The computer 130 can presume that similar configurations of information subsequently obtained from the face detection, and of the facial features extracted from such face detection, relate to the same general region and, importantly, to the monitor corresponding to that region.
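One simple realization of this training idea is a nearest-neighbor classifier: each click pairs the current facial-feature vector with the monitor under the cursor, and later observations are classified by the closest remembered sample. The sketch below is an assumption (the class name, tuple feature vectors, and squared-distance metric are all illustrative choices, not the patent's method):

```python
class FacingTrainer:
    """Nearest-neighbor sketch of click-time facing training."""

    def __init__(self):
        self.samples = []               # (features, monitor_id) pairs

    def on_click(self, features, monitor_id):
        """Record the feature vector observed at a cursor click, paired
        with the monitor on which the click occurred."""
        self.samples.append((tuple(features), monitor_id))

    def classify(self, features):
        """Return the monitor of the closest remembered sample, or None."""
        if not self.samples:
            return None
        def dist2(sample):
            return sum((a - b) ** 2 for a, b in zip(sample, features))
        return min(self.samples, key=lambda s: dist2(s[0]))[1]
```

In practice such samples would accumulate continuously during normal use, so the classifier improves without any explicit calibration step.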
To carry out the multiple screen management technique of the present principles discussed above, the two-monitor, one-camera configuration of workstation 100 need only provide a binary indication of whether or not the user looks at the first monitor 110. However, a more sophisticated embodiment of the workstation 100 of FIG. 1 could include a facing detector module 153 having the capability of recognizing when the user looks at the first monitor 110, or in the direction of the second monitor 120, or in some other direction (i.e., above or below monitor 110, or to the left side of monitor 110, or away from monitor 120). With the facing detector module 153 having such capability, the computer 130 could execute the process 220 to pause during step 226 until detecting the user as likely facing one monitor or the other. Alternatively, the computer 130, when executing step 226, could forego changing the value of Current_Facing unless the facing detection module 153 detects the user as facing toward one of the monitors 110 and 120.
Other embodiments of the computer workstation 100 could include more monitors (not shown), for example a row of three monitors, with the camera 150 mounted on the center monitor. In this way, the signal from the facing detector module 153 could indicate that the user faces the left, center, or right monitor. Further embodiments could include multiple cameras, as shown in FIG. 13, where the workstation 1300 depicted in that figure has two cameras 150 and 1315 mounted on monitors 110 and 1320, respectively, with the two cameras having connections 151 and 1352, respectively, to a multi-camera interface 1353. The monitors 110 and 1320 have corresponding connections 111 and 1321, respectively, to the monitor module 160. A facing detector module 1354 analyzes images from the two cameras 150, 1315 to determine which of the two cameras (and, by extension, which of the two corresponding monitors) a user likely faces. A further extension of the workstation 1300 could include multiple cameras and a facing detection module having even greater sophistication. With such a workstation, a user could face any one of a bank of monitors, several monitors wide and several monitors high (not shown). The facing detection module of such a workstation could manage determination of the user's facing by detecting whether the user faces a higher or lower rank of monitors, and whether the user faces a center, right, or left monitor column.
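The arbitration performed by a multi-camera facing detector such as module 1354 could be as simple as comparing how frontal the detected face appears to each camera. The sketch below is purely illustrative (the per-camera "frontalness" scores in [0, 1] and the threshold are invented for the example; the patent does not specify this scoring):

```python
def likely_faced_monitor(frontalness_by_monitor, threshold=0.5):
    """Pick the monitor whose camera sees the most frontal face.
    Returns None when no camera sees a sufficiently frontal face,
    letting the caller keep the previous facing value."""
    monitor, score = max(frontalness_by_monitor.items(),
                         key=lambda kv: kv[1])
    return monitor if score >= threshold else None
```

Returning None when every score falls below the threshold matches the alternative described above for step 226, in which Current_Facing is left unchanged until the user clearly faces one of the monitors.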
FIG. 14 shows a state transition diagram 1400 indicating the manner in which the computer 130 determines when to save or recall cursor positions associated with a particular monitor, S[n], for a workstation having M monitors, 1 <= n <= M. The workstation may have one or more sensors to detect facing, e.g., camera 150 and/or camera 1315. The state transition diagram 1400 depicts eight states 1411-1414 and 1421-1424, with a "current" state that corresponds to exactly one of these states at a given time. An independent "current" state can correspond to each of the monitors 1...M, with the state transition diagram effectively being replicated for each of the M current states.
When the workstation is first in use, the computer 130 will initialize the "current" state to one of the four bold-circle states 1411-1414, forming meta-state 1410, which constitutes a state where there exists no previous cursor position stored for the corresponding monitor S[n]. The four different states correspond to the possible combinations in which the current facing and current cursor position do or do not correspond to the particular monitor S[n]. As one, or the other, or both of the current facing and current cursor position changes, a transition between states occurs, according to the labeled transitions.
A second meta-state 1420 consists of the four non-bold-circle states 1421-1424. In each of meta-states 1410 and 1420, the top two circles represent the situation where the cursor appears on the monitor S[n], and the bottom two circles represent the situation where the cursor appears on another monitor. The left two circles represent the situation where the user faces monitor S[n], and the right two circles represent the situation where the user faces another monitor (or, in some embodiments, as previously discussed, merely faces somewhere other than the screen of the monitor S[n]).
Within each meta-state, a transition from one of the top circles to one of the bottom circles occurs whenever the cursor appeared on the screen of the monitor S[n], but moves to the screen of a different monitor. Conversely, a transition from one of the bottom circles to one of the top circles occurs whenever the cursor moves to the screen of the monitor S[n] from the screen of any other monitor. In the state diagram corresponding to monitor S[n], no transition occurs merely because the cursor moves from some monitor different from monitor S[n] to some other monitor different from the monitor S[n].
Similarly, within each meta-state, a transition from one of the left circles to one of the right circles occurs whenever the user previously faced the monitor S[n], but turns to face a screen of a different monitor, with one exception. Conversely, a transition from one of the right circles to one of the left circles occurs whenever the user turns to face the monitor S[n] after facing the screen of another monitor. In the state diagram corresponding to monitor S[n], no transition occurs merely because the user changes facing between the screens of monitors neither of which constitutes S[n].
The one exception noted above occurs during the state 1411, where the facing and cursor position are both on monitor S[n] (i.e., the Same_Screen flag would be set to "true" at step 224), when the user moves to face away from the monitor S[n]. Under such circumstances, a transition occurs between meta-states 1410 and 1420, and more specifically from the state 1411 to the state 1423. When this particular transition occurs, the computer 130 saves the current cursor position in correspondence with the monitor S[n] (or, as described previously, a cursor position selected from a recent history of cursor positions). This transition between meta-states occurs in a single direction: Once a saved cursor position exists for the monitor S[n], all subsequent states lie within the meta-state 1420. A similar transition and action occurs between states 1421 and 1423, but since a cursor position already exists for the monitor S[n], the computer 130 merely updates that position to the current cursor position (or, as above, a recent cursor position).
While in meta-state 1420, that is, while a cursor position is saved in correspondence with monitor S[n], any change that results in the user's facing becoming associated with the monitor S[n], where the facing did not already correspond to the monitor S[n], produces the collateral action of the cursor position associated with monitor S[n] being recalled. The meta-state 1420 now transitions to the state 1421. In particular, regardless of whether or not the cursor appeared on the monitor S[n], if the user turns to re-face the monitor S[n] and the stored cursor position corresponds to the monitor S[n], then the computer 130 will recall and then apply that cursor position. As a result, the cursor returns to the monitor S[n], in the previously saved position. Even if the cursor already appeared on monitor S[n], application of the recalled cursor position will cause the cursor to jump to the saved position.
Consider the following example as illustrative of this condition. When the current state corresponds to state 1412, a transition occurs if the facing changes at all, because the user previously faced the monitor S[n], but now a change has occurred and the user no longer faces that monitor. Whether a transition occurs to state 1413 or state 1414 depends on whether or not the cursor simultaneously changed to the monitor S[n]. When the user again faces toward the monitor S[n], or the cursor appears on the monitor S[n], the transitions among the states occur when the cursor changes, or the facing changes, or both.
The component states 1411-1414 of meta-state 1410 have similarities to the corresponding component states 1421-1424 of meta-state 1420, which, for the most part, leads to the same transitions connecting them, with the following four exceptions:
The first exception exists for the transition from the state 1411 to the state 1423, where the facing and cursor position correspond to the same monitor (S[n]) but then the facing changes. Such a transition represents the first transition that causes saving of the cursor position for that particular monitor (S[n]) and constitutes the only transition between meta-states 1410 and 1420. In one embodiment, "the facing changes" may reflect that the user now faces toward a different monitor, i.e., S[j] where 1 <= j <= M and j ≠ n. In another embodiment, "the facing changes" may mean that the user no longer faces toward the monitor S[n], in which case turning away from the monitors altogether counts as a transition. This transition from state 1411 to state 1423 bears the designation as being triggered by a "facing change" but also identifies a particular action the transition triggers, namely, that the computer 130 now stores the current cursor position as corresponding to the monitor S[n].
The second exception is analogous to the 1411-to-1423 transition, where 1421-to-1423 also results from the user turning away from monitor S[n] while the cursor position resides on S[n]. Here, though, rather than a first-time cursor position being recorded for monitor S[n], the cursor position associated with monitor S[n] is instead updated.
The third exception, the transition from 1413 to 1412, and the fourth exception, the transition from 1414 to 1412, are exceptions for the same reason. Both relate to situations where the facing returns to monitor S[n] while the cursor is on, or moves to, a different monitor; but in meta-state 1420, whenever facing returns to monitor S[n], the transition is to state 1421, because when facing returns to monitor S[n], the stored cursor position is recalled and applied.
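The save and recall behavior of diagram 1400 can be condensed into a small per-monitor handler. The Python sketch below is illustrative only (the class shape and argument names are assumptions; it tracks meta-state 1410 versus 1420 implicitly via whether a position has been saved):

```python
class MonitorState:
    """Per-monitor save/recall sketch of state diagram 1400.
    saved_pos is None while still in meta-state 1410; the save
    transition 1411->1423 (and its updating twin 1421->1423) fires when
    the user turns away from S[n] while the cursor sits on S[n], and any
    return of the facing to S[n] while a position is saved recalls it
    (the transitions into state 1421)."""

    def __init__(self, monitor_id):
        self.monitor_id = monitor_id
        self.saved_pos = None        # None == still in meta-state 1410

    def on_change(self, facing, prev_facing, prev_cursor_monitor, cursor_pos):
        """Handle one facing/cursor change for monitor S[n]; returns a
        position to which the cursor should jump, or None."""
        turned_away = (prev_facing == self.monitor_id
                       and facing != self.monitor_id)
        turned_back = (prev_facing != self.monitor_id
                       and facing == self.monitor_id)
        if turned_away and prev_cursor_monitor == self.monitor_id:
            # 1411 -> 1423 (first save) or 1421 -> 1423 (update).
            self.saved_pos = cursor_pos
        elif turned_back and self.saved_pos is not None:
            # 1423/1424 -> 1421: recall and apply the saved position.
            return self.saved_pos
        return None
```

A workstation with M monitors would keep one such object per monitor, mirroring the replication of the state diagram for each of the M current states.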
FIG. 15 shows a flowchart for a simplified multi-screen management process 1500 that saves and recalls cursor positions, beginning during step 1501, wherein the user already faces a first monitor on which the cursor resides (corresponding to either of states 1411 and 1421 of FIG. 14). During step 1502 of FIG. 15, the facing detector module 153 (or a similar such device) determines that the user has turned away from the first monitor on which the cursor resides. During step 1503, the computer 130 saves the first position of the cursor on the first monitor in a saved cursor position memory 1510. Steps 1502 and 1503 together correspond to the transitions from either of states 1411 or 1421 to the state 1423. During step 1504, the facing detector module 153 determines that the user again faces the first monitor, regardless of where the cursor appears. During step 1505, the computer 130 recalls the first position of the cursor from the memory 1510 and applies that position so that the cursor returns to the first position on the first monitor. Steps 1504 and 1505 together correspond to the transitions from either of states 1423 and 1424 to the state 1421. Process 1500 concludes during step 1506, but the process typically gets repeated many times.
FIG. 16 shows a flowchart for a similar process 1600 that further tracks and restores the user interface status of the various applications, such as applications 191 and 192. Steps 1601-1603 and the memory 1610 of FIG. 16 correspond to the steps 1501-1503 and memory 1510, respectively, of FIG. 15. During step 1604, the computer 130 stores a first status of a first application having the focus of the GUI in the memory 1610. Considering that different operating systems use different terminology and different window management paradigms, this first status may comprise any of: a window of the first application comprising the main window (e.g., the window containing the active document for the application), a window of the first application comprising the key window (i.e., the window of the first application designated to receive keyboard events), and whether or not the first application, or one of its windows, runs in a full screen mode. In some embodiments, the status of the first application may also comprise the state of certain controls, for example, whether media plays or not. In some embodiments, the computer 130 saves the first status only if at least one of the active windows or the menu bar for the first application appears on the first monitor. During step 1605 of FIG. 16, the facing detector module 153 of FIG. 1 determines that the user again faces the first monitor, as during step 1504.
During step 1606, the computer 130 recalls the first cursor position from the memory 1610 and applies that information, thereby restoring the cursor to the saved position, as during step 1505. During step 1607, the computer recalls the first status of the first application and applies that information, thereby returning the first application to a similar status. Process 1600 concludes during step 1608, but gets repeated many times.
Process 1600 has particular usefulness in conjunction with operating systems that do not typically dispatch mouse events (e.g., mouse move events) to the non-foremost application unless the user clicks the mouse there. In some embodiments, the status saved during step 1604 may merely reflect the foremost application. During step 1605, if that same application has not yet become foremost, restoring the focus may merely require that the application become foremost again, which, in some embodiments, can occur by issuing a mouse-click event (as if the user had clicked the mouse 140). Alternatively, the computer 130 can generate another signal that induces the application of a window or control under the cursor's current position (restored during step 1606) to become active.
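Process 1600 can be sketched by extending the cursor-recall idea to also save and restore the focused application's status per monitor. Again, every name here is illustrative: `AppStatus` is a hypothetical stand-in for the "first status" of step 1604 (main window, key window, full-screen flag), and the dictionary plays the role of the memory 1610.

```python
from dataclasses import dataclass

@dataclass
class AppStatus:
    # Hypothetical stand-in for the saved application status of step 1604.
    main_window: str = ""
    key_window: str = ""
    full_screen: bool = False

class FocusRecallManager:
    """Sketch of process 1600: per monitor, save the cursor position and
    the focused application's status when the user turns away, and restore
    both when the user returns (steps 1606-1607)."""

    def __init__(self):
        self.memory = {}        # monitor -> (cursor, app name, status); "memory 1610"
        self.cursor = (0, 0)
        self.focused_app = None
        self.app_status = {}    # app name -> AppStatus
        self.facing = None

    def face(self, monitor):
        if self.facing is not None and self.facing != monitor:
            # Steps 1602-1604: save the cursor and the focused app's status.
            status = self.app_status.get(self.focused_app)
            self.memory[self.facing] = (self.cursor, self.focused_app, status)
        if monitor in self.memory:
            # Steps 1606-1607: restore cursor, focus, and application status.
            self.cursor, self.focused_app, status = self.memory[monitor]
            if status is not None:
                self.app_status[self.focused_app] = status
        self.facing = monitor
```

In a real implementation on an operating system that only dispatches mouse events to the foremost application, the restore step would additionally bring the saved application foremost, for example by synthesizing a mouse-click event at the restored cursor position.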
FIG. 17 shows a flowchart for a simplified multi-screen interface management process 1700 that saves and recalls status corresponding to a screen on a first monitor, beginning during step 1701 wherein the user already faces the first monitor. During step 1702, the facing detector module 153 determines that the user has turned away from the first monitor, at least a portion of which presents a first application having the focus of the GUI. During step 1703, the computer 130 stores a first status corresponding to the first monitor's screen in a status memory 1710. During step 1704, the facing detector module determines that the user again faces the first monitor. During step 1705, the computer 130 recalls the first status from the memory 1710 and applies that information to restore focus to the first application. The process 1700 concludes during step 1706, but typically gets repeated many times.
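Process 1700, which tracks only the focused application per monitor, reduces to a small state machine. As before, the names are illustrative assumptions, not the patent's implementation; the dictionary plays the role of the status memory 1710.

```python
class ScreenFocusManager:
    """Sketch of process 1700: remember which application held the GUI
    focus on a monitor when the user turns away (steps 1702-1703), and
    restore that focus when the user faces the monitor again (steps
    1704-1705). A real system might restore focus by synthesizing a
    mouse-click event; here it is a simple in-memory record."""

    def __init__(self):
        self.memory = {}       # monitor -> focused application; "memory 1710"
        self.foremost = None   # application currently holding GUI focus
        self.facing = None

    def face(self, monitor):
        if self.facing is not None and self.facing != monitor:
            # Steps 1702-1703: leaving the monitor saves the focused app.
            self.memory[self.facing] = self.foremost
        restored = self.memory.get(monitor)
        if restored is not None:
            # Steps 1704-1705: returning restores the focus.
            self.foremost = restored
        self.facing = monitor
```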
The foregoing describes a technique for managing multiple screens of a computer workstation. While the technique has been described above in connection with switching between two different applications, based on the user's focus on a particular monitor, the screen management technique is equally applicable where the first and second applications could represent different instances of the same program or even different instances of the same window.

Claims

CLAIMS

1. A method for controlling a computer workstation operating a plurality of display devices, each display device displaying at least a portion of a user interface for a corresponding computer application for viewing by a user, comprising the steps of:
rendering active at least a first portion of a user interface for a first computer application displayed by a first one of the plurality of display devices upon detecting that the user observes the first display device; and
responsive to detecting that the user has switched from observing the first display device, rendering inactive the first portion of the user interface for the first computer application and rendering active at least a second portion of a user interface for a second computer application displayed by a second one of the plurality of display devices.
2. The method according to claim 1 wherein the step of rendering active the second portion of the user interface for the second computer application includes the step of restoring settings of the second computer application to corresponding settings of the second computer application when last observed by the user.
3. The method according to claim 2 wherein the step of restoring settings includes restoring a cursor position.
4. The method according to claim 3 wherein the step of restoring the cursor position includes the steps of:
saving the cursor position associated with a user's last observation of the second display device; and
recalling the saved cursor position.
5. The method according to claim 3 wherein the step of restoring the cursor position includes the steps of:
saving the cursor position based on a history of past cursor positions prior to the user's last observation of the second display device; and
recalling the saved cursor position.
6. The method according to claim 1 wherein the step of detecting that the user has switched from observing the first display device comprises detecting that the user observes the second display device.
7. The method according to claim 6 wherein the step of detecting that the user observes the second display device further includes the steps of:
capturing an image of the user; and
processing the captured image to determine whether the user gazes at the second display device.
8. The method according to claim 1 further comprising the step of storing a setting of the first computer application in further response to the detecting that the user has switched from observing the first display device.
9. The method according to claim 8 wherein the step of storing the setting of the first computer application comprises saving a status of at least one active window in the first computer application.
10. The method according to claim 8 further comprising the step of restoring the setting of the first computer application upon detecting that the user observes the first display device.
11. A method for controlling a computer workstation operating a plurality of display devices, comprising the steps of:
a) detecting that the user observes a first one of the plurality of display devices;
b) responsive to detecting that the user has switched from observing the first display device, storing a first position of a cursor on the first display device; and
c) responsive to detecting that the user has switched to observing the first display device, restoring the cursor to the first position.
12. The method of claim 11 wherein the first position is the position of the cursor when last observed by the user.
13. The method of claim 11 wherein the first position is based on a history of past cursor positions on the first display device.
14. The method of claim 11 wherein step b) further comprises storing settings of a first computer application active on the first display device, and step c) further comprises restoring the settings of the first computer application.
15. The method of claim 11 wherein step a) comprises the steps of: i) capturing an image of the user; and
ii) processing the image to determine that the user gazes at the first display device.
16. The method of claim 11 wherein step a) comprises detecting that the user does not observe a second one of the plurality of display devices.
17. The method of claim 16 wherein step a) comprises the steps of: i) capturing an image of the user; and
ii) processing the image to determine that the user does not gaze at a second display device.
18. A computer workstation for operating a plurality of display devices, each display device displaying at least a portion of the user interface for a corresponding computer application for viewing by a user, comprising:
means for determining when the user observes a first one of the plurality of display devices and when the user does not observe the first display device; and
a computer responsive to the detecting means for rendering active at least a first portion of a user interface for a first computer application displayed by the first display device when the user observes the first display device and, responsive to the user switching from observing the first display device, rendering inactive the first portion of the user interface for the first computer application and rendering active at least a second portion of a user interface for a second computer application displayed by a second one of the plurality of display devices.
19. The workstation according to claim 18 wherein the computer restores settings of the second computer application to corresponding settings of that second computer application when rendering active the second portion of the user interface for the second computer application.
20. The workstation according to claim 19 wherein one of the restored settings includes a restored cursor position.
21. The workstation according to claim 19 wherein one of the restored settings includes a status of at least one active window in the first computer application.
22. The workstation according to claim 18 wherein the means for detecting the user includes a camera.
23. The workstation according to claim 18 wherein the means for detecting the user includes a plurality of cameras, each mounted on a separate one of the plurality of display devices to detect when the user gazes at that display device.
PCT/US2013/063181 2013-10-03 2013-10-03 Improved multi-screen management WO2015050545A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/US2013/063181 WO2015050545A1 (en) 2013-10-03 2013-10-03 Improved multi-screen management


Publications (1)

Publication Number Publication Date
WO2015050545A1 true WO2015050545A1 (en) 2015-04-09

Family

ID=49382622

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2013/063181 WO2015050545A1 (en) 2013-10-03 2013-10-03 Improved multi-screen management

Country Status (1)

Country Link
WO (1) WO2015050545A1 (en)


Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080120141A1 (en) * 2006-11-22 2008-05-22 General Electric Company Methods and systems for creation of hanging protocols using eye tracking and voice command and control
US20120272179A1 (en) * 2011-04-21 2012-10-25 Sony Computer Entertainment Inc. Gaze-Assisted Computer Interface
US20120324256A1 (en) * 2011-06-14 2012-12-20 International Business Machines Corporation Display management for multi-screen computing environments


Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019152013A1 (en) * 2018-01-31 2019-08-08 Hewlett-Packard Development Company, L.P. Operating user interfaces
US11307762B2 (en) 2018-01-31 2022-04-19 Hewlett-Packard Development Company, L.P. Operating user interfaces
US11989475B2 (en) 2018-10-09 2024-05-21 Hewlett-Packard Development Company, L.P. Selecting a display with machine learning
WO2024040861A1 (en) * 2022-08-25 2024-02-29 湖北星纪魅族科技有限公司 Operation permission control method and system, and electronic device and storage medium


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 13777414

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 13777414

Country of ref document: EP

Kind code of ref document: A1