WO2015167531A2 - Cursor grip - Google Patents

Cursor grip

Info

Publication number
WO2015167531A2
WO2015167531A2 (PCT/US2014/036173)
Authority
WO
WIPO (PCT)
Prior art keywords
cursor
grip
movement
processor
touch screen
Prior art date
Application number
PCT/US2014/036173
Other languages
English (en)
Other versions
WO2015167531A3 (fr)
Inventor
Norman P. Brown
Robert J. Lockwood
John R. BUCK
Original Assignee
Hewlett-Packard Development Company, L.P.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hewlett-Packard Development Company, L.P.
Priority to PCT/US2014/036173
Publication of WO2015167531A2
Publication of WO2015167531A3

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481: Interaction techniques based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/04817: Interaction techniques using icons
    • G06F 3/04812: Interaction techniques based on cursor appearance or behaviour, e.g. being affected by the presence of displayed objects
    • G06F 3/0487: Interaction techniques using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488: Interaction techniques using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04883: Interaction techniques for inputting data by handwriting, e.g. gesture or text

Definitions

  • Figure 1 is a schematic diagram of an example touch screen cursor interface system.
  • Figure 2 is a flow diagram of an example method that may be carried out by the system of Figure 1.
  • Figure 3 is a flow diagram of another example method that may be carried out by the system of Figure 1.
  • Figure 4 is a front view of an example touch screen illustrating reciprocating movement of an example cursor and an example cursor grip.
  • Figure 5 is a front view of the example touch screen of Figure 4 illustrating changing of parameters of the cursor grip in response to the reciprocating movement.
  • Figure 6 is a flow diagram of another example method that may be carried out by the system of Figure 1.
  • Figure 7 is a front view of an example touch screen illustrating misregistration of an example cursor with respect to a target.
  • Figure 8 is a front view of the touch screen of Figure 7 illustrating changing of parameters of an example cursor grip in response to registration of the cursor with the target.
  • Figure 9 is a flow diagram of another example method that may be carried out by the system of Figure 1.
  • Figure 10 is a flow diagram of an example method that may be carried out by the system of Figure 1.
  • Figure 11 is a front view of an example touch screen illustrating an example of interaction with an example cursor grip associated with an example cursor.
  • Figure 12 is a front view of the touch screen of Figure 11 illustrating changing of parameters of the cursor grip in response to identification of an input to the cursor as determined from the interaction.
  • Figure 13 is a schematic diagram of an example implementation of the touch screen cursor interface system of Figure 1.
  • Figure 14 is a front view of an example touch screen of the system of Figure 13 illustrating an example cursor grip with an associated example cursor.
  • Figure 15 is a flow diagram of an example method for identifying inputs to a cursor based upon interactions with a cursor grip.
  • FIG. 1 schematically illustrates an example touch screen cursor interface system 20.
  • touch screen cursor interface system 20 facilitates precise positioning of a touch screen cursor with respect to graphical user interfaces on a touch screen.
  • touch screen cursor interface system 20 provides a person with feedback regarding the current state of the cursor, providing the person with confidence with regard to the person's intended input using the cursor.
  • Touch screen cursor interface system 20 comprises touch screen 22 and controller 23 comprising processor 24 and memory 26.
  • Touch screen 22 comprises a touch-sensitive display that presents visible information, such as graphics, text and graphical user interfaces.
  • Touch screen 22, in response to being touched by a person's finger(s), by a stylus or by other mechanisms, outputs signals representing or indicating input such as selections, commands or the like.
  • Touch screen 22 may employ any of a variety of different touch screen technologies including, but not limited to, resistive touch screen sensing, surface acoustic wave sensing, capacitive sensing, surface capacitance sensing, projected capacitive touch sensing, mutual capacitance sensing, self-capacitance sensing, infrared grid sensing, infrared acrylic projection sensing, optical imaging, dispersive signal sensing, and acoustic pulse recognition sensing.
  • touch screen 22 is provided by a display screen or monitor of a computing device such as a desktop computer, a laptop computer, or a notebook computer.
  • touch screen 22 is provided by a touch screen digitizer such as a digitizer which is part of a personal digital assistant, a tablet computer, a flash memory player, a cellular telephone, a smart phone, a digital navigator or various other electronic devices having a display screen which also serves as an input device.
  • touch screen 22 is provided as part of a separate input device or pad distinct from the display screen or monitor of the computing device.
  • Processor 24 comprises one or more processing units which, amongst other functions, generates control signals directing the operation of touch screen 22.
  • the term "processing unit" shall mean a presently developed or future developed processing unit that executes sequences of instructions contained in a memory. Execution of the sequences of instructions causes the processing unit to perform steps such as generating control signals.
  • the instructions may be loaded in a random access memory (RAM) for execution by the processing unit from a read only memory (ROM), a mass storage device, or some other persistent storage.
  • hard wired circuitry may be used in place of or in combination with software instructions to implement the functions described.
  • controller 23 may be embodied as part of one or more application-specific integrated circuits (ASICs). Unless otherwise specifically noted, the controller is not limited to any specific combination of hardware circuitry and software, nor to any particular source for the instructions executed by the processing unit.
  • Processor 24 follows instructions or programming stored in memory 26.
  • Memory 26 comprises a non-transitory computer-readable medium containing programming or code to direct processor 24 in the generation of control signals for controlling the operation of touch screen 22.
  • Memory 26 comprises display module 30, cursor module 32 and cursor grip module 34.
  • Display module 30 comprises computer-readable programming or code which direct processor 24 in the presentation of graphics 38, text 40 and graphical user interfaces 42 by touch screen 22.
  • Cursor module 32 comprises computer-readable programming or code which directs processor 24 in the generation and positioning or movement of cursor 46 on touch screen 22. Cursor 46, also referred to as a pointer, facilitates a selection of graphics 38, text 40 or graphical user interfaces 42 upon touch screen 22.
  • By positioning cursor 46 into contact with graphics 38, text 40 or graphical user interfaces 42, such graphics, text or graphical user interfaces may be dragged or moved, may be resized, or may be selected so as to cause responsive action such as the presentation of a dropdown menu, the opening of a file, or the generation of other inputs.
  • While cursor 46 is illustrated as an arrowhead, in other implementations, cursor 46 may have other sizes, shapes and configurations.
  • cursor 46 may comprise a vertical bar, such as when text is to be input at a selected location upon touch screen 22. In some implementations, cursor 46 may change configurations depending upon the location of cursor 46 or the displayed item/icon in contact with cursor 46.
  • cursor 46 may be sized such that it is difficult for a person to interact with cursor 46 through touch on touch screen 22.
  • the person's fingers may be too large to precisely contact and move cursor 46.
  • the visible items upon touch screen 22 (graphics 38, text 40 and graphical user interfaces 42) may be sized such that it is difficult to precisely position cursor 46 with respect to a particular visible item on touch screen 22. As a result, pixel-accurate targeting through touch using cursor 46 is often difficult.
  • Cursor grip module 34 comprises computer-readable programming or code to direct processor 24 to generate control signals to cause touch screen 22 to visibly present cursor grip 50 (schematically illustrated).
  • Cursor grip 50 comprises a graphical region linked or associated with cursor 46 and sized larger than cursor 46. Cursor grip 50 is configured to be interacted with by a finger or fingers of a person so as to manipulate cursor 46 as well as provide input with respect to any displayed item (graphics 38, text 40 or graphical user interfaces 42) contacted by cursor 46.
  • cursor grip 50 is sized larger than cursor 46 and is linked to cursor 46 such that movement of cursor 46 is linked to movement of cursor grip 50.
  • cursor grip 50 facilitates manipulation of the smaller cursor 46.
  • a person may more easily locate his or her finger or fingers upon cursor grip 50, as compared to cursor 46, so as to move cursor 46 by moving his or her fingers while in contact with cursor grip 50.
  • with cursor grip 50, a person may more easily use his or her fingers to interact with cursor grip 50 so as to input commands with regard to the displayed item presently selected by cursor 46.
  • a person may more easily tap cursor grip 50, as compared to tapping the smaller cursor 46, when inputting a left click or left mouse button down input.
  • cursor grip 50 comprises an annular ring extending about cursor 46.
  • cursor grip 50 may comprise a circle, an oval or a polygon about and surrounding cursor 46.
  • cursor grip 50 may comprise a shape extending from cursor 46 or visibly or invisibly tethered to cursor 46.
  • cursor grip 50 comprises transparent portions, facilitating viewing of graphics 38, text 40 or graphical user interfaces 42 which underlie cursor grip 50.
  • cursor grip 50 comprises a transparent shape, wherein the perimeter defining the profile of the shape is opaque or translucent.
  • selected portions of cursor grip 50 are transparent or translucent.
  • cursor grip 50 is opaque.
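As a concrete sketch of how touches might be associated with an annular cursor grip, the following hypothetical hit test checks whether a touch point lands on the ring about the cursor. The function name, coordinates and radii are illustrative assumptions, not part of the application:

```python
import math

def touch_hits_grip(touch_x, touch_y, grip_x, grip_y,
                    inner_radius, outer_radius):
    """Return True when a touch point lands on an annular cursor grip.

    (grip_x, grip_y) is the grip center, coinciding with the cursor
    hot spot; the visible ring occupies the band between inner_radius
    and outer_radius. An implementation could instead accept any touch
    within outer_radius, since interior taps are also described.
    """
    distance = math.hypot(touch_x - grip_x, touch_y - grip_y)
    return inner_radius <= distance <= outer_radius
```

Treating only the ring band as the grab region is one design choice; accepting the whole disk makes the target even larger for imprecise fingers.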
  • cursor grip module 34 additionally directs processor 24 to adjust or change one or more parameters of cursor grip 50 based upon one or more dynamic characteristics or the current or present state of cursor 46.
  • cursor grip module 34 changes one or more parameters of cursor grip 50 based upon (1) the present detected positional state of cursor 46 such as an extent to which cursor 46 is registered with, aligned with or in contact with a proximate displayed item such as graphics 38, text 40 or graphical user interface 42; (2) a particular operational state of cursor 46 such as whether cursor 46 is in a select state, a drag state, a left click state, a right-click state, a left double-click state, a right double-click state and the like; and/or (3) a detected motion state such as a detected reciprocating, back-and-forth movement of cursor 46.
  • Examples of the parameters of cursor grip 50 that are changed by processor 24, based upon the instructions provided by cursor grip module 34, include, but are not limited to: (1) a visible parameter of cursor grip 50 such as (A) the color of cursor grip 50; (B) the brightness of cursor grip 50; (C) the flashing frequency of cursor grip 50; (D) the shape of cursor grip 50; (E) the opacity or translucency of cursor grip 50; (F) the size of cursor grip 50; and (2) an operational parameter of cursor grip 50 such as (A) the responsiveness of cursor grip 50, the rate at which cursor grip 50 and cursor 46 move on touch screen 22 in response to movement of a person's fingers across touch screen 22; and/or (B) a magnification state of cursor grip 50 or of regions of touch screen 22 about or relative to cursor grip 50.
  • By automatically switching cursor grip 50 to a magnification or hysteresis operational state, system 20 facilitates easier precise positioning of cursor 46. By automatically changing a visible parameter of cursor grip 50 based upon the state of cursor 46, system 20 provides a person with instantaneous and intuitive feedback as to whether the person's interactions with cursor grip 50 have been recognized and correctly interpreted as input by system 20.
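One plausible way to realize the state-dependent appearance described above is a simple lookup from cursor state to the grip's visible parameters. The state names, colors and dictionary layout below are illustrative assumptions, not details disclosed in the application:

```python
# Hypothetical mapping from the cursor's current state to visible
# parameters of the cursor grip; all names/values are assumptions.
GRIP_APPEARANCE = {
    "out_of_registration": {"color": "pale_blue", "flashing": False},
    "in_registration":     {"color": "dark_blue", "flashing": False},
    "right_click":         {"color": "dark_blue", "flashing": True},
    "drag":                {"color": "green",     "flashing": False},
}

def grip_parameters_for(cursor_state):
    """Look up the grip's visible parameters for a cursor state,
    falling back to the out-of-registration appearance."""
    return GRIP_APPEARANCE.get(cursor_state,
                               GRIP_APPEARANCE["out_of_registration"])
```

A table-driven mapping like this keeps the feedback policy in one place, so new cursor states or appearance changes need only a new entry.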
  • FIG. 2 is a flow diagram of an example method 100 that may be carried out by system 20.
  • cursor module 32 directs processor 24 to display cursor 46 on touch screen 22 relative to one or more displayed items such as graphics 38, text 40 and graphical user interfaces 42.
  • cursor grip module 34 directs processor 24 to display cursor grip 50 on touch screen 22.
  • cursor grip module 34 retrieves, obtains or otherwise determines the present state of cursor 46. For example, cursor grip module 34 may retrieve such information from cursor module 32, from processor 24 and/or from touch screen 22.
  • cursor grip module 34 directs processor 24 to change or adjust one or more parameters of cursor grip 50 based upon the present state of cursor 46.
  • cursor grip module 34 changes one or more parameters of cursor grip 50 based upon (1) the present detected positional state of cursor 46 such as an extent to which cursor 46 is registered with, aligned with or in contact with a proximate displayed item such as graphics 38, text 40 or graphical user interface 42; (2) a particular operational state of cursor 46 such as whether cursor 46 is in a select state, a drag state, a left click state, a right-click state, a left double-click state, a right double-click state and the like; and/or (3) a detected motion state such as a detected reciprocating, back-and-forth movement of cursor 46.
  • Examples of the parameters of cursor grip 50 that are changed by processor 24, based upon the instructions provided by cursor grip module 34, include, but are not limited to: (1) a visible parameter of cursor grip 50 such as the color, brightness, flashing frequency and size of cursor grip 50; and (2) an operational parameter of cursor grip 50 such as (A) the responsiveness of cursor grip 50, the rate at which cursor grip 50 and cursor 46 move on touch screen 22 in response to movement of a person's fingers across touch screen 22, and/or (B) a magnification state of cursor grip 50 or of regions of touch screen 22 about or relative to cursor grip 50.
  • Figures 3-5 illustrate one example mode of operation of system 20.
  • Figure 3 is a flow diagram of an example method 200 in which an operational parameter of cursor grip 50 is adjusted based upon a detected motion state of cursor 46.
  • Figure 4 illustrates a detected motion state of cursor 46 on touch screen 22.
  • Figure 5 illustrates the changing of an operational parameter of cursor grip 50 in response to the detected motion state of cursor 46 shown in Figure 4.
  • cursor grip module 34 receives information or signals indicating motion or movement of cursor 46.
  • the movement of cursor 46 is inferred from the determined movement of cursor grip 250.
  • Cursor grip module 34 uses such information or signals to determine or identify reciprocal, back-and-forth movement of cursor 46.
  • Figure 4 illustrates reciprocal movement of cursor 46 and an example cursor grip 250 in the directions indicated by arrows 252 on touch screen 22.
  • cursor grip 250 comprises a circle completely surrounding or enclosing cursor 46.
  • cursor grip 250 is transparent or translucent but for the outer line or ring forming the periphery of cursor grip 250. Movement of cursor 46 in Figure 4 is facilitated by a person touching any part of cursor grip 250 and sliding his or her finger or fingers along the surface of touch screen 22, which results in movement of cursor grip 250 as well as cursor 46.
  • Such reciprocal movement is made in an attempt to register, align or make contact between cursor 46 and the displayed item 236 comprising either a graphic 38, text 40 or a graphical user interface 42.
  • even with the use of cursor grip 250, precise positioning of cursor 46 in contact with, in alignment with, or in registration with the displayed item 236 may be difficult.
  • cursor grip module 34 adjusts at least one parameter of cursor grip 250 based upon any determined reciprocating movement of cursor 46.
  • cursor grip module 34 compares any identified reciprocating, back-and-forth movement of cursor 46 to a predefined threshold to determine whether an operational parameter of cursor grip 250 should be adjusted.
  • module 34 adjusts the one or more operational parameters of cursor grip 250 in response to reciprocation of cursor 46 occurring above a predefined frequency.
  • module 34 adjusts the one or more operational parameters of cursor grip 250 in response to a direction of movement of cursor 46 changing after traveling a distance less than a predefined threshold distance.
  • cursor grip module 34 changes an operational parameter of cursor grip 250 in response to detecting other movement characteristics of cursor 46 which indicate that the person is attempting to precisely locate cursor 46 into contact or in registration with a displayed item or a particular location on touch screen 22.
  • cursor movements are slowed in response to very small back-and-forth movements.
  • cursor grip module 34 adjusts an operational parameter of cursor grip 250: the movement rate of cursor grip 250.
  • cursor grip module 34 uses a hysteresis algorithm to apply very small movements to assist in the positioning of cursor 46.
  • cursor grip module 34 saves the previous eight touch positions and timestamps in a circular queue to determine if the person is trying to precisely position cursor 46.
  • other hysteresis methods or algorithms may be utilized to adjust the rate at which cursor 46 is moved, or the number of pixels that cursor 46 is moved, in response to dragging or sliding movement of a person's finger or fingers across touch screen 22.
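The hysteresis described above (a circular queue of the previous eight touch positions and timestamps, used to decide whether the person is trying to precisely position the cursor and, if so, to slow cursor movement) can be sketched as follows. The class name, travel threshold and slow-down factor are assumptions for illustration; only the eight-entry circular queue comes from the text:

```python
import math
from collections import deque

class CursorHysteresis:
    """Sketch of the described hysteresis: keep the previous eight
    touch positions and timestamps in a circular queue, and slow cursor
    movement when short back-and-forth motion suggests the person is
    precisely positioning the cursor. Thresholds are assumptions."""

    def __init__(self, max_travel=12.0, slow_factor=0.25):
        self.history = deque(maxlen=8)   # circular queue of (x, y, t)
        self.max_travel = max_travel     # px; small spread => precise mode
        self.slow_factor = slow_factor   # reduced movement rate

    def record(self, x, y, t):
        """Append a touch sample; the deque discards the oldest entry."""
        self.history.append((x, y, t))

    def precise_positioning(self):
        """True when the last eight touches stay within a small region."""
        if len(self.history) < self.history.maxlen:
            return False
        xs = [p[0] for p in self.history]
        ys = [p[1] for p in self.history]
        travel = math.hypot(max(xs) - min(xs), max(ys) - min(ys))
        return travel <= self.max_travel

    def scaled_delta(self, dx, dy):
        """Scale a finger movement into a (possibly slowed) cursor move."""
        f = self.slow_factor if self.precise_positioning() else 1.0
        return dx * f, dy * f
```

With this structure, tiny back-and-forth jitter near a target shrinks each cursor step to a quarter of the finger movement, while large sweeps across the screen pass through at full rate.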
  • in response to detecting such reciprocating movement of cursor 46, and in response to the reciprocating movement satisfying a predefined criterion or predefined threshold serving as a basis for concluding that the person is attempting to precisely locate cursor 46 with respect to a displayed target item 236 (shown in Figure 4), cursor grip module 34 automatically switches or actuates to a magnification mode or state. In one implementation, cursor grip module 34 automatically switches to a magnification mode in which the interior contents of cursor grip 250 are enlarged or magnified.
  • cursor grip module 34 automatically switches to a magnification mode in which the interior contents of cursor grip 250 are enlarged or magnified and in which one or more regions of touch screen 22 adjacent to or about cursor grip 250 are also enlarged or magnified. For example, a ring of a certain radius about the periphery of cursor grip 250 is also enlarged or magnified.
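A minimal sketch of such a magnification mode, assuming a uniform zoom centered on the grip: points within a chosen radius of the grip center are displaced outward by a zoom factor, enlarging the grip's interior contents, while points outside are left unchanged. The zoom factor and the hard cutoff at the radius are assumptions (a real implementation would likely blend toward the edge, fisheye-style, to avoid a visual seam):

```python
import math

def magnify_about_grip(px, py, grip_x, grip_y, radius, zoom=2.0):
    """Map a screen point under magnification centered on the grip.

    Points within `radius` of (grip_x, grip_y) are pushed outward by
    `zoom`, enlarging the interior; points outside are unchanged.
    """
    dx, dy = px - grip_x, py - grip_y
    if math.hypot(dx, dy) <= radius:
        return grip_x + dx * zoom, grip_y + dy * zoom
    return px, py
```

Applying this transform to the rendered content inside the grip yields the proportionally enlarged cursor and partially enlarged target described for Figure 5.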
  • Figure 5 illustrates an example in which cursor grip module 34, in response to the detected reciprocal motion of cursor 46 (shown in Figure 4) satisfying a predefined criteria, adjusts both the movement rate and magnification operational parameters of cursor grip 250 of Figure 4.
  • cursor grip module 34 directs processor 24 to proportionally enlarge cursor grip 250 such that cursor grip 250' is presented on touch screen 22. All the contents within cursor grip 250' are also proportionally enlarged.
  • touch screen 22 displays an enlarged cursor 46' along with a partially enlarged target item 236. Those displayed items outside of cursor grip 250' remain the same size as previously displayed in Figure 4.
  • cursor grip module 34 also slows the rate at which cursor 46 is moved in relation to the extent to which the person's finger(s) is/are moved across touch screen 22 while in contact with cursor grip 250'.
  • movement of cursor 46 in Figure 4 prior to adjustment, may have occurred at a rate of x pixels per y amount of movement of a person's finger or fingers across touch screen 22 while contacting cursor grip 50.
  • movement of cursor 46' may occur at a rate of x/2, x/4 or x/6 pixels per y amount of movement of a person's finger or fingers across touch screen 22 while contacting cursor grip 250'.
  • While Figure 5 illustrates both the movement rate of block 206 and the magnification of block 208 being adjusted, in other implementations, only one of either the movement rate or the magnification may be adjusted in response to a detected reciprocating movement of cursor 46 satisfying a predefined criterion.
  • Figures 6-8 illustrate another example mode of operation of system 20 which may be utilized independent of or in conjunction with the mode described above with respect to Figures 3-5.
  • Figure 6 is a flow diagram of an example method 300 in which a visible parameter of cursor grip 50 is adjusted based upon a detected positional state of cursor 46.
  • Figure 7 illustrates cursor 46 on touch screen 22 while out of alignment or not in registration with any displayed item, such as target display item 336.
  • Figure 8 illustrates the changing of the visible parameter of cursor grip 250 in response to the detected alignment or registration of cursor 46 with the target display item 336.
  • cursor grip module 34 receives signals indicating the current position of cursor 46. Cursor grip module 34 utilizes such signals to determine the present cursor location. As indicated by block 304, cursor grip module 34 (shown in Figure 1) further receives signals identifying the present location of surrounding display items, such as proximate graphic items 38, text items 40 or graphical user interface items 42. Cursor grip module 34 compares the determined cursor location with the possible target locations. As indicated by block 306, based upon the determined proximity of the present location of cursor 46 with respect to the closest target location, cursor grip module 34 adjusts a parameter of cursor grip 250. In one implementation, cursor grip module 34 automatically adjusts a visible parameter of cursor grip 250.
  • cursor grip module 34 changes one or more visible parameters of cursor grip 250 in a binary fashion depending upon whether cursor 46 is in registration with or out of registration with a target location (the location of the displayed item closest to cursor 46). For example, cursor grip module 34 may switch cursor grip 250 from a first color to a second color once cursor 46 is in registration with the target location. In another implementation, cursor grip module 34 continuously or in a stepped fashion adjusts a visible parameter of cursor grip 250 depending upon a determined degree of proximity of cursor 46 to the target location.
  • cursor grip module 34 continuously adjusts the shade of a color of cursor grip 250 as cursor 46 is moved closer and closer and ultimately into registration with the target location (the nearest displayed item at any particular moment in time). For example, cursor grip module 34 may change the color of cursor grip 250 from a pale blue to a light blue to a dark blue shade as cursor 46 approaches and ultimately registers with the target location.
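The stepped color feedback just described can be sketched as a simple distance-to-shade mapping. The registration and proximity radii are assumptions; the pale/light/dark blue progression follows the example in the text:

```python
def grip_color_for_distance(distance, register_radius=4.0,
                            near_radius=60.0):
    """Sketch of stepped color feedback: the grip darkens as the
    cursor nears the closest target and turns dark blue once in
    registration. Radii (in pixels) are illustrative assumptions."""
    if distance <= register_radius:
        return "dark_blue"    # cursor registered with the target
    if distance <= near_radius:
        return "light_blue"   # approaching the target
    return "pale_blue"        # far from any target
```

A continuous variant could instead interpolate the shade between the two radii rather than stepping through discrete colors.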
  • Figure 7 illustrates an example cursor 46 and cursor grip 250 at a first location on touch screen 22.
  • cursor 46 is not in registration or contact with any of the possible target locations of the displayed selectable items such as graphic item 38, text item 40, graphical user input items 42 or target item 336.
  • Figure 8 illustrates cursor 46 after cursor 46 has been moved into registration with target item 336.
  • cursor grip module 34 changes a visible parameter of cursor grip 250 shown in Figure 7 such that cursor grip 250' is displayed on touch screen 22 in Figure 8.
  • cursor grip 250' is visibly different than cursor grip 250 so as to indicate to the person that he or she no longer needs to reposition cursor 46.
  • cursor grip 250' has a different color or a different shade of the color.
  • cursor grip 250' has a flashing state as compared to the non-flashing state of cursor grip 250 or flashes at a different frequency as compared to the flashing of cursor grip 250.
  • cursor grip 250' has a different degree of brightness as compared to cursor grip 250.
  • cursor grip 250' has a different size and/or a different shape as compared to the size and/or shape of cursor grip 250.
  • the different visible parameter of cursor grip 250' provides the person with positive feedback as to when he or she hits his or her target with cursor 46. Because cursor grip 250' is much larger than cursor 46, the change in the physical characteristics of cursor grip 250' is more noticeable, providing a more conspicuous indication.
  • Figure 9 illustrates another example mode of system 20.
  • Figure 9 is a flow diagram of an example method 400 that may be carried out by system 20.
  • cursor grip module 34 directs processor 24 to adjust a parameter of cursor grip 250 based upon the present cursor input state.
  • cursor grip module 34 determines or identifies the input state of cursor 46.
  • the cursor input state is the command or action input by the person while cursor 46 is at a particular location on touch screen 22.
  • Examples of different input states include, but are not limited to, a left click state, a left mouse button down state, a left button up state, a right click, a right mouse button down state, a right button up state, a double left click state, a double right-click state, a left click movement/drag state and a right click movement/drag state.
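The input states enumerated above map naturally to a simple enumeration type; this hypothetical sketch paraphrases the listed states (the names are assumptions, not a disclosed API):

```python
from enum import Enum, auto

class CursorInputState(Enum):
    """Cursor input states paraphrased from the list above."""
    LEFT_CLICK = auto()
    LEFT_BUTTON_DOWN = auto()
    LEFT_BUTTON_UP = auto()
    RIGHT_CLICK = auto()
    RIGHT_BUTTON_DOWN = auto()
    RIGHT_BUTTON_UP = auto()
    DOUBLE_LEFT_CLICK = auto()
    DOUBLE_RIGHT_CLICK = auto()
    LEFT_DRAG = auto()
    RIGHT_DRAG = auto()
```

Keying the grip's visible parameters off such an enumeration makes the state-to-feedback mapping explicit and exhaustive.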
  • cursor grip module 34 adjusts one or more parameters of cursor grip 250 based upon the cursor input state.
  • the parameters of cursor grip 250 that are changed by processor 24, based upon the instructions provided by cursor grip module 34, include, but are not limited to, a visible parameter of cursor grip 250 such as the color of cursor grip 250, the brightness of cursor grip 250, the flashing frequency of cursor grip 250, the shape of cursor grip 250 and the size of cursor grip 250.
  • the person using system 20 is provided with visible feedback indicating if system 20 properly recognized the input provided for the displayed item selected by or contacted by cursor 46.
  • the person using system 20 is provided with feedback indicating whether system 20 has properly recognized the input provided by the person to cursor grip 250 for implementation with respect to the displayed item or location contacted by cursor 46.
  • Figures 10-12 illustrate an example implementation of method 400.
  • Figure 10 is a flow diagram of method 500, an example implementation of method 400.
  • Figure 11 illustrates interaction with cursor grip 250 so as to change cursor 46 to a new input state for implementation with respect to a displayed target item 536 being contacted by cursor 46.
  • Figure 12 illustrates the change of cursor grip 250 in Figure 11 to cursor grip 250' to indicate the new input state of cursor 46 resulting from the interaction shown in Figure 11.
  • cursor grip module 34 identifies two concurrent touches 570 of cursor grip 250 while cursor 46 is in registration with the target display item 536.
  • touches 570 may comprise two fingertips of a person simultaneously or concurrently tapping or contacting an interior region of cursor grip 250.
  • cursor grip module 34 recognizes such concurrent touches as input requesting a right click or a right mouse button down action.
  • cursor grip module 34 causes processor 24 to output a right click signal, which results in the action associated with a right click being taken upon targeted item 536.
  • cursor grip module 34 indicates the recognized right click input or request by changing one or more visible parameters of cursor grip 250 (shown in Figure 11) to form cursor grip 250' shown in Figure 12.
  • cursor grip 250 is changed to cursor grip 250' immediately after or during receipt of such recognized concurrent touch input, but prior to carrying out the action associated with a right-click upon displayed target item 536.
  • cursor grip 250 is changed to cursor grip 250' as the action associated with a right-click upon displayed target item 536 is being carried out.
  • Examples of the parameters of cursor grip 250 that are changed by processor 24, based upon the instructions provided by cursor grip module 34, include, but are not limited to, a visible parameter of cursor grip 250 such as the color of cursor grip 250, the brightness of cursor grip 250, the opacity or translucency of cursor grip 250, the flashing frequency of cursor grip 250, the shape of cursor grip 250 and/or the size of cursor grip 250.
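The one-finger versus two-finger tap distinction described in Figures 11-12 can be sketched as a simple classifier over the concurrent touch points. The function and callback names below are illustrative assumptions, not from the source.

```python
def classify_grip_tap(touch_points, grip_contains):
    """Classify concurrent taps on the cursor grip: one touch inside
    the grip is treated as a left click, two concurrent touches as a
    right click.  `grip_contains` is an assumed hit-test callback that
    reports whether a touch point lies within the grip."""
    inside = [p for p in touch_points if grip_contains(p)]
    if len(inside) >= 2:
        return "right_click"
    if len(inside) == 1:
        return "left_click"
    return None  # no touch landed on the grip
```

On a `"right_click"` result, the system would both emit the right click signal for the targeted item and restyle the grip (250 to 250') to confirm recognition.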
  • FIG. 13 schematically illustrates touch screen cursor interface system 620, an example implementation of touch screen cursor interface system 20.
  • system 620 is incorporated as part of the computing device comprising display 622, processor 624, system memory 625, nonvolatile memory 626, video interface 630, user input interface 632 and system bus 636.
  • Display 622 is similar to touch screen 22 described above.
  • Display 622 comprises a tactile sensitive display that presents visible information, such as graphics, text and graphical user interfaces.
  • Display 622, in response to being touched by a person's finger(s), by a stylus or by other mechanisms, outputs signals representing or indicating input such as selections, commands or the like.
  • Display 622 may employ any of a variety of different touch screen technologies described above.
  • Processor 624 comprises one or more processing units which, amongst other functions, generate control signals directing the operation of display 622 following instructions contained in system memory 625.
  • System memory 625 comprises a non-transitory computer-readable medium in communication with processor 624 via system bus 636.
  • system memory 625 comprises read-only memory (ROM) 640 and random-access memory (RAM) 642.
  • ROM 640 stores a basic input/output system (BIOS) for system 620 such as basic routines regarding the transfer of information between elements within system 620, such as during startup.
  • RAM 642 contains data and/or program modules that are immediately accessible to and/or presently being operated on by processor 624.
  • RAM 642 stores an operating system, application programs, program data and program modules that are executed by processor 624.
  • Nonvolatile memory 626 comprises one or more non-transitory computer-readable media storing programs and data.
  • Memory 626 may comprise a non-movable or a removable persistent storage device.
  • Examples of memory 626 include, but are not limited to, one or more optical disks, one or more magnetic disks, magnetic tape cassettes, flash memory cards, solid-state RAM or solid-state ROM.
  • Video interface 630 interfaces between display 622 and processor 624 of system 620 across system bus 636 with regard to displayed content. Video interface 630 facilitates the display of graphics, text and graphical user interfaces upon display 622.
  • User input interface 632 interfaces between display 622 and the user input components of system 620 across system bus 636 with regard to user-inputted commands or selections.
  • memory 626 contains or stores cursor grip module 34 described above.
  • RAM 642 of system memory 625 also stores cursor grip module 34 as well as cursor module 32 and display module 30.
  • cursor grip module 34 changes or adjusts one or more parameters of a cursor grip based upon a determined state of the cursor associated with the cursor grip.
  • Figure 14 illustrates an example screenshot 700 of display 622 illustrating various displayed items such as graphics 638, text 640 and graphical user interfaces 642.
  • Figure 14 further illustrates an example cursor 646 generated by cursor module 32 and an example cursor grip 650 generated by cursor grip module 34 (described above).
  • cursor grip 650 comprises a semi-transparent or translucent annular ring centered about cursor 646. Cursor grip 650 facilitates viewing of displayed items through cursor grip 650.
  • cursor grip 650 has a radial width sized greater than the thickness of an average person's fingertip. In one implementation, cursor grip 650 has a radial width W of at least 0.25 inches and nominally at least 0.5 inches. In other implementations, cursor grip 650 may have other dimensions. In other implementations, in lieu of being donut-shaped, cursor grip 650 may comprise a completely filled or completely translucent circle containing cursor 646. In still another implementation, cursor grip 650 may comprise a circular ring which is solid or translucent, wherein the interior of the ring is transparent.
  • cursor grip 650 additionally comprises directional cues 670.
  • Directional cues 670 are located along the perimeter of grip 650, indicating the movability of cursor grip 650.
  • directional cues 670 comprise four arrows pointing in opposite directions along two orthogonal axes. In other implementations, additional or fewer such arrows may be provided. In some implementations, cues 670 may have other configurations or may be omitted.
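A donut-shaped grip like the one described can be hit-tested with a simple annulus check: a touch engages the grip when its distance from the cursor falls between the ring's inner radius and the inner radius plus the radial width W. Units and parameter names below are assumptions for illustration.

```python
import math

def touch_in_grip(touch, center, inner_radius, radial_width):
    """Return True if a touch point falls within the annular grip ring
    centered at `center`, i.e. between the inner radius and the inner
    radius plus the grip's radial width W."""
    distance = math.hypot(touch[0] - center[0], touch[1] - center[1])
    return inner_radius <= distance <= inner_radius + radial_width
```

Because the ring's interior can stay transparent and untouched by this test, the user's finger never has to cover the cursor itself.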
  • system 620 facilitates precise positioning of cursor 646 through manual interaction with cursor grip 650.
  • a person may touch a portion of cursor grip 650 and slide his or her finger or fingers across the touch screen of display 622 so as to reposition the centered cursor 646, which moves with the movement of cursor grip 650.
  • registration of cursor 646 with a particular one of the displayed items, such as graphical user interface 642, results in a visible parameter of cursor grip 650 being changed by cursor grip module 34 (as discussed above with respect to Figures 6-8).
  • cursor grip module 34 may additionally change an operational parameter of cursor grip 650 by automatically actuating cursor grip 650 to a magnification state (as discussed above with respect to Figures 3-5) and/or by changing the movement rate of cursor grip 650 and cursor 646 (as discussed above with respect to Figures 3-5).
  • once cursor 646 has been positioned at a desired location, such as in registration with an underlying displayed item, such as graphical user interface item 642, input may also be provided through cursor grip 650.
  • a left click or left mouse button down action input may be provided by the person tapping upon cursor grip 650 with one finger.
  • a right-click or right mouse button down action input may be provided by the person concurrently tapping upon cursor grip 650 with two fingers.
  • one or more visible parameters of cursor grip 650 are further changed by processor 624 based upon the present input state of cursor 646 to provide feedback to the person as to whether his or her input request using cursor grip 650 has been properly recognized by system 620.
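The grip-drag repositioning described above amounts to applying the finger's movement delta to the cursor, so the cursor stays centered in the ring as it is dragged. This is a minimal sketch under that assumption; the function name and tuple representation are illustrative.

```python
def drag_cursor(cursor_pos, touch_start, touch_now):
    """Move the cursor by the same (dx, dy) delta the finger has
    travelled while in contact with the grip, keeping the cursor
    centered within the ring during the drag."""
    dx = touch_now[0] - touch_start[0]
    dy = touch_now[1] - touch_start[1]
    return (cursor_pos[0] + dx, cursor_pos[1] + dy)
```

Applying the delta (rather than snapping the cursor to the finger) is what lets the finger grip the ring's edge while the cursor remains visible and precisely positionable.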
  • Figure 15 is a flow diagram of one example method 800 for recognizing different inputs or input states for cursor 646 based upon manual interactions with cursor grip 650.
  • system 620 is initially in a starting state or clear state.
  • processor 624 determines whether a first finger F1 is contacting or touching (F1 down) any part of cursor grip 650.
  • processor 624 also determines whether a second finger F2 is contacting or touching (F2 down) any part of cursor grip 650 while the first finger F1 is also contacting or touching any part of cursor grip 650.
  • in response to receiving signals indicating that the second finger F2 is contacting or touching cursor grip 650 concurrently with the contacting of cursor grip 650 by first finger F1, processor 624 outputs a right button down signal, actuating cursor 646 to a right button down input state. As a result, particular actions associated with a right button down input state for the particular application being carried out are executed.
  • processor 624 continues to look for signals from display 622 indicating that the second finger F2 has been lifted or withdrawn from cursor grip 650 (F2 up). As indicated by block 812, upon receiving signals from display 622 indicating that the second finger F2 has been withdrawn from cursor grip 650, processor 624 outputs a right button up signal. As a result, particular actions associated with a right button up input state for the particular application being carried out are executed. Thereafter, system 620 returns to block 814.
  • if processor 624 fails to receive signals indicating that the second finger F2 is in contact with cursor grip 650, or determines that the second finger F2 is not in contact with cursor grip 650 in block 806, processor 624 waits for expiration of a predetermined timeout period or window (TO1) before taking action. As indicated by block 816, during this window of time, processor 624 looks for or determines whether or not the primary or first finger F1 is still in contact with cursor grip 650 (F1 up).
  • processor 624 continues to also look for or determine whether or not the second finger F2 has been moved into contact with cursor grip 650 (F2 down) in block 806.
  • processor 624 carries out the series of steps indicated by blocks 808, 810 and 812.
  • if processor 624 determines from signals received from display 622 that the first finger F1 has remained in contact with cursor grip 650 during the entire window of time TO1, that the timeout window TO1 has expired (per decision block 816), and that the second finger F2 was not moved into contact with cursor grip 650 during this same window of time TO1 (per decision block 806), processor 624 actuates cursor 646 to a movement state in which any subsequent movement of the primary finger F1 while in contact with cursor grip 650 drags or moves cursor grip 650 and cursor 646 across display 622 to reposition or relocate cursor 646.
  • processor 624 continues to output signals for the movement of cursor 646 and cursor grip 650 in response to movement of first finger F1 while in contact with cursor grip 650.
  • when processor 624 determines that the first finger F1 has been withdrawn from cursor grip 650 (F1 up), processor 624 and system 620 return to the initial start state 802.
  • in response to receiving signals indicating, or otherwise making a determination, that the first finger F1 has been lifted or otherwise withdrawn from cursor grip 650 during the timeout window TO1 per decision block 816, processor 624 applies a second timeout window TO2. As indicated by block 824, during this window of time TO2, processor 624 looks for or determines whether or not the first finger F1 has once again been moved into contact with cursor grip 650 (F1 down). As indicated by block 826, if processor 624 does not receive signals indicating that, or does not determine that, the first finger F1 has been moved back into contact with cursor grip 650 during the window of time TO2, processor 624 outputs signals actuating cursor 646 to a left click input state.
  • if processor 624 receives signals indicating that, or otherwise determines that, the first finger F1 has been repositioned into contact with cursor grip 650 during the window of time TO2 (F1 down) per decision block 824, processor 624 applies a third timeout window or window of time TO3. As indicated by blocks 832 and 834, during this window of time TO3, processor 624 looks for signals indicating that, or otherwise determines whether, the first finger has been lifted or withdrawn from cursor grip 650 (decision block 832) or the second finger F2 has been moved into contact with cursor grip 650 while the first finger F1 remains in contact with cursor grip 650 (decision block 834).
  • in response to determining that first finger F1 has remained in contact with cursor grip 650 during the entire window of time TO3 and that during the same window of time TO3 the second finger F2 has not been moved into contact with cursor grip 650 (TO3 expired), processor 624 outputs a left button down signal actuating cursor 646 to a left button down and movement input state. While in the left button down and movement input state, processor 624 drags or moves cursor grip 650 and cursor 646 across display 622 to reposition or relocate cursor 646 in response to any subsequent movement of the primary finger F1 while in contact with cursor grip 650.
  • processor 624 continues to output signals for the movement of cursor 646 and cursor grip 650 in response to movement of first finger F1 while in contact with cursor grip 650 until the first finger has been lifted or otherwise withdrawn from cursor grip 650 (F1 up).
  • when processor 624 determines that the first finger F1 has been withdrawn from cursor grip 650 (F1 up), processor 624 and system 620 output a left button up signal to actuate cursor 646 to a left button up input state.
  • particular actions associated with a left button up input state for the particular application being carried out are executed. Thereafter, system 620 returns to the initial start state of block 802.
  • in response to determining that the first finger has been withdrawn from (moved out of contact with) cursor grip 650 during the window of time TO3, processor 624 outputs a left double-click signal actuating cursor 646 to a left double-click input state.
  • particular actions associated with a left double-click input state for the particular application being carried out are executed. For example, in one implementation and application, when cursor 646 is in a left double-click input state, a file being contacted by cursor 646 is opened. Thereafter, system 620 returns to the initial start state of block 802.
  • in response to the second finger F2 remaining in contact with cursor grip 650 until expiration of the window of time TO3, processor 624 outputs signals actuating cursor 646 to a right button down and movement input state. While in the right button down and movement input state, processor 624 drags or moves cursor grip 650 and cursor 646 across display 622 to reposition or relocate cursor 646 in response to any subsequent movement of the primary or first finger F1 while in contact with cursor grip 650.
  • processor 624 continues to output signals for the movement of cursor 646 and cursor grip 650 in response to movement of first finger F1 while in contact with cursor grip 650 until the first finger has been lifted or otherwise withdrawn from cursor grip 650 (F1 up).
  • when processor 624 determines that the first finger F1 has been withdrawn from cursor grip 650 (F1 up), processor 624 and system 620 output a right button up signal to actuate cursor 646 to a right button up input state.
  • system 620 returns to the initial start state of block 802.
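The flow of method 800 can be approximated as an event-driven state machine. The sketch below is a simplified reading of blocks 802-836: it assumes the caller delivers finger contacts and timeout expirations as string events ("f1_down", "f1_up", "f2_down", "f2_up", "to1", "to2", "to3"), and it collapses some block-level detail (e.g. the exact return path after a right button up) into a direct return to the start state. State and event names are illustrative, not from the source.

```python
class GripStateMachine:
    """Simplified event-driven sketch of method 800 (Figure 15).
    Recognized cursor inputs are appended to self.outputs."""

    def __init__(self):
        self.state = "start"
        self.outputs = []

    def handle(self, event):
        s = self.state
        if s == "start" and event == "f1_down":
            self.state = "wait_to1"                 # blocks 804/806/816
        elif s == "wait_to1":
            if event == "f2_down":                  # second finger -> right button
                self.outputs.append("right_button_down")
                self.state = "right_down"
            elif event == "f1_up":                  # maybe a click or double click
                self.state = "wait_to2"
            elif event == "to1":                    # TO1 expired -> movement state
                self.state = "move"
        elif s == "right_down" and event == "f2_up":
            self.outputs.append("right_button_up")  # blocks 810/812
            self.state = "start"
        elif s == "wait_to2":
            if event == "to2":                      # no second tap -> left click
                self.outputs.append("left_click")   # block 826
                self.state = "start"
            elif event == "f1_down":                # second tap begins
                self.state = "wait_to3"
        elif s == "wait_to3":
            if event == "f1_up":                    # quick release -> double click
                self.outputs.append("left_double_click")
                self.state = "start"
            elif event == "f2_down":                # right button down + movement
                self.outputs.append("right_button_down_move")
                self.state = "right_move"
            elif event == "to3":                    # held -> left button down + movement
                self.outputs.append("left_button_down_move")
                self.state = "left_move"
        elif s in ("move", "left_move", "right_move") and event == "f1_up":
            if s == "left_move":
                self.outputs.append("left_button_up")
            elif s == "right_move":
                self.outputs.append("right_button_up")
            self.state = "start"                    # return to block 802
```

For example, the sequence f1_down, f1_up, to2 yields a single left click, while f1_down, f2_down, f2_up yields a right button down followed by a right button up.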

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Position Input By Displaying (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

An example includes displaying a cursor on a touch screen, determining a state of the cursor on the touch screen, and displaying a cursor grip on the touch screen. Movement of the cursor grip occurs in response to moving manual contact with the cursor grip. Movement of the cursor is linked to movement of the cursor grip. The example includes changing a parameter of the cursor grip based upon the determined state of the cursor.
PCT/US2014/036173 2014-04-30 2014-04-30 Poignée de curseur WO2015167531A2 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/US2014/036173 WO2015167531A2 (fr) 2014-04-30 2014-04-30 Poignée de curseur

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/US2014/036173 WO2015167531A2 (fr) 2014-04-30 2014-04-30 Poignée de curseur

Publications (2)

Publication Number Publication Date
WO2015167531A2 true WO2015167531A2 (fr) 2015-11-05
WO2015167531A3 WO2015167531A3 (fr) 2016-04-28

Family

ID=54359463

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2014/036173 WO2015167531A2 (fr) 2014-04-30 2014-04-30 Poignée de curseur

Country Status (1)

Country Link
WO (1) WO2015167531A2 (fr)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20210030491A1 (en) * 2014-11-13 2021-02-04 Intuitive Surgical Operations, Inc. Interaction between user-interface and master controller
US11723734B2 (en) 2014-11-13 2023-08-15 Intuitive Surgical Operations, Inc. User-interface control using master controller

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7489306B2 (en) * 2004-12-22 2009-02-10 Microsoft Corporation Touch screen accuracy
US7770126B2 (en) * 2006-02-10 2010-08-03 Microsoft Corporation Assisting user interface element use
US8826181B2 (en) * 2008-06-28 2014-09-02 Apple Inc. Moving radial menus
US9465457B2 (en) * 2010-08-30 2016-10-11 Vmware, Inc. Multi-touch interface gestures for keyboard and/or mouse inputs
US20130125066A1 (en) * 2011-11-14 2013-05-16 Microsoft Corporation Adaptive Area Cursor


Also Published As

Publication number Publication date
WO2015167531A3 (fr) 2016-04-28

Similar Documents

Publication Publication Date Title
US11604510B2 (en) Zonal gaze driven interaction
US9529527B2 (en) Information processing apparatus and control method, and recording medium
US9348458B2 (en) Gestures for touch sensitive input devices
JP5456529B2 (ja) グラフィカル・ユーザ・インターフェース・オブジェクトを操作する方法及びコンピュータシステム
US8976140B2 (en) Touch input processor, information processor, and touch input control method
EP3088997A1 (fr) Interaction déformation de retard-regard
US20100229090A1 (en) Systems and Methods for Interacting With Touch Displays Using Single-Touch and Multi-Touch Gestures
US20110227947A1 (en) Multi-Touch User Interface Interaction
US20130257734A1 (en) Use of a sensor to enable touch and type modes for hands of a user via a keyboard
US9965141B2 (en) Movable selection indicators for region or point selection on a user interface
US9477398B2 (en) Terminal and method for processing multi-point input
US10402080B2 (en) Information processing apparatus recognizing instruction by touch input, control method thereof, and storage medium
JP2014182814A (ja) 描画装置、描画方法および描画プログラム
US10222866B2 (en) Information processing method and electronic device
WO2015167531A2 (fr) Poignée de curseur
US20150020025A1 (en) Remote display area including input lenses each depicting a region of a graphical user interface
CN110262747B (zh) 控制终端的方法、装置、终端及存储介质
US20210349625A1 (en) Using a touch input tool to modify content rendered on touchscreen displays
US20240086026A1 (en) Virtual mouse for electronic touchscreen display
CN116048370A (zh) 显示设备及操作切换方法

Legal Events

Date Code Title Description
NENP Non-entry into the national phase in:

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 14890501

Country of ref document: EP

Kind code of ref document: A2