US20170329487A1 - Computer with graphical user interface for interaction - Google Patents


Info

Publication number
US20170329487A1
US20170329487A1
Authority
US
United States
Prior art keywords
user interface
user
type
touch input
interface element
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US15/651,133
Inventor
Andrew D. Wilson
Current Assignee
Microsoft Technology Licensing LLC
Original Assignee
Microsoft Technology Licensing LLC
Priority date
Filing date
Publication date
Application filed by Microsoft Technology Licensing LLC filed Critical Microsoft Technology Licensing LLC
Priority to US15/651,133
Assigned to MICROSOFT TECHNOLOGY LICENSING, LLC reassignment MICROSOFT TECHNOLOGY LICENSING, LLC ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: WILSON, ANDREW D.
Publication of US20170329487A1

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481: Interaction techniques based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04812: Interaction techniques based on cursor appearance or behaviour, e.g. being affected by the presence of displayed objects
    • G06F3/0484: Interaction techniques for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/0485: Scrolling or panning
    • G06F3/04855: Interaction with scrollbars
    • G06F3/0487: Interaction techniques using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488: Interaction techniques using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F2203/00: Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/048: Indexing scheme relating to G06F3/048
    • G06F2203/04808: Several contacts: gestures triggering a specific function, e.g. scrolling, zooming, right-click, when the user establishes several contacts with the surface simultaneously; e.g. using several fingers or a combination of fingers and pen

Definitions

  • Computing systems are ubiquitous in business, school and entertainment. The uses and applications of computing systems continue to increase as the capabilities of the systems increase. Furthermore, existing computing systems have expanded capabilities and different types of systems have been more recently introduced as computing technologies continue to improve.
  • At least some aspects of the disclosure provide improvements with respect to processing of user interactions with graphical user interfaces.
  • a graphical user interface is displayed which includes one or more user interface elements which a user may interact with.
  • the displayed user interface elements may be different types of elements (e.g., button, scroll bar) which users interact with in different ways.
  • user inputs received by the graphical user interface may be processed differently corresponding to the different types of user interface elements which are displayed.
  • the processing of the user inputs determines whether the user inputs control (e.g., user inputs activate and/or manipulate) displayed user interface elements in one example.
  • a plurality of different user input processing methods are available to be used to process user inputs and the methods are tailored to process the user inputs in different ways corresponding to the different types of user interface elements which are displayed.
  • users may interact with the different types of elements in different ways (e.g., depress a button-type element or drag a scroll box of a scroll bar-type element) and the processing methods may be configured to process the user inputs in accordance with specific types of user interactions which may be expected to occur if the user intended to interact with the different types of user interface elements.
  • the use of different processing methods enables processing of the user inputs which is tailored to the types of user interface elements being displayed instead of using a single processing method to process user inputs with respect to different types of user interface elements.
  • the computing system selects the processing methods which correspond to the different types of interface elements which are displayed, and then uses the selected methods to process the user inputs to determine whether the user inputs activate and/or manipulate the different types of interface elements. For example, if a button-type element is displayed, one of the processing methods which is tailored to process inputs with respect to a button is selected to process user inputs to determine if the user inputs activate and/or manipulate the button-type element. If other types of user interface elements are displayed, other processing methods may be selected and utilized to process the user inputs to determine if the user inputs activate and/or manipulate the other types of displayed user interface elements as described in further detail below. Some specific example techniques for processing user inputs with respect to different user interface elements are described in the disclosure below.
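The selection of a type-specific processing method can be illustrated with a minimal sketch. All names, thresholds, and data shapes below are hypothetical assumptions; the disclosure does not specify an implementation:

```python
# Hypothetical sketch of per-type input processing selection; names,
# thresholds, and data shapes are illustrative, not from the patent.

def process_button_input(element, touch_pixels):
    # Button-type: count touch pixels overlapping the button's area and
    # compare the overlap against a per-element threshold.
    overlap = len(touch_pixels & element["pixels"])
    return overlap >= element["threshold"]

def process_scrollbar_input(element, touch_pixels):
    # Scroll bar-type: placeholder check; the disclosure processes
    # scroll bars with motion-based analysis rather than simple overlap.
    return bool(touch_pixels & element["pixels"])

# Registry mapping each user interface element type to the processing
# method tailored to that type.
PROCESSING_METHODS = {
    "button": process_button_input,
    "scrollbar": process_scrollbar_input,
}

def process_user_input(displayed_elements, touch_pixels):
    # Evaluate one user input against every displayed element, selecting
    # the method that corresponds to each element's type, and report
    # which elements the input controls.
    return {
        element["id"]: PROCESSING_METHODS[element["type"]](element, touch_pixels)
        for element in displayed_elements
    }
```

Each received input is thus checked against every displayed element in one pass, with each element examined by the method tailored to its own type.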
  • At least one computer-readable storage medium comprises programming stored thereon that is configured to cause processing circuitry to perform processing operations with respect to a graphical user interface.
  • the programming may cause the processing circuitry to display the graphical user interface which includes first and second different types of user interface elements.
  • User inputs with respect to the graphical user interface are processed differently using first and second different user input processing methods which are tailored to process the user inputs with respect to the first and second different types of user interface elements.
  • the processing of the user inputs using the first and second user input processing methods determines whether the user inputs activate and/or manipulate the first and/or second different types of user interface elements as discussed in detail below according to example embodiments of the detailed description.
  • the computing system analyzes each user input separately (and perhaps differently) with respect to all user interface elements (which can include elements of different types) that are displayed.
  • the processing of the user input for a displayed user interface element determines whether the user input controls that user interface element.
  • a computing system includes processing circuitry and an interactive display configured to display a graphical user interface.
  • the graphical user interface includes one of a plurality of different types of user interface elements.
  • the processing circuitry determines the type of the user interface element which is displayed in the graphical user interface and processes user input received by the graphical user interface in accordance with the type of the displayed user interface element to determine if the user input activates and/or manipulates the displayed user interface element.
  • FIG. 1 is an illustrative representation of a computing system according to one embodiment.
  • FIG. 2 is a functional block diagram of a computing system according to one embodiment.
  • FIG. 3 is an illustrative representation of a user interface according to one embodiment.
  • FIG. 4 is an illustrative representation of a region of user interaction with respect to a button-type element according to one embodiment.
  • FIG. 5 is an illustrative representation of a method of processing user input using a summed area table according to one embodiment.
  • FIG. 6 is an illustrative representation of motion vectors of user interaction with respect to a scroll bar-type element according to one embodiment.
  • FIG. 7 is an illustrative representation of interaction of a user's paint brush with respect to the user interface according to one embodiment.
  • FIG. 8 is a flow chart of a method of processing user inputs with respect to user interface elements according to one embodiment.
  • FIG. 9 is a flow chart of a method of processing user inputs with respect to a button-type element according to one embodiment.
  • FIG. 10 is a flow chart of a method of processing user inputs with respect to a scroll bar-type element according to one embodiment.
  • a graphical user interface of the interactive display may include different types of user interface elements (e.g., buttons, scroll bars) which users may interact with in different ways.
  • the user inputs are processed to determine whether the inputs control (e.g., activate and/or manipulate) one or more displayed user interface elements.
  • different processing methods are available to be used to process the user inputs and the different processing methods may be selected and used according to the respective types of user interface elements which are displayed when the user inputs are received.
  • a first processing method may be used to determine if a user input controls a first type of user interface element (e.g., button) and a second processing method may be used to determine if the user input controls a second type of user interface element (e.g., scroll bar).
  • the use of different processing methods enables the methods to be tailored to process different types of user inputs in different ways in accordance with the types of user interface elements which are displayed and the types of user inputs (e.g., pressing a button or moving a scroll box) which are expected to be received if the user intended to control the displayed user interface elements. Additional details regarding the processing of user inputs using different processing methods are discussed below.
  • the computing system 10 includes a user interface 12 which is configured to interact with one or more users.
  • the user interface 12 is implemented as a graphical user interface (GUI) which is configured to display visual images for observation by one or more users and to receive user inputs from one or more users.
  • user interface 12 may generate visual images which include user interface elements (e.g., buttons, scroll-bars, etc.) and users may control (e.g., activate and/or manipulate) the user interface elements by interacting with the user interface elements which are displayed.
  • the example embodiment of the user interface 12 includes an interactive display 20 which depicts visual images including the graphical user interface and user interface elements and receives user inputs with respect to the user interface 12 .
  • interactive display 20 is configured as a multi-touch interface capable of simultaneously processing a plurality of user inputs resulting from user input objects (e.g., fingers, hands, stylus, paintbrush) which are brought adjacent to or into contact with the interactive display 20 at substantially the same moment in time in one example embodiment.
  • One example of a computing system 10 with a multi-touch interactive display 20 is a Surface® computing system available from Microsoft Corporation. Other configurations of computing system 10 and user interface 12 are possible. Additional details regarding user interaction with computing system 10 using the user interface 12 are discussed further below.
  • computing system 10 includes user interface 12 , processing circuitry 14 , storage circuitry 16 , and a communications interface 18 .
  • Other embodiments of computing system 10 are possible including more, less and/or alternative components.
  • computing system 10 may be provided in other arrangements in other embodiments where interaction with one or more users is implemented including, for example, desktop computers, notebook computers, portable devices (e.g., personal digital assistants, cellular telephones, media devices) or other computing arrangements.
  • user interface 12 is configured to interact with a user including conveying data to a user (e.g., displaying visual images for observation by the user) as well as receiving inputs from the user.
  • User interface 12 may be configured differently in different embodiments.
  • processing circuitry 14 is arranged to process data, control data access and storage, issue commands, and control other desired operations.
  • processing circuitry 14 processes information regarding user inputs received via user interface 12 and controls depiction of visual images which are created by user interface 12 .
  • the processing circuitry 14 processes the information regarding the user inputs to determine whether the user inputs pertain to (e.g., activate and/or manipulate) one or more user interface elements which may be displayed using user interface 12 .
  • Processing circuitry 14 may comprise circuitry configured to implement desired programming provided by appropriate computer-readable storage media in at least one embodiment.
  • the processing circuitry 14 may be implemented as one or more processor(s) and/or other structure configured to execute executable instructions including, for example, software and/or firmware instructions.
  • Other exemplary embodiments of processing circuitry 14 include hardware logic, PGA, FPGA, ASIC, state machines, and/or other structures alone or in combination with one or more processor(s). These examples of processing circuitry 14 are for illustration and other configurations are possible.
  • Storage circuitry 16 is configured to store programming such as executable code or instructions (e.g., software and/or firmware), electronic data, databases, image data, or other digital information and may include computer-readable storage media. At least some embodiments or aspects described herein may be implemented using programming stored within one or more computer-readable storage medium of storage circuitry 16 and configured to control appropriate processing circuitry 14 .
  • the computer-readable storage medium may be embodied in one or more articles of manufacture 17 which can contain, store, or maintain programming, data and/or digital information for use by or in connection with an instruction execution system including processing circuitry 14 in the exemplary embodiment.
  • exemplary computer-readable storage media may include any one of physical media such as electronic, magnetic, optical, electromagnetic, infrared or semiconductor media.
  • Some more specific examples of computer-readable storage media include, but are not limited to, a portable magnetic computer diskette, such as a floppy diskette, a zip disk, a hard drive, random access memory, read only memory, flash memory, cache memory, and/or other configurations capable of storing programming, data, or other digital information.
  • Communications interface 18 is arranged to implement communications of computing system 10 with respect to external devices (not shown).
  • communications interface 18 may be arranged to communicate information bi-directionally with respect to computing system 10 .
  • Communications interface 18 may be implemented as a network interface card (NIC), serial or parallel connection, USB port, Firewire interface, flash memory interface, or any other suitable arrangement for implementing communications with respect to computing system 10 .
  • the illustrated interactive display 20 includes a display screen 22 which is configured to depict a plurality of visual images for observation by a user.
  • the depicted example visual image of FIG. 3 includes a window 23 which includes a plurality of different types of user interface elements which users may interact with during interactive sessions with user interface 12 of computing system 10 . More specifically, in the example of FIG. 3 , two different types of user interface elements are shown including a button-type element 26 within a window 24 and a scroll bar-type element 30 which corresponds to content being displayed within window 23 in the illustrated example. Other types of user interface elements which users may view and interact with may be utilized.
  • Users interact differently with the different types of user interface elements. For example, users may depress button-type element 26 by bringing a user input object proximate to or into contact with a location of interactive display 20 where the button-type element 26 is displayed. Users may move a user input object upwardly or downwardly along an area of interactive display 20 which corresponds to the location of the scroll bar-type element 30 to move a scroll box 32 upwardly or downwardly within a bar region 34 of scroll bar-type element 30 . These example interactions of the user may activate and manipulate the button-type element 26 or scroll bar-type element 30 .
  • Interactive display 20 receives user inputs of one or more users interacting with one or more of the displayed user interface elements 26 , 30 .
  • the user inputs are indicative of interaction of one or more user input objects with the user interface 12 and which may be processed by processing circuitry 14 to determine whether the user inputs control one or more of the displayed user interface elements in one embodiment.
  • one or more user input objects may interact with one or more different locations of display screen 22 at substantially the same moment in time to activate and/or manipulate one or more user interface elements displayed using display screen 22 .
  • interactive display 20 may provide information regarding user inputs in the form of an image of user interaction (also referred to as a user input image) which defines a plurality of pixel locations of the display screen 22 which were activated (selected) by one or more user input object interacting with the user interface 12 at substantially the same moment in time.
  • the activated pixel locations may correspond to one or more different regions of the display screen 22 corresponding to one or more user input objects interacting with the user interface 12 at the respective moment in time.
  • information in the form of a series of user input images may be generated at sequential moments in time by interactive display 20 to enable substantially continual monitoring by processing circuitry 14 of user interactions with respect to user interface 12 during operations of computing system 10 .
  • Interactive display 20 may be embodied in any appropriate configuration to provide information regarding user inputs interacting with the user interface 12 , for example, by bringing one or more user input objects proximate to or into contact with different regions of display screen 22 .
  • interactive display 20 may include an infrared camera (not shown in FIG. 3 but which may be placed behind the display screen 22 in one embodiment) which captures user input images of interactions of user input objects with respect to the display screen 22 to provide the information regarding the user inputs.
  • infrared or capacitive sensors are embedded at pixel locations within the display screen 22 and are used to provide the information regarding user inputs.
  • the display screen 22 may include an array of capacitive sensors arranged in grid across the surface of the display screen 22 and which is configured to provide information regarding user inputs. Any suitable configurations may be used to provide the information regarding user inputs in other embodiments.
  • the user inputs received by the interactive display 20 may be processed by the processing circuitry 14 to determine whether the user inputs pertain to one or more displayed user interface elements.
  • the processing circuitry 14 identifies one or more regions of display screen 22 which were interacted with by the one or more user input objects. The identified regions may correspond to locations of the display screen 22 which were interacted with by user input objects in an illustrative example.
  • processing circuitry 14 may use motion estimation processing techniques to estimate motion with respect to the display screen 22 between different images.
  • the processing circuitry 14 may extract motion vectors across the surface of display screen 22 by calculating an optical flow field using a plurality of user input images and which includes vectors which are indicative of a pattern of motion of user inputs with respect to the display screen 22 between the user input images.
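As a much-simplified stand-in for the optical flow extraction described above, the sketch below estimates a single average motion vector from two sequential user input images by comparing centroids of the activated pixels. A real optical flow field is dense and per-region; everything here is illustrative:

```python
# Simplified, illustrative motion estimate between two user input
# images: track the centroid of the activated pixels. A true optical
# flow computation would produce a field of vectors across the screen.

def centroid(pixels):
    # Mean position of a set of activated (x, y) pixel coordinates.
    xs = [x for x, _ in pixels]
    ys = [y for _, y in pixels]
    return (sum(xs) / len(xs), sum(ys) / len(ys))

def estimate_motion(prev_pixels, curr_pixels):
    # Displacement (dx, dy) of the touch region between two sequential
    # user input images.
    px, py = centroid(prev_pixels)
    cx, cy = centroid(curr_pixels)
    return (cx - px, cy - py)
```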
  • Processing circuitry 14 processes the information regarding the user inputs provided by interactive display 20 to determine whether the user inputs control one or more of the displayed user interface elements as described in detail below with respect to example embodiments of FIGS. 4 and 6 .
  • user interface 12 may display a plurality of user interface elements of different types.
  • processing circuitry 14 may process user inputs differently with respect to the different types of user interface elements which are displayed.
  • a plurality of different user input processing methods may be used to process the information regarding user inputs with respect to the user interface elements of different types.
  • the information may be processed with respect to button-type element 26 using a first user input processing method which corresponds to button-type elements and the user input may be processed with respect to scroll bar-type element 30 using a second user input processing method which corresponds to scroll bar-type elements.
  • storage circuitry 16 may store information comprising a list of the user interface elements (and their respective types) which are displayed using the user interface 12 .
  • Processing circuitry 14 may access the information regarding the displayed user interface elements to ascertain the types and locations of the displayed user interface elements to process the information regarding the user inputs.
  • the storage circuitry 16 may also store a plurality of different user input processing methods which are used by processing circuitry 14 to process the information regarding the user inputs for the different types of the user interface elements, respectively.
  • the processing circuitry 14 may select and utilize appropriate different processing methods to process the information regarding user inputs with respect to the different types of user interface elements which are displayed by the user interface 12 at moments in time corresponding to the user interactions.
  • an exemplary user input processing method is described with respect to processing of information regarding user inputs with respect to a button-type element 26 according to one embodiment. Other methods may be used to process the information with respect to button-type elements in other embodiments.
  • FIG. 4 depicts the information regarding a user input in the form of a region 40 which corresponds to a region of pixels of interaction of a user input object (e.g., fingertip) with display screen 22 .
  • region 40 is indicative of one user input object interacting with display screen 22 .
  • the activated pixels of the region 40 may be provided within a user input image of the display screen 22 which also includes any other regions of pixels which were activated by other user input objects interacting with display screen 22 at substantially the same moment in time (if any).
  • the processing circuitry 14 selects an appropriate user input processing method for processing the region 40 of the information regarding the user input with respect to the button-type element 26 .
  • the processing circuitry 14 determines the number of pixels of the area of button-type element 26 which overlap with the region 40 of interaction by the user.
  • the processing circuitry 14 compares the resultant number of overlapping pixels of the area of button-type element 26 and region 40 with a threshold to determine if the user interaction pertains to the button-type element 26 .
  • An appropriate threshold may be empirically determined in one embodiment.
  • a plurality of thresholds may be used in different processing embodiments. More specifically, a higher threshold may be used in one embodiment if it is desired to reduce the occurrence of false positive interaction determinations resulting from the processing, or a lower threshold may be used if it is desired to reduce the occurrence of false negative interaction determinations resulting from the processing.
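The overlap-and-threshold test described in the bullets above can be sketched as follows; the rectangle representation and threshold values are illustrative assumptions:

```python
# Illustrative overlap test for a button-type element: count activated
# pixels inside the button's geometric boundaries and compare against
# an empirically chosen threshold.

def button_is_pressed(activated, button_rect, threshold):
    # activated: set of (x, y) pixels touched by the user input.
    # button_rect: (x0, y0, x1, y1) inclusive boundaries of the button.
    x0, y0, x1, y1 = button_rect
    overlap = sum(1 for (x, y) in activated
                  if x0 <= x <= x1 and y0 <= y <= y1)
    return overlap >= threshold
```

Raising the threshold reduces false positives at the cost of more false negatives, and vice versa, as the passage notes.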
  • the processing circuitry 14 may change the button-type element 26 .
  • the button-type element 26 may be activated from a non-activated state and manipulated (e.g., changed to a depressed state from a non-depressed state) as a result of the determination that the user input controls the button-type element 26 .
  • the activation and manipulation of the button-type element 26 may initiate additional processing by the processing circuitry 14 depending upon the application and functionality associated with the button-type element 26 (e.g., save a document, make a spell check change suggestion, etc.).
  • the user inputs may also be processed (perhaps differently as described below) with respect to other user interface elements (e.g., scroll bar-type element 30 ) which are also displayed by display screen 22 at the moment in time when the information regarding the user input is ascertained.
  • the user interaction is considered to not pertain to the button-type element 26 and the button-type element 26 is not changed (e.g., not activated nor manipulated).
  • the information regarding the user input may also be processed with respect to other user interface elements which are also displayed as mentioned above.
  • Some interactive displays 20 are configured to provide user input information which includes different types of pixels for the user input. For example, for a contact resulting from a finger, the center of the contact region may be weighted more heavily than outside portions of the contact, for example corresponding to the edges of the finger contact. This weighting may first be applied, and a weighted sum of the activated pixels within the geometric boundaries of the button-type element 26 may be computed before comparison to the above-described threshold in one embodiment.
  • the user input processing method may additionally process the information regarding user inputs in an attempt to reduce the occurrence of false activations of a user interface element.
  • center-surround processing may be implemented.
  • a number of pixels within the geometric boundaries of the button-type element 26 which are activated by a user input may be multiplied by a weighting (e.g., two) providing a first value which is indicative of the weighted pixels.
  • a number of additional pixels activated by the user input which are located proximate the button-type element 26 but outside the geometric boundaries of element 26 may be added to the number of activated pixels within the boundaries (which are unweighted) to provide a second value.
  • the second value may be subtracted from the first value and the result of the subtraction may be compared to a threshold. If the result is greater than the threshold, then the user interaction may be deemed to control the button-type element 26 while the user interaction may not be deemed to control the button-type element 26 if the result is less than the threshold in one embodiment.
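The center-surround computation in the three bullets above can be sketched as follows; the weight of two matches the example in the text, while the surround geometry is an illustrative assumption:

```python
# Illustrative center-surround score: in-bounds activated pixels are
# weighted (x2) to form a first value; nearby out-of-bounds activated
# pixels plus the unweighted in-bounds pixels form a second value; the
# caller compares the difference against a threshold.

def center_surround_score(activated, button_rect, surround_rect):
    x0, y0, x1, y1 = button_rect          # button boundaries
    sx0, sy0, sx1, sy1 = surround_rect    # larger band around the button
    inside = sum(1 for (x, y) in activated
                 if x0 <= x <= x1 and y0 <= y <= y1)
    nearby = sum(1 for (x, y) in activated
                 if sx0 <= x <= sx1 and sy0 <= y <= sy1
                 and not (x0 <= x <= x1 and y0 <= y <= y1))
    first = 2 * inside           # weighted in-bounds pixels
    second = nearby + inside     # unweighted in-bounds plus nearby pixels
    return first - second
```

A touch centered on the button scores high, while a touch that merely grazes the edge contributes nearby pixels that pull the score down, which is how this scheme reduces false activations.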
  • an integral image technique is implemented which uses a summed area table of activated pixels to quickly and efficiently determine the number of pixels within the geometric boundaries of the button-type element 26. More specifically, the summed area table is calculated once and includes the activated pixels and their respective coordinates within the display screen 22. The summed area table may thereafter be used to efficiently identify the number of activated pixels within the geometric boundaries of the button-type element 26.
  • Referring to FIG. 5, use of a summed area table to determine the number of activated pixels within the geometric boundaries of button-type element 26 is described. Initially, a plurality of rectangles 44-47 may be determined from the origin (e.g., lower left corner in the described example) with respect to the button-type element 26 and values for the respective rectangles may be determined using the summed area table which indicate the numbers of activated pixels present within the respective rectangles 44-47. Thereafter, the values for rectangles 45 and 46 may be subtracted from the value of rectangle 44.
  • the value of rectangle 47 may be added to the result of the subtraction of rectangles 45, 46 from rectangle 44 to provide the number of activated pixels present within the geometric boundaries of button-type element 26, which may thereafter be used to determine whether the user input controls button-type element 26.
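A minimal sketch of the summed area table technique follows. It assumes array indexing from the top-left corner (rather than the lower-left origin of the figure) and inclusive rectangle corners; the rectangle arithmetic mirrors the subtract-two-sides, add-back-the-corner computation described above. The table is built once per input image, after which each rectangle query is O(1).

```python
def build_summed_area_table(activated):
    """Build a summed area table from a 2D 0/1 image of activated pixels.

    sat[y][x] holds the number of activated pixels in the rectangle
    spanning from the origin (0, 0) up to and including (x, y).
    """
    h, w = len(activated), len(activated[0])
    sat = [[0] * w for _ in range(h)]
    for y in range(h):
        row_sum = 0
        for x in range(w):
            row_sum += activated[y][x]
            sat[y][x] = row_sum + (sat[y - 1][x] if y > 0 else 0)
    return sat

def activated_pixels_in_rect(sat, x0, y0, x1, y1):
    """Number of activated pixels inside the inclusive rectangle (x0,y0)-(x1,y1).

    Mirrors the rectangle arithmetic of the description: subtract the two
    side rectangles from the enclosing one, then add back the doubly
    subtracted corner rectangle.
    """
    total = sat[y1][x1]                      # enclosing rectangle from origin
    if x0 > 0:
        total -= sat[y1][x0 - 1]             # side rectangle left of the button
    if y0 > 0:
        total -= sat[y0 - 1][x1]             # side rectangle above the button
    if x0 > 0 and y0 > 0:
        total += sat[y0 - 1][x0 - 1]         # corner rectangle subtracted twice
    return total
```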
  • an exemplary user input processing method is described with respect to scroll bar-type element 30 according to one embodiment. Other methods may be used to process scroll bar-type elements 30 in other embodiments. Furthermore, the user input processing method used by processing circuitry 14 to process the user input with respect to the scroll bar-type element 30 is different than the user input processing method described above to process the information with respect to button-type elements 26 as is apparent from the following discussion.
  • processing circuitry 14 may be able to ascertain one or more motion vectors of user interactions from user inputs received by interactive display 20 .
  • Referring to FIG. 6, one example of a plurality of motion vectors 42 determined from the information regarding user inputs is illustrated.
  • processing circuitry 14 selects one of a plurality of different user input processing methods pertinent to scroll bar-type element 30 to process user inputs.
  • processing circuitry 14 may analyze the motion vectors 42 with respect to the scroll bar-type element 30 to determine whether the user interactions pertain to the scroll bar-type element 30 .
  • the scroll box 32 is configured to move either upwardly or downwardly in a vertical direction in the bar region 34 .
  • processing circuitry 14 may process motion vectors 42 , which are either overlapping with or within a specified distance of scroll bar-type element 30 , with respect to the scroll bar-type element 30 to determine whether the user input controls the scroll bar-type element 30 .
  • the processing circuitry 14 may determine that the user input controls the scroll bar-type element 30 and may change the scroll bar-type element 30 .
  • an angle of tolerance may be specified and the user input may be deemed to control the scroll bar-type element 30 if the angle between a possible direction of movement of scroll box 32 and the direction of the motion vectors 42 is less than the angle of tolerance.
  • Different angles of tolerance may be used (e.g., 5-45 degrees) in different embodiments depending upon how accurate the user input is desired to be with respect to the scroll bar-type element 30 before the user input is deemed to control the scroll bar-type element 30 .
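The angle-of-tolerance comparison might look like the following sketch, which measures the angle between a motion vector and each possible direction of scroll box movement. The vector representation and the 30-degree default (chosen from within the 5-45 degree range mentioned above) are illustrative assumptions.

```python
import math

def within_angle_of_tolerance(motion_vector, scroll_direction, tolerance_deg=30.0):
    """Return True if the motion vector aligns with a possible scroll direction.

    motion_vector    -- (dx, dy) extracted from the user input images
    scroll_direction -- (dx, dy) direction the scroll box can move
                        (e.g., (0, 1) or (0, -1) for a vertical scroll bar)
    tolerance_deg    -- angle of tolerance in degrees (e.g., 5-45)
    """
    vx, vy = motion_vector
    sx, sy = scroll_direction
    v_len = math.hypot(vx, vy)
    s_len = math.hypot(sx, sy)
    if v_len == 0 or s_len == 0:
        return False  # no motion, or a degenerate direction
    cos_angle = (vx * sx + vy * sy) / (v_len * s_len)
    angle = math.degrees(math.acos(max(-1.0, min(1.0, cos_angle))))
    return angle < tolerance_deg

def controls_scroll_bar(motion_vectors, tolerance_deg=30.0):
    """Deem the input to control a vertical scroll bar if every pertinent
    motion vector aligns with one of the directions the scroll box can move."""
    directions = [(0, 1), (0, -1)]  # the two possible movements in the bar region
    return all(
        any(within_angle_of_tolerance(v, d, tolerance_deg) for d in directions)
        for v in motion_vectors
    )
```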
  • the scroll box 32 may be activated from a non-activated state and manipulated (e.g., moved upwardly corresponding to the direction of the motion vectors 42 ) as a result of the determination that the user input pertains to the scroll bar-type element 30 .
  • Usage of motion vectors 42 described herein in one example allows processing of input with respect to some user interface elements without tracking discrete objects of user interaction.
  • the activation and manipulation of the scroll bar-type element 30 may initiate additional processing by the processing circuitry 14 depending upon the application and functionality associated with the scroll bar-type element 30 (e.g., move the contents which are displayed in window 23 corresponding to the movement of the scroll box 32 of the scroll bar-type element 30 , etc.).
  • If the pertinent motion vectors 42 (e.g., the motion vectors which overlap with or are sufficiently proximate to the scroll bar-type element 30) do not align with a possible direction of movement of the scroll box 32, the user interaction is considered not to pertain to the scroll bar-type element 30 and the scroll box 32 is not changed (e.g., neither activated nor manipulated) in one embodiment.
  • the information regarding the user inputs may also be processed with respect to other user interface elements which may also be displayed concurrently with the button-type element 26 and scroll bar-type element 30 in one embodiment.
  • An example window 50 which may be displayed by user interface 12 includes a plurality of user interface elements in the form of paint container-type elements 52a, 52b, 52c which are different colors in the illustrated embodiment.
  • Computing system 10 may use information regarding the painting application and paint container-type elements 52a, 52b, 52c to process user inputs with respect to window 50 in one embodiment.
  • the user may use a user input object in the form of a physical paint brush 54 to interact with window 50 of user interface 12 in one embodiment.
  • processing circuitry 14 may use object recognition techniques to process the information regarding user input to identify the type of user input object interacting with the user interface 12 in one embodiment.
  • the processing circuitry 14 may use object recognition to determine whether the user input object interacting with window 50 is a paint brush user input object as may be expected given that window 50 is associated with a painting application. If the user input object interacting with window 50 is determined to be a paint brush, then a user input processing method configured to analyze interactions of paint brush 54 may be selected and utilized to process the information regarding the user inputs with respect to paint container-type elements 52a, 52b, 52c.
  • the selected user input processing method is tailored to process inputs corresponding to strokes of the paint brush 54 over the user interface 12 .
  • Computing system 10 may be configured to recognize different types of user input objects and process user inputs from different types of user input objects in other embodiments.
  • Referring to FIG. 8, one method of processing user interactions with respect to a user interface is described according to one embodiment. Other methods are possible including more, fewer and/or alternative acts. The method may be implemented by processing circuitry 14 in one embodiment.
  • user input information indicative of user inputs of one or more user input objects with the user interface is accessed.
  • a list of user interface elements is accessed which includes the elements which were displayed by the interactive display when the user input was received.
  • the user input information may be processed differently with respect to different types of the user interface elements.
  • a plurality of different user input processing methods are selected which correspond to the types of user interface elements being displayed as indicated by the accessed list of act A 14 .
  • the user input information is processed for each of the user interface elements using respective ones of the selected user input processing methods which correspond to the respective types of the displayed user interface elements.
  • the processing determines whether the user inputs pertain to the user interface elements for activation and/or manipulation of the user interface elements.
  • the user interface elements which were determined to be activated and/or manipulated by the user inputs are changed in accordance with the user inputs.
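The acts of this method could be organized as a dispatch table keyed on element type, along the following lines. The element dictionaries, function names, and simplified per-type tests are hypothetical stand-ins for the fuller button and scroll bar methods described elsewhere in the disclosure.

```python
# Hypothetical dispatch of user input processing methods by element type,
# following the acts of the method described above: the input is processed
# for each displayed element using the method selected for its type.

def process_button_input(element, user_input):
    # Stand-in for the overlap/threshold method described earlier.
    return user_input.get("overlap_pixels", 0) > element.get("threshold", 0)

def process_scroll_bar_input(element, user_input):
    # Stand-in for the motion-vector method described earlier.
    return bool(user_input.get("aligned_motion", False))

# One user input processing method per user interface element type.
PROCESSING_METHODS = {
    "button": process_button_input,
    "scroll_bar": process_scroll_bar_input,
}

def process_user_input(displayed_elements, user_input):
    """Process the input against every displayed element using the method
    selected for that element's type; return the elements the input controls."""
    controlled = []
    for element in displayed_elements:
        method = PROCESSING_METHODS[element["type"]]
        if method(element, user_input):
            controlled.append(element)
    return controlled
```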
  • Referring to FIG. 9, one method of processing user interactions with respect to a button-type element is described according to one embodiment. Other methods are possible including more, fewer and/or alternative acts. The method may be implemented by processing circuitry 14 in one embodiment.
  • the information may define the location and geometric boundaries of the button-type element of the display.
  • regions of the user input information which overlap with the button-type element are identified.
  • the overlapping regions may be expressed as a number of pixels of the regions of user input which overlap with the button-type element.
  • the overlapping portion(s) of the user interaction regions and the button-type element are compared with a threshold.
  • a number of overlapping pixels identified in act A 34 is compared with a threshold number of pixels.
  • the button-type element may be activated and/or manipulated according to the user input at an act A 38 .
  • the user input information may be disregarded with respect to the button-type element if the condition of act A 36 is negative.
  • Referring to FIG. 10, one method of processing user interactions with respect to a scroll bar-type element is described according to one embodiment. Other methods are possible including more, fewer and/or alternative acts. The method may be implemented by processing circuitry 14 in one embodiment.
  • information regarding a displayed scroll bar-type element is accessed.
  • the information may define the location and area of the scroll bar-type element of the display as well as control directions of movement of the scroll box.
  • the scroll bar-type element may be activated and/or manipulated according to the user input at an act A 46 .
  • the user input information may be disregarded with respect to the scroll bar-type element if the condition of act A 44 is negative.
  • some conventional user interaction processing techniques implement hit-testing calculations where a finger contact is reduced to a point, such as a centroid of the finger contact, and the coordinates of the point are compared against onscreen objects.
  • this process disregards much of the information regarding the finger contact, such as size and shape.
  • the reduction of the contact to a point presumes a precision of the coordinates which is not supported by the sensed data, which can lead to a breakdown in the interaction if the computed point does not agree with the user's own notion of the point of interaction (if the user has one).
  • a button-type element may not be depressed in the conventional arrangement even when part of the contact overlaps with the button-type element but a calculated centroid lies outside of the button-type element.
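For contrast, the conventional centroid reduction can be sketched as follows; the coordinates are illustrative. Note how a contact can overlap the button while its computed centroid lands outside it, producing the missed activation described above.

```python
def contact_centroid(pixels):
    """Reduce a contact (a list of (x, y) activated pixels) to its centroid."""
    n = len(pixels)
    return (sum(x for x, _ in pixels) / n, sum(y for _, y in pixels) / n)

def centroid_hit_test(pixels, button_rect):
    """Conventional hit test: compare only the centroid against the button,
    discarding the size and shape of the contact."""
    cx, cy = contact_centroid(pixels)
    x0, y0, x1, y1 = button_rect
    return x0 <= cx <= x1 and y0 <= cy <= y1

# A finger contact whose pixels partly overlap a button at (0,0)-(2,2)
# but whose centroid (x = 3.5) falls outside it, so the button is missed:
overlapping_contact = [(2, 2), (3, 2), (4, 2), (5, 2)]
```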
  • At least some aspects of the disclosure utilize an increased amount of the sensed information of user inputs interacting with the user interface compared with the above-described conventional arrangement.
  • the information of the user inputs may be examined with regards to the displayed types of user interface elements and the information may be processed differently using different processing methods specific to the different types of user interface elements.
  • Some of the described aspects reduce breakdowns in interaction which may otherwise occur with an incorrect assumption that a contact may be modeled as a point.

Abstract

Different techniques of processing user interactions with a computing system are described. In one implementation, an interactive display is configured to depict a graphical user interface which includes a plurality of different types of user interface elements (e.g., button-type element, scroll bar-type element). A user may use one or more user input objects (e.g., finger, hand, stylus) to simultaneously interact with the interactive display. A plurality of different user input processing methods are used to process user inputs received by the graphical user interface differently and in accordance with the types of the user interface elements which are displayed. The processing of the user inputs is implemented to determine whether the user inputs control the respective user interface elements. The processing may determine whether the user inputs activate and/or manipulate the displayed user interface elements in but one example.

Description

    BACKGROUND
  • Computing systems are ubiquitous in business, school and entertainment. The uses and applications of computing systems continue to increase as the capabilities of the systems increase. Furthermore, existing computing systems have expanded capabilities and different types of systems have been more recently introduced as computing technologies continue to improve.
  • Various types of user interfaces permitting users to interact with computing systems have also evolved. Keyboards have traditionally been used by users to interact with computing systems while graphical user interfaces have also been introduced which permit users to interact with a computing system utilizing a pointing device, such as a mouse. More recently, interface systems have been introduced which respond to a user's touch interacting with a graphical display, and some multi-touch interface systems are configured to process multiple simultaneous inputs.
  • At least some aspects of the disclosure provide improvements with respect to processing of user interactions with graphical user interfaces.
  • SUMMARY
  • Different techniques of processing user interactions with a computing system are described in this disclosure. In one implementation, a graphical user interface is displayed which includes one or more user interface elements which a user may interact with. The displayed user interface elements may be different types of elements (e.g., button, scroll bar) which users interact with in different ways.
  • In one embodiment, user inputs received by the graphical user interface may be processed differently corresponding to the different types of user interface elements which are displayed. The processing of the user inputs determines whether the user inputs control (e.g., user inputs activate and/or manipulate) displayed user interface elements in one example. In one more specific implementation, a plurality of different user input processing methods are available to be used to process user inputs and the methods are tailored to process the user inputs in different ways corresponding to the different types of user interface elements which are displayed. As mentioned above, users may interact with the different types of elements in different ways (e.g., depress a button-type element or drag a scroll box of a scroll bar-type element) and the processing methods may be configured to process the user inputs in accordance with specific types of user interactions which may be expected to occur if the user intended to interact with the different types of user interface elements. The use of different processing methods enables processing of the user inputs which is tailored to the types of user interface elements being displayed instead of using a single processing method to process user inputs with respect to different types of user interface elements.
  • In one embodiment, the computing system selects the processing methods which correspond to the different types of interface elements which are displayed, and then uses the selected methods to process the user inputs to determine whether the user inputs activate and/or manipulate the different types of interface elements. For example, if a button-type element is displayed, one of the processing methods which is tailored to process inputs with respect to a button is selected to process user inputs to determine if the user inputs activate and/or manipulate the button-type element. If other types of user interface elements are displayed, other processing methods may be selected and utilized to process the user inputs to determine if the user inputs activate and/or manipulate the other types of displayed user interface elements as described in further detail below. Some specific example techniques for processing user inputs with respect to different user interface elements are described in the disclosure below.
  • According to one implementation, at least one computer-readable storage medium comprises programming stored thereon that is configured to cause processing circuitry to perform processing operations with respect to a graphical user interface. The programming may cause the processing circuitry to display the graphical user interface which includes first and second different types of user interface elements. User inputs with respect to the graphical user interface are processed differently using first and second different user input processing methods which are tailored to process the user inputs with respect to the first and second different types of user interface elements. The processing of the user inputs using the first and second user input processing methods determines whether the user inputs activate and/or manipulate the first and/or second different types of user interface elements as discussed in detail below according to example embodiments of the detailed description.
  • In an example embodiment, the computing system analyzes each user input separately (and perhaps differently) with respect to all user interface elements (which can include elements of different types) that are displayed. The processing of the user input for a displayed user interface element determines whether the user input controls that user interface element.
  • In yet another implementation, a computing system includes processing circuitry and an interactive display configured to display a graphical user interface. The graphical user interface includes one of a plurality of different types of user interface elements. The processing circuitry determines the type of the user interface element which is displayed in the graphical user interface and processes user input received by the graphical user interface in accordance with the type of the displayed user interface element to determine if the user input activates and/or manipulates the displayed user interface element.
  • This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is an illustrative representation of a computing system according to one embodiment.
  • FIG. 2 is a functional block diagram of a computing system according to one embodiment.
  • FIG. 3 is an illustrative representation of a user interface according to one embodiment.
  • FIG. 4 is an illustrative representation of a region of user interaction with respect to a button-type element according to one embodiment.
  • FIG. 5 is an illustrative representation of a method of processing user input using a summed area table according to one embodiment.
  • FIG. 6 is an illustrative representation of motion vectors of user interaction with respect to a scroll bar-type element according to one embodiment.
  • FIG. 7 is an illustrative representation of interaction of a user's paint brush with respect to the user interface according to one embodiment.
  • FIG. 8 is a flow chart of a method of processing user inputs with respect to user interface elements according to one embodiment.
  • FIG. 9 is a flow chart of a method of processing user inputs with respect to a button-type element according to one embodiment.
  • FIG. 10 is a flow chart of a method of processing user inputs with respect to a scroll bar-type element according to one embodiment.
  • DETAILED DESCRIPTION
  • At least some aspects of the disclosure pertain to processing of user inputs of an interactive display of a computing system. More specifically, a graphical user interface of the interactive display may include different types of user interface elements (e.g., buttons, scroll bars) which users may interact with in different ways. The user inputs are processed to determine whether the inputs control (e.g., activate and/or manipulate) one or more displayed user interface elements. As described in detail below in accordance with some embodiments of the disclosure, different processing methods are available to be used to process the user inputs and the different processing methods may be selected and used according to the respective types of user interface elements which are displayed when the user inputs are received. For example, a first processing method may be used to determine if a user input controls a first type of user interface element (e.g., button) and a second processing method may be used to determine if the user input controls a second type of user interface element (e.g., scroll bar). The use of different processing methods enables the methods to be tailored to process different types of user inputs in different ways in accordance with the types of user interface elements which are displayed and the types of user inputs (e.g., pressing a button or moving a scroll box) which are expected to be received if the user intended to control the displayed user interface elements. Additional details regarding the processing of user inputs using different processing methods are discussed below.
  • Referring to FIG. 1, a computing system 10 is illustrated according to one embodiment of the disclosure. The computing system 10 includes a user interface 12 which is configured to interact with one or more users. In one embodiment, the user interface 12 is implemented as a graphical user interface (GUI) which is configured to display visual images for observation by one or more users and to receive user inputs from one or more users. For example, user interface 12 may generate visual images which include user interface elements (e.g., buttons, scroll-bars, etc.) and users may control (e.g., activate and/or manipulate) the user interface elements by interacting with the user interface elements which are displayed. In FIG. 1, the example embodiment of the user interface 12 includes an interactive display 20 which depicts visual images including the graphical user interface and user interface elements and receives user inputs with respect to the user interface 12.
  • In one example embodiment, interactive display 20 is configured as a multi-touch interface which simultaneously processes a plurality of user inputs resulting from user input objects (e.g., fingers, hands, stylus, paintbrush) which are provided adjacent to or contact the interactive display 20 at substantially the same moment in time. One example of a computing system 10 with a multi-touch interactive display 20 is a Surface® computing system available from Microsoft Corporation. Other configurations of computing system 10 and user interface 12 are possible. Additional details regarding user interaction with computing system 10 using the user interface 12 are discussed further below.
  • Referring to FIG. 2, one example arrangement of components and circuitry of computing system 10 is shown according to one embodiment. The illustrated embodiment of computing system 10 includes user interface 12, processing circuitry 14, storage circuitry 16, and a communications interface 18. Other embodiments of computing system 10 are possible including more, fewer and/or alternative components. While one possible arrangement of computing system 10 is the Surface® computing system as mentioned above, computing system 10 may be provided in other arrangements in other embodiments where interaction with one or more users is implemented including, for example, desktop computers, notebook computers, portable devices (e.g., personal digital assistants, cellular telephones, media devices) or other computing arrangements.
  • As described above, user interface 12 is configured to interact with a user including conveying data to a user (e.g., displaying visual images for observation by the user) as well as receiving inputs from the user. User interface 12 may be configured differently in different embodiments.
  • In one embodiment, processing circuitry 14 is arranged to process data, control data access and storage, issue commands, and control other desired operations. In more specific examples, processing circuitry 14 processes information regarding user inputs received via user interface 12 and controls depiction of visual images which are created by user interface 12. As described further below, the processing circuitry 14 processes the information regarding the user inputs to determine whether the user inputs pertain to (e.g., activate and/or manipulate) one or more user interface elements which may be displayed using user interface 12.
  • Processing circuitry 14 may comprise circuitry configured to implement desired programming provided by appropriate computer-readable storage media in at least one embodiment. For example, the processing circuitry 14 may be implemented as one or more processor(s) and/or other structure configured to execute executable instructions including, for example, software and/or firmware instructions. Other exemplary embodiments of processing circuitry 14 include hardware logic, PGA, FPGA, ASIC, state machines, and/or other structures alone or in combination with one or more processor(s). These examples of processing circuitry 14 are for illustration and other configurations are possible.
  • Storage circuitry 16 is configured to store programming such as executable code or instructions (e.g., software and/or firmware), electronic data, databases, image data, or other digital information and may include computer-readable storage media. At least some embodiments or aspects described herein may be implemented using programming stored within one or more computer-readable storage medium of storage circuitry 16 and configured to control appropriate processing circuitry 14.
  • The computer-readable storage medium may be embodied in one or more articles of manufacture 17 which can contain, store, or maintain programming, data and/or digital information for use by or in connection with an instruction execution system including processing circuitry 14 in the exemplary embodiment. For example, exemplary computer-readable storage media may include any one of physical media such as electronic, magnetic, optical, electromagnetic, infrared or semiconductor media. Some more specific examples of computer-readable storage media include, but are not limited to, a portable magnetic computer diskette, such as a floppy diskette, a zip disk, a hard drive, random access memory, read only memory, flash memory, cache memory, and/or other configurations capable of storing programming, data, or other digital information.
  • Communications interface 18 is arranged to implement communications of computing system 10 with respect to external devices (not shown). For example, communications interface 18 may be arranged to communicate information bi-directionally with respect to computing system 10. Communications interface 18 may be implemented as a network interface card (NIC), serial or parallel connection, USB port, FireWire interface, flash memory interface, or any other suitable arrangement for implementing communications with respect to computing system 10.
  • Referring to FIG. 3, example interactions of a user with user interface 12 in the form of a multi-touch interactive display 20 in one embodiment are described. The illustrated interactive display 20 includes a display screen 22 which is configured to depict a plurality of visual images for observation by a user. The depicted example visual image of FIG. 3 includes a window 23 which includes a plurality of different types of user interface elements which users may interact with during interactive sessions with user interface 12 of computing system 10. More specifically, in the example of FIG. 3, two different types of user interface elements are shown including a button-type element 26 within a window 24 and a scroll bar-type element 30 which corresponds to content being displayed within window 23 in the illustrated example. Other types of user interface elements which users may view and interact with may be utilized.
  • Users interact differently with respect to the different types of user interface elements. For example, users may depress button-type element 26 by bringing a user input object proximate to or contacting a location of interactive display 20 where the button-type element 26 is displayed. Users may move a user input object upwardly or downwardly along an area of interactive display 20 which corresponds to the location of the scroll bar-type element 30 to move a scroll box 32 upwardly or downwardly within a bar region 34 of scroll bar-type element 30. These example interactions of the user may activate and manipulate the button-type element 26 or scroll bar-type element 30.
  • Interactive display 20 receives user inputs of one or more users interacting with one or more of the displayed user interface elements 26, 30. In one embodiment, the user inputs are indicative of interaction of one or more user input objects with the user interface 12 and may be processed by processing circuitry 14 to determine whether the user inputs control one or more of the displayed user interface elements. In one embodiment, one or more user input objects may interact with one or more different locations of display screen 22 at substantially the same moment in time to activate and/or manipulate one or more user interface elements displayed using display screen 22.
  • In one more specific embodiment, interactive display 20 may provide information regarding user inputs in the form of an image of user interaction (also referred to as a user input image) which defines a plurality of pixel locations of the display screen 22 which were activated (selected) by one or more user input objects interacting with the user interface 12 at substantially the same moment in time. The activated pixel locations may correspond to one or more different regions of the display screen 22 corresponding to one or more user input objects interacting with the user interface 12 at the respective moment in time. In one operative embodiment of computing device 10, information in the form of a series of user input images may be generated at sequential moments in time by interactive display 20 to enable substantially continual monitoring by processing circuitry 14 of user interactions with respect to user interface 12 during operations of computing system 10.
  • Interactive display 20 may be embodied in any appropriate configuration to provide information regarding user inputs interacting with the user interface 12, for example, by bringing one or more user input objects proximate to or contacting different regions of display screen 22. In one example, interactive display 20 may include an infrared camera (not shown in FIG. 3 but which may be placed behind the display screen 22 in one embodiment) which captures user input images of interactions of user input objects with respect to the display screen 22 to provide the information regarding the user inputs. In another embodiment, infrared or capacitive sensors are embedded at pixel locations within the display screen 22 and are used to provide the information regarding user inputs. In yet another example, the display screen 22 may include an array of capacitive sensors arranged in a grid across the surface of the display screen 22 and which is configured to provide information regarding user inputs. Any suitable configurations may be used to provide the information regarding user inputs in other embodiments.
  • In one embodiment, the user inputs received by the interactive display 20 may be processed by the processing circuitry 14 to determine whether the user inputs pertain to one or more displayed user interface elements. In one embodiment, the processing circuitry 14 identifies one or more regions of display screen 22 which were interacted with by the one or more user input objects. The identified regions may correspond to locations of the display screen 22 which were interacted with by user input objects in an illustrative example. In addition, processing circuitry 14 may use motion estimation processing techniques to estimate motion with respect to the display screen 22 between different images. In one more specific embodiment, the processing circuitry 14 may extract motion vectors across the surface of display screen 22 by calculating an optical flow field using a plurality of user input images, the field including vectors which are indicative of a pattern of motion of user inputs with respect to the display screen 22 between the user input images.
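The motion-estimation step above can be illustrated with a deliberately simplified stand-in: rather than a full optical flow field, the sketch below estimates a single motion vector as the displacement of the activated-pixel centroid between two consecutive user input images. All names are illustrative, and the boolean-grid representation of a user input image is an assumption; the disclosure does not specify an implementation.

```python
import numpy as np

def centroid_shift(prev_frame, next_frame):
    """Crude stand-in for the optical-flow step: estimate one global
    motion vector as the displacement of the activated-pixel centroid
    between two consecutive user input images. (A real implementation
    would compute a dense flow field, e.g. with a pyramidal
    optical-flow algorithm, yielding one vector per region.)"""
    def centroid(frame):
        ys, xs = np.nonzero(frame)          # activated pixel coordinates
        return np.array([xs.mean(), ys.mean()])
    return centroid(next_frame) - centroid(prev_frame)

# Two consecutive frames: a 3x3 contact that moved down by two pixels.
a = np.zeros((10, 10), dtype=bool); a[2:5, 2:5] = True
b = np.zeros((10, 10), dtype=bool); b[4:7, 2:5] = True
print(centroid_shift(a, b))    # -> [0. 2.]
```

A per-region version would apply the same idea separately to each connected region of activated pixels.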
  • Processing circuitry 14 processes the information regarding the user inputs provided by interactive display 20 to determine whether the user inputs control one or more of the displayed user interface elements as described in detail below with respect to example embodiments of FIGS. 4 and 6. As mentioned previously, user interface 12 may display a plurality of user interface elements of different types. As described below, processing circuitry 14 may process user inputs differently with respect to the different types of user interface elements which are displayed.
  • In one embodiment, a plurality of different user input processing methods may be used to process the information regarding user inputs with respect to the user interface elements of different types. For example, the information may be processed with respect to button-type element 26 using a first user input processing method which corresponds to button-type elements and the user input may be processed with respect to scroll bar-type element 30 using a second user input processing method which corresponds to scroll bar-type elements.
  • In one embodiment, storage circuitry 16 may store information comprising a list of the user interface elements (and their respective types) which are displayed using the user interface 12. Processing circuitry 14 may access the information regarding the displayed user interface elements to ascertain the types and locations of the displayed user interface elements to process the information regarding the user inputs. The storage circuitry 16 may also store a plurality of different user input processing methods which are used by processing circuitry 14 to process the information regarding the user inputs for the different types of the user interface elements, respectively. The processing circuitry 14 may select and utilize appropriate different processing methods to process the information regarding user inputs with respect to the different types of user interface elements which are displayed by the user interface 12 at moments in time corresponding to the user interactions.
  • Referring to FIG. 4, an exemplary user input processing method is described with respect to processing of information regarding user inputs with respect to a button-type element 26 according to one embodiment. Other methods may be used to process the information with respect to button-type elements in other embodiments.
  • FIG. 4 depicts the information regarding a user input in the form of a region 40 which corresponds to a region of pixels of interaction of a user input object (e.g., fingertip) with display screen 22. In one embodiment, region 40 is indicative of one user input object interacting with display screen 22. In one embodiment, the activated pixels of the region 40 may be provided within a user input image of the display screen 22 which also includes any other regions of pixels which were activated by other user input objects interacting with display screen 22 at substantially the same moment in time (if any).
  • As shown, one portion of the region 40 overlaps with an area of button-type element 26 defined by the geometric boundaries of the element while another portion of the region 40 is outside of the area of button-type element 26. In one embodiment, the processing circuitry 14 selects an appropriate user input processing method for processing the region 40 of the information regarding the user input with respect to the button-type element 26. In one example method of processing a user input with respect to button-type element 26, the processing circuitry 14 determines the number of pixels of the area of button-type element 26 which overlap with the region 40 of interaction by the user. In the described example, the processing circuitry 14 compares the resultant number of overlapping pixels of the area of button-type element 26 and region 40 with a threshold to determine if the user interaction pertains to the button-type element 26. An appropriate threshold may be empirically determined in one embodiment. Furthermore, a plurality of thresholds may be used in different processing embodiments. More specifically, a higher threshold may be used in one embodiment if it is desired to reduce the occurrence of false positive interaction determinations resulting from the processing, or a lower threshold may be used if it is desired to reduce the occurrence of false negative interaction determinations resulting from the processing.
  • If the number of activated pixels which overlap with the button-type element 26 exceeds the threshold, then the user interaction of region 40 is considered to control the button-type element 26 and the processing circuitry 14 may change the button-type element 26. For example, the button-type element 26 may be activated from a non-activated state and manipulated (e.g., changed to a depressed state from a non-depressed state) as a result of the determination that the user input controls the button-type element 26. In addition, the activation and manipulation of the button-type element 26 may initiate additional processing by the processing circuitry 14 depending upon the application and functionality associated with the button-type element 26 (e.g., save a document, make a spell check change suggestion, etc.). In addition, the user inputs (e.g., the user input image) may also be processed (perhaps differently as described below) with respect to other user interface elements (e.g., scroll bar-type element 30) which are also displayed by display screen 22 at the moment in time when the information regarding the user input is ascertained.
  • If the overlapping number of pixels does not exceed the threshold, the user interaction is considered to not pertain to the button-type element 26 and the button-type element 26 is not changed (e.g., not activated nor manipulated). The information regarding the user input may also be processed with respect to other user interface elements which are also displayed as mentioned above.
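The overlap-counting method just described can be sketched as follows, assuming the user input image is a boolean pixel grid and the button's geometric boundaries form an axis-aligned rectangle (both assumptions; the disclosure does not fix a representation):

```python
import numpy as np

def button_overlap_count(touch_image, button_rect):
    """Count activated pixels that fall inside a button's boundaries.

    touch_image: 2-D boolean array, True where a user input object
    activated a pixel (one frame of the "user input image").
    button_rect: (x0, y0, x1, y1) geometric boundaries, half-open.
    """
    x0, y0, x1, y1 = button_rect
    return int(np.count_nonzero(touch_image[y0:y1, x0:x1]))

def button_pressed(touch_image, button_rect, threshold):
    """The input is deemed to control the button when the overlap
    exceeds the (empirically chosen) threshold."""
    return button_overlap_count(touch_image, button_rect) > threshold

# A 10x10 frame with a 4x4 fingertip blob straddling the button edge.
frame = np.zeros((10, 10), dtype=bool)
frame[2:6, 3:7] = True          # contact region 40
button = (5, 0, 10, 10)         # button occupies the right half

overlap = button_overlap_count(frame, button)   # 4 rows x 2 cols = 8
print(overlap, button_pressed(frame, button, threshold=5))
```

Raising `threshold` trades false positives for false negatives, exactly as the passage describes.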
  • In addition, some interactive displays 20 are configured to provide user input information in which different pixels of a contact are weighted differently. For example, for a contact resulting from a finger, the center of the contact region may be weighted more heavily than the outside portions of the contact, for example the portions corresponding to the edges of the finger contact. In one embodiment, this weighting may first be applied and a weighted sum of the activated pixels within the geometric boundaries of the button-type element 26 computed before comparison to the above-described threshold.
  • In some embodiments, the user input processing method may additionally process the information regarding user inputs in an attempt to reduce the occurrence of false activations of a user interface element. In one more specific embodiment, center-surround processing may be implemented. In one example, a number of pixels within the geometric boundaries of the button-type element 26 which are activated by a user input may be multiplied by a weighting (e.g., two) providing a first value which is indicative of the weighted pixels. A number of additional pixels activated by the user input which are located proximate the button-type element 26 but outside the geometric boundaries of element 26 (e.g., within a predetermined distance outside of the element 26) may be added to the number of activated pixels within the boundaries (which are unweighted) to provide a second value. The second value may be subtracted from the first value and the result of the subtraction may be compared to a threshold. If the result is greater than the threshold, then the user interaction may be deemed to control the button-type element 26 while the user interaction may not be deemed to control the button-type element 26 if the result is less than the threshold in one embodiment.
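The center-surround computation can be sketched as follows, again assuming boolean user input images and rectangular boundaries, with the example weighting of two taken from the text; the margin value is illustrative:

```python
import numpy as np

def center_surround_score(touch_image, button_rect, margin, weight=2):
    """Center-surround score, per the described embodiment:

    first  = weight * (activated pixels inside the button)
    second = (unweighted activated pixels inside the button)
             + (activated pixels within `margin` pixels outside it)
    score  = first - second   (compared against a threshold)
    """
    x0, y0, x1, y1 = button_rect
    inside = int(np.count_nonzero(touch_image[y0:y1, x0:x1]))

    # Expand the rectangle by the margin, clipped to the frame.
    h, w = touch_image.shape
    ex0, ey0 = max(x0 - margin, 0), max(y0 - margin, 0)
    ex1, ey1 = min(x1 + margin, w), min(y1 + margin, h)
    expanded = int(np.count_nonzero(touch_image[ey0:ey1, ex0:ex1]))

    surround = expanded - inside        # activated pixels just outside
    first = weight * inside
    second = inside + surround
    return first - second               # = inside - surround for weight=2

frame = np.zeros((10, 10), dtype=bool)
frame[4:8, 4:8] = True                  # 4x4 contact
button = (4, 4, 8, 8)                   # contact centered on the button
print(center_surround_score(frame, button, margin=2))   # -> 16
```

A contact centered on the button scores high; the same contact straddling the button's edge is penalized by its surround pixels, which is what suppresses false activations.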
  • In another example technique for processing information regarding user inputs for a button-type element 26, an integral image technique is implemented which uses a summed area table of activated pixels to quickly and efficiently determine the number of pixels within the geometric boundaries of the button-type element 26. More specifically, the summed area table is calculated once and includes the activated pixels and their respective coordinates within the display screen 22. The summed area table may thereafter be used to efficiently identify the number of activated pixels within the geometric boundaries of the button-type element 26.
  • Referring to FIG. 5, one example method of use of a summed area table to determine the number of activated pixels within the geometric boundaries of button-type element 26 is described. Initially, a plurality of rectangles 44-47 may be determined from the origin (e.g., lower left corner in the described example) with respect to the button-type element 26, and values for the respective rectangles may be determined using the summed area table which indicate the numbers of activated pixels present within the respective rectangles 44-47. Thereafter, the values for rectangles 45 and 46 may be subtracted from the value of rectangle 44. Furthermore, the value of rectangle 47 may be added to the result of the subtraction of rectangles 45, 46 from rectangle 44 to provide the number of activated pixels present within the geometric boundaries of button-type element 26, which may thereafter be used to determine whether the user input controls button-type element 26.
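The FIG. 5 procedure is the standard integral-image identity: the count inside the button equals the value of the full origin-anchored rectangle, minus the two side rectangles, plus the doubly subtracted corner rectangle. A sketch under the same boolean-grid assumption (the array origin here is the upper left rather than FIG. 5's lower left, which does not change the arithmetic):

```python
import numpy as np

def summed_area_table(touch_image):
    """Integral image: sat[y, x] = number of activated pixels in the
    origin-anchored rectangle through (x, y), inclusive. Computed once
    per user input image."""
    return touch_image.astype(np.int64).cumsum(axis=0).cumsum(axis=1)

def count_in_rect(sat, rect):
    """Activated-pixel count inside rect = (x0, y0, x1, y1), half-open,
    via four table lookups (FIG. 5's rectangles 44-47)."""
    x0, y0, x1, y1 = rect
    total = sat[y1 - 1, x1 - 1]                            # rect 44
    left = sat[y1 - 1, x0 - 1] if x0 > 0 else 0            # rect 45
    above = sat[y0 - 1, x1 - 1] if y0 > 0 else 0           # rect 46
    corner = sat[y0 - 1, x0 - 1] if x0 > 0 and y0 > 0 else 0  # rect 47
    return int(total - left - above + corner)

frame = np.zeros((10, 10), dtype=bool)
frame[2:6, 3:7] = True
sat = summed_area_table(frame)
print(count_in_rect(sat, (5, 0, 10, 10)))   # same count as direct slicing
```

Once the table is built, each button's overlap count costs four lookups regardless of the button's size, which is the efficiency claim behind this embodiment.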
  • Referring to FIG. 6, an exemplary user input processing method is described with respect to scroll bar-type element 30 according to one embodiment. Other methods may be used to process scroll bar-type elements 30 in other embodiments. Furthermore, the user input processing method used by processing circuitry 14 to process the user input with respect to the scroll bar-type element 30 is different from the user input processing method described above to process the information with respect to button-type elements 26, as is apparent from the following discussion.
  • As mentioned previously, the processing circuitry 14 may be able to ascertain one or more motion vectors of user interactions from user inputs received by interactive display 20. In FIG. 6, one example of a plurality of motion vectors 42 determined from the information regarding user inputs is illustrated. In one embodiment, processing circuitry 14 selects one of a plurality of different user input processing methods pertinent to scroll bar-type element 30 to process user inputs. According to one example processing method, processing circuitry 14 may analyze the motion vectors 42 with respect to the scroll bar-type element 30 to determine whether the user interactions pertain to the scroll bar-type element 30. In the depicted example, the scroll box 32 is configured to move either upwardly or downwardly in a vertical direction in the bar region 34. In one processing embodiment, processing circuitry 14 may process motion vectors 42, which are either overlapping with or within a specified distance of scroll bar-type element 30, with respect to the scroll bar-type element 30 to determine whether the user input controls the scroll bar-type element 30.
  • If the direction of the motion vectors 42 is aligned with possible directions of movement of scroll box 32 (i.e., upwardly or downwardly in the illustrated embodiment) within a specified tolerance, then the processing circuitry 14 may determine that the user input controls the scroll bar-type element 30 and may change the scroll bar-type element 30. In one example, an angle of tolerance may be specified and the user input may be deemed to control the scroll bar-type element 30 if the angle between a possible direction of movement of scroll box 32 and the direction of the motion vectors 42 is less than the angle of tolerance. Different angles of tolerance may be used (e.g., 5-45 degrees) in different embodiments depending upon how accurate the user input is desired to be with respect to the scroll bar-type element 30 before the user input is deemed to control the scroll bar-type element 30.
  • For example, the scroll box 32 may be activated from a non-activated state and manipulated (e.g., moved upwardly corresponding to the direction of the motion vectors 42) as a result of the determination that the user input pertains to the scroll bar-type element 30. Usage of motion vectors 42 described herein in one example allows processing of input with respect to some user interface elements without tracking discrete objects of user interaction.
  • The activation and manipulation of the scroll bar-type element 30 may initiate additional processing by the processing circuitry 14 depending upon the application and functionality associated with the scroll bar-type element 30 (e.g., move the contents which are displayed in window 23 corresponding to the movement of the scroll box 32 of the scroll bar-type element 30, etc.).
  • If the pertinent motion vectors 42 (e.g., the motion vectors which overlap with or are sufficiently proximate to the scroll bar-type element 30) are not sufficiently aligned with the possible directions of movement of the scroll box 32, the user interaction is considered to not pertain to the scroll bar-type element 30 and the scroll box 32 is not changed (e.g., not activated nor manipulated) in one embodiment.
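The direction-alignment test for the scroll bar can be sketched as follows; the angle-of-tolerance value, the vector representation, and the rule that a single misaligned vector rejects the input are all illustrative choices within the range the text allows:

```python
import math

def controls_scroll_bar(vectors, allowed_dirs, tol_deg=30.0):
    """Deem the input to control the scroll bar if every pertinent
    motion vector aligns, within the angular tolerance, with some
    possible direction of scroll-box travel.

    vectors:      iterable of (dx, dy) motion vectors near the element
    allowed_dirs: (dx, dy) directions the scroll box can move,
                  e.g. [(0, -1), (0, 1)] for up/down
    tol_deg:      angle of tolerance (the text suggests 5-45 degrees)
    """
    for dx, dy in vectors:
        mag = math.hypot(dx, dy)
        if mag == 0:
            continue                      # ignore zero-motion vectors
        aligned = False
        for ax, ay in allowed_dirs:
            amag = math.hypot(ax, ay)
            cos_a = (dx * ax + dy * ay) / (mag * amag)
            angle = math.degrees(math.acos(max(-1.0, min(1.0, cos_a))))
            if angle <= tol_deg:
                aligned = True
                break
        if not aligned:
            return False                  # a misaligned vector rejects
    return True

up_down = [(0, -1), (0, 1)]
print(controls_scroll_bar([(0.1, -2.0), (0.0, -1.5)], up_down))  # vertical
print(controls_scroll_bar([(2.0, 0.1)], up_down))                # horizontal
```

A nearly vertical drag passes; a horizontal drag across the bar does not, so the scroll box is left unchanged, as the passage describes.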
  • The information regarding the user inputs may also be processed with respect to other user interface elements which may also be displayed concurrently with the button-type element 26 and scroll bar-type element 30 in one embodiment.
  • As mentioned above, different types of user input objects may interact with user interface 12. Referring to FIG. 7, an example of a user interacting with a painting application is described. An example window 50 which may be displayed by user interface 12 includes a plurality of user interface elements in the form of paint container-type elements 52a, 52b, 52c which are different colors in the illustrated embodiment. Computing system 10 may use information regarding the painting application and paint container-type elements 52a, 52b, 52c to process user inputs with respect to window 50 in one embodiment. The user may use a user input object in the form of a physical paint brush 54 to interact with window 50 of user interface 12 in one embodiment. In one embodiment, processing circuitry 14 may use object recognition techniques to process the information regarding user input to identify the type of user input object interacting with the user interface 12. In the example embodiment of FIG. 7, the processing circuitry 14 may use object recognition to determine whether the user input object interacting with window 50 is a paint brush user input object, as may be expected given that window 50 is associated with a painting application. If the user input object interacting with window 50 is determined to be a paint brush, then a user input processing method configured to analyze interactions of paint brush 54 may be selected and utilized to process the information regarding the user inputs with respect to paint container-type elements 52a, 52b, 52c. In one embodiment, the selected user input processing method is tailored to process inputs corresponding to strokes of the paint brush 54 over the user interface 12. Computing system 10 may be configured to recognize different types of user input objects and process user inputs from different types of user input objects in other embodiments.
  • Referring to FIG. 8, one method of processing user interactions with respect to a user interface is described according to one embodiment. Other methods are possible including more, less and/or alternative acts. The method may be implemented by processing circuitry 14 in one embodiment.
  • At an act A12, user input information indicative of user inputs of one or more user input objects with the user interface is accessed.
  • At an act A14, a list of user interface elements is accessed which includes the elements which were displayed by the interactive display when the user input was received.
  • At an act A16, the user input information may be processed differently with respect to different types of the user interface elements. In one embodiment, a plurality of different user input processing methods are selected which correspond to the types of user interface elements being displayed as indicated by the accessed list of act A14.
  • At an act A18, the user input information is processed for each of the user interface elements using respective ones of the selected user input processing methods which correspond to the respective types of the displayed user interface elements. The processing determines whether the user inputs pertain to the user interface elements for activation and/or manipulation of the user interface elements.
  • At an act A20, the user interface elements which were determined to be activated and/or manipulated by the user inputs are changed in accordance with the user inputs.
  • At an act A22, additional processing is performed (if any) as a result of the user interacting with one or more of the user interface elements.
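The dispatch described in acts A14-A18 amounts to looking up a processing method by element type. A sketch with invented element and handler names (the disclosure leaves these unspecified):

```python
# Hypothetical registry mapping element types to processing methods.
# The handler bodies are placeholders for the per-type methods
# described above (overlap threshold, motion-vector alignment, etc.).
def process_button(element, user_input):
    return "button:" + element["id"]

def process_scroll_bar(element, user_input):
    return "scroll:" + element["id"]

PROCESSORS = {
    "button": process_button,
    "scroll_bar": process_scroll_bar,
}

def process_frame(displayed_elements, user_input):
    """Acts A14-A18: walk the accessed list of displayed elements and
    run the processing method registered for each element's type
    against the same user input information."""
    results = []
    for element in displayed_elements:
        handler = PROCESSORS[element["type"]]
        results.append(handler(element, user_input))
    return results

elements = [{"type": "button", "id": "save"},
            {"type": "scroll_bar", "id": "vbar"}]
print(process_frame(elements, user_input=None))
```

The same user input image is handed to every displayed element's handler, matching the text's point that one input may be processed differently against each element type.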
  • Referring to FIG. 9, one method of processing user interactions with respect to a button-type element is described according to one embodiment. Other methods are possible including more, less and/or alternative acts. The method may be implemented by processing circuitry 14 in one embodiment.
  • At an act A30, user input information regarding one or more regions of the display interacted with by one or more user input objects is accessed.
  • At an act A32, information regarding a displayed button-type element is accessed. For example, the information may define the location and geometric boundaries of the button-type element of the display.
  • At an act A34, regions of the user input information which overlap with the button-type element are identified. In one embodiment, the overlapping regions may be expressed as a number of pixels of the regions of user input which overlap with the button-type element.
  • At an act A36, the overlapping portion(s) of the user interaction regions and the button-type element are compared with a threshold. In one example, a number of overlapping pixels identified in act A34 is compared with a threshold number of pixels.
  • If the condition of act A36 is affirmative, the button-type element may be activated and/or manipulated according to the user input at an act A38.
  • The user input information may be disregarded with respect to the button-type element if the condition of act A36 is negative.
  • Referring to FIG. 10, one method of processing user interactions with respect to a scroll bar-type element is described according to one embodiment. Other methods are possible including more, less and/or alternative acts. The method may be implemented by processing circuitry 14 in one embodiment.
  • At an act A40, motion vectors of the user input information are accessed.
  • At an act A42, information regarding a displayed scroll bar-type element is accessed. For example, the information may define the location and area of the scroll bar-type element of the display as well as control directions of movement of the scroll box.
  • At an act A44, it is determined whether the motion vectors in the vicinity of the scroll bar-type element are sufficiently aligned within an acceptable tolerance with possible control directions of the scroll box.
  • If the condition of act A44 is affirmative, the scroll bar-type element may be activated and/or manipulated according to the user input at an act A46.
  • The user input information may be disregarded with respect to the scroll bar-type element if the condition of act A44 is negative.
  • At least some aspects described above provide improvements over some conventional user interaction processing methods. For example, some conventional user interaction processing techniques implement hit-testing calculations where a finger contact is reduced to a point, such as a centroid of the finger contact, and the coordinates of the point are compared against onscreen objects. However, this process disregards much of the information regarding the finger contact, such as size and shape. The reduction of the contact to a point presumes a precision of the coordinates which is not supported by the sensed data, and this can lead to a breakdown in the interaction if the computed point does not agree with the user's own notion of the point of interaction (if the user has one). In but one example, a button-type element may not be depressed in the conventional arrangement even when part of the contact overlaps with the button-type element but a calculated centroid lies outside of the button-type element.
  • At least some aspects of the disclosure utilize an increased amount of the sensed information of user inputs interacting with the user interface compared with the above-described conventional arrangement. The information of the user inputs may be examined with regard to the displayed types of user interface elements, and the information may be processed differently using different processing methods specific to the different types of user interface elements. Some of the described aspects reduce breakdowns in interaction which may otherwise occur with an incorrect assumption that a contact may be modeled as a point. The processing of interactions with respect to user interface elements as described in the present disclosure is believed to be more robust and accurate with respect to the user's intentions compared with the above-described conventional method where user interactive contacts are reduced to points. In particular, the processing of the information of the user inputs may be tailored to individual user interface elements while also using an increased amount of information regarding user inputs provided by the user interface, compared with distilled information regarding individual points of contact.
  • Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are disclosed as example forms of implementing the claims.

Claims (21)

1.-20. (canceled)
21. A method for interacting with a device, the method comprising:
receiving a first touch input from at least one finger of a first user via a first region of a multi-touch user interface associated with the device and substantially simultaneously receiving a second touch input from a stylus held by a second user, different from the first user, via a second region of the multi-touch user interface associated with the device;
automatically determining by the device, the first touch input as corresponding to the at least one finger of the first user and the second touch input as corresponding to the stylus held by the second user; and
processing the first touch input using a first processing method and processing the second touch input using a second processing method, wherein the first processing method is tailored to process inputs corresponding to finger touch events and the second processing method is tailored to process inputs corresponding to input events from the stylus, wherein the first processing method comprises automatically determining a first value corresponding to an area of the activated pixels in the first region of the multi-touch user interface by the first touch input and automatically modifying the first value corresponding to the area of the activated pixels in the first region of the multi-touch user interface to determine a second value corresponding to the area of the activated pixels in the first region of the multi-touch user interface, wherein the second value is different from the first value.
22. The method of claim 21 further comprising automatically determining by the device whether the first touch input interacts with a first type of user interface element of the multi-touch user interface or with a second type of user interface element of the multi-touch user interface, wherein the second type of user interface element is different from the first type of user interface element.
23. The method of claim 22 further comprising processing the first touch input using a processing method corresponding to the first type of user interface element if the first touch input interacts with the first type of user interface element.
24. The method of claim 22 further comprising processing the first touch input using a processing method corresponding to the second type of user interface element if the first touch input interacts with the second type of user interface element.
25. The method of claim 21 further comprising automatically determining by the device whether the second touch input interacts with a first type of user interface element of the multi-touch user interface or with a second type of user interface element of the multi-touch user interface, wherein the second type of user interface element is different from the first type of user interface element.
26. The method of claim 25 further comprising processing the second touch input using a processing method corresponding to the first type of user interface element if the second touch input interacts with the first type of user interface element.
27. The method of claim 25 further comprising processing the second touch input using a processing method corresponding to the second type of user interface element if the second touch input interacts with the second type of user interface element.
28. A computer-readable medium comprising instructions, when executed by at least one processor in a device, configured to:
receive a first touch input from at least one finger of a first user via a first region of a multi-touch user interface associated with the device and substantially simultaneously receive a second touch input from a stylus held by a second user, different from the first user, via a second region of the multi-touch user interface associated with the device;
automatically determine by the device the first touch input as corresponding to the at least one finger of the first user and the second touch input as corresponding to the stylus held by the second user; and
process the first touch input using a first processing method and process the second touch input using a second processing method, wherein the first processing method is tailored to process inputs corresponding to finger touch events and the second processing method is tailored to process inputs corresponding to input events from the stylus, wherein the first processing method comprises automatically determining a first value corresponding to an area of the activated pixels in the first region of the multi-touch user interface by the first touch input and automatically modifying the first value corresponding to the area of the activated pixels in the first region of the multi-touch user interface to determine a second value corresponding to the area of the activated pixels in the first region of the multi-touch user interface, wherein the second value is different from the first value.
29. The computer-readable medium of claim 28 further comprising instructions, when executed by the at least one processor in the device, configured to automatically determine whether the first touch input interacts with a first type of user interface element of the multi-touch user interface or with a second type of user interface element of the multi-touch user interface, wherein the second type of user interface element is different from the first type of user interface element.
30. The computer-readable medium of claim 29 further comprising instructions, when executed by the at least one processor in the device, configured to process the first touch input using a processing method corresponding to the first type of user interface element if the first touch input interacts with the first type of user interface element.
31. The computer-readable medium of claim 29 further comprising instructions, when executed by the at least one processor in the device, configured to process the first touch input using a processing method corresponding to the second type of user interface element if the first touch input interacts with the second type of user interface element.
32. The computer-readable medium of claim 28 further comprising instructions, when executed by the at least one processor in the device, configured to automatically determine whether the second touch input interacts with a first type of user interface element of the multi-touch user interface or with a second type of user interface element of the multi-touch user interface, wherein the second type of user interface element is different from the first type of user interface element.
33. The computer-readable medium of claim 32 further comprising instructions, when executed by the at least one processor in the device, configured to process the second touch input using a processing method corresponding to the first type of user interface element if the second touch input interacts with the first type of user interface element.
34. The computer-readable medium of claim 32 further comprising instructions, when executed by the at least one processor in the device, configured to process the second touch input using a processing method corresponding to the second type of user interface element if the second touch input interacts with the second type of user interface element.
35. A device comprising at least one processor and a memory comprising instructions, when executed by at least one processor in a device, configured to:
receive a first touch input from at least one finger of a first user via a first region of a multi-touch user interface associated with the device and substantially simultaneously receive a second touch input from a stylus held by a second user, different from the first user, via a second region of the multi-touch user interface associated with the device;
automatically determine by the device the first touch input as corresponding to the at least one finger of the first user and the second touch input as corresponding to the stylus held by the second user; and
process the first touch input using a first processing method and process the second touch input using a second processing method, wherein the first processing method is tailored to process inputs corresponding to finger touch events and the second processing method is tailored to process inputs corresponding to input events from the stylus, wherein the first processing method comprises automatically determining a first value corresponding to an area of the activated pixels in the first region of the multi-touch user interface by the first touch input and automatically modifying the first value corresponding to the area of the activated pixels in the first region of the multi-touch user interface to determine a second value corresponding to the area of the activated pixels in the first region of the multi-touch user interface, wherein the second value is different from the first value.
36. The device of claim 35, wherein the memory further comprises instructions that, when executed by the at least one processor in the device, are configured to automatically determine whether the first touch input interacts with a first type of user interface element of the multi-touch user interface or with a second type of user interface element of the multi-touch user interface, wherein the second type of user interface element is different from the first type of user interface element.
37. The device of claim 36, wherein the memory further comprises instructions that, when executed by the at least one processor in the device, are configured to process the first touch input using a processing method corresponding to the first type of user interface element if the first touch input interacts with the first type of user interface element.
38. The device of claim 36, wherein the memory further comprises instructions that, when executed by the at least one processor in the device, are configured to process the first touch input using a processing method corresponding to the second type of user interface element if the first touch input interacts with the second type of user interface element.
39. The device of claim 35, wherein the memory further comprises instructions that, when executed by the at least one processor in the device, are configured to automatically determine whether the second touch input interacts with a first type of user interface element of the multi-touch user interface or with a second type of user interface element of the multi-touch user interface, wherein the second type of user interface element is different from the first type of user interface element.
40. The device of claim 39, wherein the memory further comprises instructions that, when executed by the at least one processor in the device, are configured to process the second touch input using a processing method corresponding to the first type of user interface element if the second touch input interacts with the first type of user interface element and process the second touch input using a processing method corresponding to the second type of user interface element if the second touch input interacts with the second type of user interface element.
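Claims 36 through 40 describe selecting a processing method based on the type of user interface element a touch input interacts with. The sketch below illustrates that dispatch pattern; the element types, bounding boxes, and handler behaviors are hypothetical examples, not from the patent.

```python
# Hypothetical sketch of claims 36-40: hit-test an input against the
# UI elements, then route it to a processing method chosen by the
# type of element it interacts with.

# One processing method per user interface element type (assumed types).
HANDLERS = {
    "button": lambda point: f"button pressed at {point}",
    "slider": lambda point: f"slider dragged to {point}",
}

def hit_test(point, elements):
    """Return the type of the first element whose bounding box
    (x0, y0, x1, y1) contains the touch point, or None."""
    for etype, (x0, y0, x1, y1) in elements:
        if x0 <= point[0] <= x1 and y0 <= point[1] <= y1:
            return etype
    return None

def dispatch(point, elements):
    """Process the input with the method matching the element type
    it interacts with; ignore inputs that hit no element."""
    handler = HANDLERS.get(hit_test(point, elements))
    return handler(point) if handler else None
```

A touch landing inside a button's bounds is processed by the button method, a touch inside a slider's bounds by the slider method, and a touch hitting neither is ignored.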
US15/651,133 2010-05-03 2017-07-17 Computer with graphical user interface for interaction Abandoned US20170329487A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US15/651,133 US20170329487A1 (en) 2010-05-03 2017-07-17 Computer with graphical user interface for interaction

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US12/772,864 US9740364B2 (en) 2010-05-03 2010-05-03 Computer with graphical user interface for interaction
US15/651,133 US20170329487A1 (en) 2010-05-03 2017-07-17 Computer with graphical user interface for interaction

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US12/772,864 Continuation US9740364B2 (en) 2010-05-03 2010-05-03 Computer with graphical user interface for interaction

Publications (1)

Publication Number Publication Date
US20170329487A1 true US20170329487A1 (en) 2017-11-16

Family

ID=44859318

Family Applications (2)

Application Number Title Priority Date Filing Date
US12/772,864 Active 2032-08-24 US9740364B2 (en) 2010-05-03 2010-05-03 Computer with graphical user interface for interaction
US15/651,133 Abandoned US20170329487A1 (en) 2010-05-03 2017-07-17 Computer with graphical user interface for interaction

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US12/772,864 Active 2032-08-24 US9740364B2 (en) 2010-05-03 2010-05-03 Computer with graphical user interface for interaction

Country Status (1)

Country Link
US (2) US9740364B2 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110989907A (en) * 2019-11-27 2020-04-10 浙江大华技术股份有限公司 Data display method and device, electronic equipment and storage medium

Families Citing this family (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8766936B2 (en) 2011-03-25 2014-07-01 Honeywell International Inc. Touch screen and method for providing stable touches
US9134969B2 (en) 2011-12-13 2015-09-15 Ipar, Llc Computer-implemented systems and methods for providing consistent application generation
US9733707B2 (en) 2012-03-22 2017-08-15 Honeywell International Inc. Touch screen display user interface and method for improving touch interface utility on the same employing a rules-based masking system
US9423871B2 (en) 2012-08-07 2016-08-23 Honeywell International Inc. System and method for reducing the effects of inadvertent touch on a touch screen controller
CN102819404B (en) * 2012-08-29 2016-06-29 曙光信息产业(北京)有限公司 The interface display method of basic input-output system BIOS
US9128580B2 (en) 2012-12-07 2015-09-08 Honeywell International Inc. System and method for interacting with a touch screen interface utilizing an intelligent stencil mask
US20160070455A1 (en) * 2014-09-10 2016-03-10 International Business Machines Corporation Toggle graphic object
WO2017039125A1 (en) * 2015-08-28 2017-03-09 Samsung Electronics Co., Ltd. Electronic device and operating method of the same
GB2541730B (en) * 2015-08-28 2020-05-13 Samsung Electronics Co Ltd Displaying graphical user interface elements on a touch screen
US20170199748A1 (en) * 2016-01-13 2017-07-13 International Business Machines Corporation Preventing accidental interaction when rendering user interface components
KR102586792B1 (en) * 2016-08-23 2023-10-12 삼성디스플레이 주식회사 Display device and driving method thereof
US10474341B2 (en) 2017-09-11 2019-11-12 Adobe Inc. Digital paint generation mix control
US10515465B2 (en) * 2017-09-11 2019-12-24 Adobe Inc. Digital paint generation, container representation, and hierarchical storage
US10559096B2 (en) 2017-09-11 2020-02-11 Adobe Inc. Digital paint generation based on physical digital paint property interaction
US10521932B2 (en) 2017-09-11 2019-12-31 Adobe Inc. Digital paint generation based on digital paint properties
US10489938B2 (en) 2017-09-26 2019-11-26 Adobe Inc. Digital paint generation feedback

Family Cites Families (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI248576B (en) * 2004-07-05 2006-02-01 Elan Microelectronics Corp Method for controlling rolling of scroll bar on a touch panel
US7489306B2 (en) * 2004-12-22 2009-02-10 Microsoft Corporation Touch screen accuracy
US7487467B1 (en) * 2005-06-23 2009-02-03 Sun Microsystems, Inc. Visual representation and other effects for application management on a device with a small screen
US7777728B2 (en) * 2006-03-17 2010-08-17 Nokia Corporation Mobile communication terminal
US7620901B2 (en) * 2006-03-21 2009-11-17 Microsoft Corporation Simultaneous input across multiple applications
US20080040692A1 (en) * 2006-06-29 2008-02-14 Microsoft Corporation Gesture input
US7643011B2 (en) 2007-01-03 2010-01-05 Apple Inc. Noise detection in multi-touch sensors
KR100894146B1 (en) * 2007-02-03 2009-04-22 엘지전자 주식회사 Mobile communication device and control method thereof
US8074581B2 (en) * 2007-10-12 2011-12-13 Steelcase Inc. Conference table assembly
US20090132937A1 (en) * 2007-11-15 2009-05-21 International Business Machines Corporation Modifying Hover Help for a User Interface
TW200943163A (en) * 2008-04-02 2009-10-16 Kye Systems Corp Method of scrolling computer windows
US8952894B2 (en) 2008-05-12 2015-02-10 Microsoft Technology Licensing, Llc Computer vision-based multi-touch sensing using infrared lasers
US8300023B2 (en) * 2009-04-10 2012-10-30 Qualcomm Incorporated Virtual keypad generator with learning capabilities

Also Published As

Publication number Publication date
US9740364B2 (en) 2017-08-22
US20110271216A1 (en) 2011-11-03

Similar Documents

Publication Publication Date Title
US20170329487A1 (en) Computer with graphical user interface for interaction
US9870137B2 (en) Speed/positional mode translations
US9182884B2 (en) Pinch-throw and translation gestures
US9720544B2 (en) Techniques for reducing jitter for taps
US9507417B2 (en) Systems and methods for implementing head tracking based graphical user interfaces (GUI) that incorporate gesture reactive interface objects
US7880720B2 (en) Gesture recognition method and touch system incorporating the same
US9092125B2 (en) Multi-mode touchscreen user interface for a multi-state touchscreen device
CA2822812C (en) Systems and methods for adaptive gesture recognition
US20080168403A1 (en) Detecting and interpreting real-world and security gestures on touch and hover sensitive devices
US9317171B2 (en) Systems and methods for implementing and using gesture based user interface widgets with camera input
US20140082559A1 (en) Control area for facilitating user input
US20160357301A1 (en) Method and system for performing an action based on number of hover events
US10078443B2 (en) Control system for virtual mouse and control method thereof
US10303295B2 (en) Modifying an on-screen keyboard based on asymmetric touch drift
KR20150090698A (en) Method and apparatus of managing objects on wallpaper
EP2681646B1 (en) Electronic apparatus, display method, and program
WO2021223546A1 (en) Using a stylus to modify display layout of touchscreen displays
US9778822B2 (en) Touch input method and electronic apparatus thereof
Komuro Vision-based 3D input interface technologies

Legal Events

Date Code Title Description
AS Assignment

Owner name: MICROSOFT TECHNOLOGY LICENSING, LLC, WASHINGTON

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:WILSON, ANDREW D.;REEL/FRAME:043020/0197

Effective date: 20100503

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

STCV Information on status: appeal procedure

Free format text: APPEAL BRIEF (OR SUPPLEMENTAL BRIEF) ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: TC RETURN OF APPEAL

STCV Information on status: appeal procedure

Free format text: EXAMINER'S ANSWER TO APPEAL BRIEF MAILED

STCV Information on status: appeal procedure

Free format text: ON APPEAL -- AWAITING DECISION BY THE BOARD OF APPEALS

STCV Information on status: appeal procedure

Free format text: BOARD OF APPEALS DECISION RENDERED

STCB Information on status: application discontinuation

Free format text: ABANDONED -- AFTER EXAMINER'S ANSWER OR BOARD OF APPEALS DECISION