US20100066705A1 - Highlevel active pen matrix - Google Patents
- Publication number
- US20100066705A1 (U.S. application Ser. No. 12/627,275)
- Authority
- US
- United States
- Prior art keywords
- computer
- user input
- threshold
- digitizer
- stroke
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/1626—Constructional details or arrangements for portable computers with a single-body enclosure integrating a flat display, e.g. Personal Digital Assistants [PDAs]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/25—Fusion techniques
- G06F18/254—Fusion techniques of classification results, e.g. of results related to same input data
- G06F18/256—Fusion techniques of classification results, e.g. of results related to same input data of results relating to different input data, e.g. multimodal recognition
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/02—Input arrangements using manually operated switches, e.g. using keyboards or dials
- G06F3/023—Arrangements for converting discrete items of information into a coded form, e.g. arrangements for interpreting keyboard generated codes as alphanumeric codes, operand codes or instruction codes
- G06F3/0238—Programmable keyboards
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/038—Control and interface arrangements therefor, e.g. drivers or device-embedded control circuitry
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/0412—Digitisers structurally integrated in a display
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04886—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10L—SPEECH ANALYSIS OR SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING; SPEECH OR AUDIO CODING OR DECODING
- G10L15/00—Speech recognition
- G10L15/26—Speech to text systems
Abstract
The present invention relates to a system, method and medium for receiving and acting upon user input. In one embodiment, the user may have access only to a limited input device, such as a stylus. Using the present invention, the user is provided with intuitive responses from the system based on inputs from the limited input device.
Description
- The present application is a continuation of U.S. patent application Ser. No. 11/202,034, filed Aug. 12, 2005, which is a continuation of U.S. patent application Ser. No. 10/993,357, filed Nov. 22, 2004, now issued as U.S. Pat. No. 7,081,889, which is a continuation of U.S. patent application Ser. No. 09/736,170, filed Dec. 15, 2000, now issued as U.S. Pat. No. 6,897,853, which claims priority to U.S. Provisional Patent Application Ser. No. 60/247,400, filed Nov. 10, 2000, each of which is incorporated by reference herein in its entirety.
- Aspects of the present invention are directed generally to apparatus and methods for controlling a graphical user interface (GUI). More particularly, the present invention relates to receiving user input, determining based on the user input what the user wants to do, and performing a function related to the desired input.
- Typical computer systems, especially computer systems using graphical user interface (GUI) systems such as Microsoft WINDOWS, are optimized for accepting user input from one or more discrete input devices such as a keyboard and for entering text, and a pointing device such as a mouse with one or more buttons for driving the user interface. Virtually all software applications designed to run on Microsoft WINDOWS are optimized to accept user input in the same manner. For instance, many applications make extensive use of the right mouse button (a “right click”) to display context-sensitive command menus. The user may generate other gestures using the mouse such as by clicking the left button of the mouse (a “left click”), or by clicking the left or right button of the mouse and moving the mouse while the button is depressed (either a “left click drag” or a “right click drag”).
- In some environments, a mouse is not usable or desirable. For example, in a digitizer tablet environment, the primary input device may be a stylus. While a stylus attempts to provide a pad-and-paper-like feel to a computing environment, current systems are limited. For example, the use of a stylus in a graphical user interface is limited to tapping on various items for selection. See, for example, the Palm series of products using the Palm OS 3.0 operating system. Further, in stylus-based input environments, a user is continually forced to select tools or operations from a remote tool bar, generally on a top or bottom of a screen. While a user can type in letters or have the digitizer recognize handwriting, these operations require selecting a keyboard input mode and writing in a predefined portion of the digitizer, respectively. In short, requiring a user to tell the computer, for every new input, what the user wants to do makes stylus-based computing difficult for the average user. Accordingly, stylus-based inputs have been relegated to personal data assistants (PDAs) where significant user input is not possible. Mainstream computing still requires the use of at least a keyboard and mouse (or mouse-based input device, for example, trackballs, touch-pads, and other mouse substitutes).
- Accordingly, a need exists for permitting a user to perform all operations of a mouse-type device using a stylus.
- As discussed in the various copending patent applications incorporated herein by reference, aspects of the present invention are directed to a tablet-like computer that allows users to directly write on a display surface using a stylus. The display surface may physically, optically, and/or electromagnetically detect the stylus. The computer may allow the user to write and to edit, manipulate, and create objects through the use of the stylus. Many of the features discussed in these copending applications are more easily performed by use of the various aspects of the present invention discussed herein.
- An aspect of the present invention is directed to methods and apparatus for simulating gestures of a mouse by use of a stylus on a display surface. The present invention determines the operation a user wants to perform based on the user's input. This determination may include reference to other information including the location of the user's input on a digitizer (e.g., location on a screen) and the status of other objects or elements as displayed. By using this information, the system determines what the user wants to do and implements the action.
- A number of inputs with a stylus are possible. For example, a user may tap a stylus, stroke the stylus, hold the stylus at a given point, or hold then drag the stylus. Other inputs and combinations are possible as noted by the above-identified applications, which are expressly incorporated herein by reference.
- As to a stroke operation, the system may drag an object, may maintain a current state or operation, or may begin inking. Inking may include writing, drawing, or adding annotations as described in greater detail in U.S. Ser. No. 60/212,825, filed Jun. 21, 2000, entitled “Methods for Classifying, Anchoring, and Transforming Ink Annotations” and incorporated by reference.
- As to a tap operation, the system may add to existing writing, may select a new object, insert a cursor or insertion point, or may perform an action on a selected object.
- As to a hold operation, the system may simulate a right mouse button click or other definable event.
- As to a hold and drag operation, the system may drag a selected object or perform other functions.
- These and other features of the invention will be apparent upon consideration of the following detailed description of preferred embodiments. Although the invention has been defined using the appended claims, these claims are exemplary in that the invention is intended to include the elements and steps described herein in any combination or subcombination. Accordingly, there are any number of alternative combinations for defining the invention, which incorporate one or more elements from the specification, including the description, claims, and drawings, in various combinations or subcombinations. It will be apparent to those skilled in the relevant technology, in light of the present specification, that alternate combinations of aspects of the invention, either alone or in combination with one or more elements or steps defined herein, may be utilized as modifications or alterations of the invention or as part of the invention. It is intended that the written description of the invention contained herein covers all such modifications and alterations.
- The foregoing summary of the invention, as well as the following detailed description of preferred embodiments, is better understood when read in conjunction with the accompanying drawings, which are included by way of example, and not by way of limitation with regard to the claimed invention. In the accompanying drawings, elements are labeled with three-digit reference numbers, wherein the first digit of a reference number indicates the drawing number in which the element is first illustrated. The same reference number in different drawings refers to the same element.
-
FIG. 1 is a schematic diagram of a general-purpose digital computing environment that can be used to implement various aspects of the invention.
-
FIG. 2 is a plan view of a tablet computer and stylus that can be used in accordance with various aspects of the present invention.
-
FIGS. 3-7 are flowcharts showing a variety of steps for interpreting a user's input in accordance with embodiments of the present invention.
- The present invention may be more readily described with reference to
FIGS. 1-7. FIG. 1 illustrates a schematic diagram of a conventional general-purpose digital computing environment that can be used to implement various aspects of the present invention. In FIG. 1, a computer 100 includes a processing unit 110, a system memory 120, and a system bus 130 that couples various system components including the system memory to the processing unit 110. The system bus 130 may be any of several types of bus structures including a memory bus or memory controller, a peripheral bus, and a local bus using any of a variety of bus architectures. The system memory 120 includes read only memory (ROM) 140 and random access memory (RAM) 150. - A basic input/output system 160 (BIOS), containing the basic routines that help to transfer information between elements within the
computer 100, such as during start-up, is stored in the ROM 140. The computer 100 also includes a hard disk drive 170 for reading from and writing to a hard disk (not shown), a magnetic disk drive 180 for reading from or writing to a removable magnetic disk 190, and an optical disk drive 191 for reading from or writing to a removable optical disk 192 such as a CD ROM or other optical media. The hard disk drive 170, magnetic disk drive 180, and optical disk drive 191 are connected to the system bus 130 by a hard disk drive interface 192, a magnetic disk drive interface 193, and an optical disk drive interface 194, respectively. The drives and their associated computer-readable media provide nonvolatile storage of computer readable instructions, data structures, program modules and other data for the personal computer 100. It will be appreciated by those skilled in the art that other types of computer readable media that can store data that is accessible by a computer, such as magnetic cassettes, flash memory cards, digital video disks, Bernoulli cartridges, random access memories (RAMs), read only memories (ROMs), and the like, may also be used in the example operating environment. - A number of program modules can be stored on the
hard disk drive 170, magnetic disk 190, optical disk 192, ROM 140 or RAM 150, including an operating system 195, one or more application programs 196, other program modules 197, and program data 198. A user can enter commands and information into the computer 100 through input devices such as a keyboard 101 and pointing device 102. Other input devices (not shown) may include a microphone, joystick, game pad, satellite dish, scanner or the like. These and other input devices are often connected to the processing unit 110 through a serial port interface 106 that is coupled to the system bus, but may be connected by other interfaces, such as a parallel port, game port or a universal serial bus (USB). Further still, these devices may be coupled directly to the system bus 130 via an appropriate interface (not shown). A monitor 107 or other type of display device is also connected to the system bus 130 via an interface, such as a video adapter 108. In addition to the monitor, personal computers typically include other peripheral output devices (not shown), such as speakers and printers. In a preferred embodiment, a pen digitizer 165 and accompanying pen or stylus 166 are provided in order to digitally capture freehand input. Although a direct connection between the pen digitizer 165 and the processing unit 110 is shown, in practice, the pen digitizer 165 may be coupled to the processing unit 110 via a serial port, parallel port or other interface and the system bus 130 as known in the art. Furthermore, although the digitizer 165 is shown apart from the monitor 107, it is preferred that the usable input area of the digitizer 165 be co-extensive with the display area of the monitor 107. Further still, the digitizer 165 may be integrated in the monitor 107, or may exist as a separate device overlaying or otherwise appended to the monitor 107. - The
computer 100 can operate in a networked environment using logical connections to one or more remote computers, such as a remote computer 109. The remote computer 109 can be a server, a router, a network PC, a peer device or other common network node, and typically includes many or all of the elements described above relative to the computer 100, although only a memory storage device 111 has been illustrated in FIG. 1. The logical connections depicted in FIG. 1 include a local area network (LAN) 112 and a wide area network (WAN) 113. Such networking environments are commonplace in offices, enterprise-wide computer networks, intranets and the Internet. - When used in a LAN networking environment, the
computer 100 is connected to the local network 112 through a network interface or adapter 114. When used in a WAN networking environment, the personal computer 100 typically includes a modem 115 or other means for establishing communications over the wide area network 113, such as the Internet. The modem 115, which may be internal or external, is connected to the system bus 130 via the serial port interface 106. In a networked environment, program modules depicted relative to the personal computer 100, or portions thereof, may be stored in the remote memory storage device. - It will be appreciated that the network connections shown are exemplary and other techniques for establishing a communications link between the computers can be used. The existence of any of various well-known protocols such as TCP/IP, Ethernet, FTP, HTTP and the like is presumed, and the system can be operated in a client-server configuration to permit a user to retrieve web pages from a web-based server. Any of various conventional web browsers can be used to display and manipulate data on web pages.
-
FIG. 2 illustrates a tablet PC 201 that can be used in accordance with various aspects of the present invention. Any or all of the features, subsystems, and functions in the system of FIG. 1 can be included in the computer of FIG. 2. Tablet PC 201 includes a large display surface 202, e.g., a digitizing flat panel display, preferably, a liquid crystal display (LCD) screen, on which a plurality of windows 203 is displayed. Using stylus 204, a user can select, highlight, and write on the digitizing display area. Examples of suitable digitizing display panels include electromagnetic pen digitizers, such as the Mutoh or Wacom pen digitizers. Other types of pen digitizers, e.g., optical digitizers, may also be used. Tablet PC 201 interprets marks made using stylus 204 in order to manipulate data, enter text, and execute conventional computer application tasks such as spreadsheets, word processing programs, and the like. - A stylus could be equipped with buttons or other features to augment its selection capabilities. In one embodiment, a stylus could be implemented as a “pencil” or “pen”, in which one end constitutes a writing portion and the other end constitutes an “eraser” end, and which, when moved across the display, indicates portions of the display are to be erased. Other types of input devices, such as a mouse, trackball, or the like could be used. Additionally, a user's own finger could be used for selecting or indicating portions of the displayed image on a touch-sensitive or proximity-sensitive display. Consequently, the term “user input device”, as used herein, is intended to have a broad definition and encompasses many variations on well-known input devices.
-
Region 205 shows a feedback region or contact region permitting the user to determine where the stylus has contacted the digitizer. In another embodiment, the region 205 provides visual feedback when the hold status of the present invention has been reached. -
FIGS. 3-7 show various flowcharts for determining what a user wants to do based on a user's interaction with the digitizer. As will be discussed below, the user contacts the digitizer where the user wants to begin writing, tapping, annotating, dragging, etc. In the case where the digitizer is superimposed over a display, the user's contact is directed at operating on the currently displayed information at (or near) the contact point between the stylus and the display. - In
step 301, the system senses a contact or other indication of an action. In one embodiment, the contact may be the stylus contacting the surface of the digitizer. In another embodiment, the action may be bringing the tip of the stylus near the digitizer's surface. Further, if the stylus includes another signaling method (for example, a radio transmitter transmitting a signal to the digitizer signaling a user's input), the digitizer (or related input mechanism or mechanisms) interprets the received signal as a user's input. Other methods of starting an operation or writing or contact with a digitizer are known in the art. For purposes of illustration and description, the system and method reference physical contact with the digitizer. All other ways of providing signals to a processor are considered within the scope of the invention and are not mentioned here for simplicity. - In
step 302, the system determines the contact position and what lies beneath the contact position (for example, an object, a drawing, blank space, ink, and the like). In step 303, the system determines if the stylus has moved beyond a first threshold (time, distance, rate, or acceleration, and the like). In one embodiment, the threshold is set to the minimum resolvable movement. In another embodiment, the threshold is set higher to account for shaky hands or vibrations of the digitizer or tablet PC (for example, when trying to use the system while driving in a car over a bumpy road). It is noted that all objects may have the same threshold. Alternatively, objects may have different thresholds. This may depend on the object, the size of the object, the state of the system, the state of the object, and the like. - If the first threshold has been exceeded, then the system proceeds to step 304, where the user's input is classified as a stroke, and the system steps to point A 305. If the first threshold has not been exceeded, the system determines if the stylus was still in contact with the digitizer when a time threshold had expired in
step 306. If no (meaning that the stylus was no longer in contact with the digitizer surface), the system classifies the input as a tap in step 307 and proceeds to point B 308. - If the stylus was still in contact with the surface after the time threshold in
step 306, the system determines if a second move threshold was exceeded in step 309. The first and second move thresholds may be identical or different. For example, both may be 0.25 mm; or, the first may be 0.5 mm or one mm and the second 0.3 mm. Further, the first may be 1.2 mm or more and the second may be 0.5 mm or more. In short, any values may be used as long as they are not obtrusive to the user. The second threshold may be determined only after the time threshold of step 306 has expired. In this example, the second threshold may be higher than the first threshold (or it may be the same or smaller). - If the second move threshold was not exceeded, then the system classifies the input as a hold in
step 310 and proceeds to point C 311. If the second move threshold was exceeded, then the system classifies the input as a ‘hold and drag’ in step 312 and moves to point D 313. -
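The classification flow of FIG. 3 can be sketched as a small decision routine. The Python sketch below is illustrative only: the sample fields, the names `PenSample`, `Gesture`, and `classify`, and the particular threshold values are assumptions for the example (the patent leaves the thresholds configurable and gives several candidate distances), not part of the disclosure.

```python
from dataclasses import dataclass
from enum import Enum, auto
from math import hypot

class Gesture(Enum):
    STROKE = auto()         # step 304
    TAP = auto()            # step 307
    HOLD = auto()           # step 310
    HOLD_AND_DRAG = auto()  # step 312

@dataclass
class PenSample:
    x: float     # displacement from the initial contact point, mm
    y: float
    t: float     # seconds since initial contact
    down: bool   # stylus still touching the digitizer

def classify(samples, move1=0.5, hold_time=0.7, move2=0.3):
    """Classify a contact per FIG. 3: exceeding the first move threshold
    yields a stroke; lifting before the time threshold yields a tap;
    staying down past the time threshold yields a hold, or a hold-and-drag
    if the second move threshold is later exceeded."""
    for s in samples:
        dist = hypot(s.x, s.y)
        if s.t < hold_time:
            if dist > move1:
                return Gesture.STROKE   # first move threshold exceeded
            if not s.down:
                return Gesture.TAP      # lifted before the time threshold
        elif dist > move2:
            return Gesture.HOLD_AND_DRAG  # moved after the time threshold
    return Gesture.HOLD                   # stayed down, stayed put
```

A quick tap thus resolves from a single early sample (`classify([PenSample(0.1, 0.0, 0.2, False)])` yields `Gesture.TAP`), while hold versus hold-and-drag is only decided by samples arriving after the time threshold.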
FIG. 4 shows point A as starting point 401. Here, the system has classified the input as a stroke and begins stroke processing in step 402. In step 403, the system determines if the stroke started on a draggable object. If yes, the system determines in step 404 whether a drag threshold was exceeded (for example, 0.25 inches, 0.25 inches per second, and the like). If so, the system classifies the stroke as a drag in step 405 and performs a function that is dependent on the object. For example, the drag may extend a selection as described in greater detail in “Selection Handles in Editing Electronic Documents,” filed concurrently with the present application (attorney docket 03797.00069), and expressly incorporated by reference. Also, the drag may operate a bungee tool as described in Serial No. (Atty docket 3797.00070), entitled “Insertion Point Bungee Space Tool”, and filed concurrently with the present application, and expressly incorporated herein. - If, in
step 404, the drag threshold has not been exceeded, the system maintains the current state (with the object being selected or not) in step 407. If the stroke was not over a draggable object in step 403, the system determines if the area under the contact point is inkable in step 408. For example, inkable may mean an area capable of receiving ink (including drawings, annotations, or writing) as detailed in Ser. No. 60/212,825, filed Jun. 21, 2000, and expressly incorporated herein by reference for essential subject matter. By contrast, a control button (for copy, save, open, etc.) may not be inkable. If inkable in step 408, the system permits inking (drawing, writing, annotating, and other related functions) in step 409. If not inkable, the system maintains the current state (objects selected or not) in step 407. - In
FIG. 5A, the system starts at point B 501 and operates on the input as a tap 502. The system determines whether the tap was on an area or object that is inkable in step 503. If yes, the system determines whether any ink was recently added, or “wet” (for example, less than 0.5 or 1 second old), in step 504. If so, the system considers the tap as a dot to be added to the ink in step 505 (and adds the dot). If no wet ink exists, then the system determines if the tap was over a selectable object in step 506. The system then determines whether the tapped object was previously selected in step 507. If it was not, then the system selects the tapped object in step 508. If a previous object had been selected, the system cancels the previous or old selection in step 509. If the object was previously selected as determined by step 507, the system performs an action relevant to the object in step 510. This action may include editing the object or performing a predefined operation (for example, enlarge, shrink, and the like). From step 506, if the tap was not on a selectable object, then the system proceeds to point BB 512. -
FIG. 5B shows additional processing to FIG. 5A. At point BB 512, the system determines if the tap was in a space between text (referred to herein as an inline space) in step 513. If yes, the system places an insertion point at the tap point in step 514. As shown in a broken-lined box, the system may also cancel any old or previous selections in step 515. If no, then the system determines if the tap point has ink nearby in step 518. If the system determines that the tap was nearby ink, then the system adds a dot to the ink in step 516. If there was an old selection, then the system cancels the old selection in step 517 (as shown by a broken-line box). - If not nearby ink in step 518, the system determines if the tap is on an active object in step 519. If the tap was not on an active object, the system places an insertion point at the tap point or performs some other definable action in step 520. Again, if there was an old selection, then the system cancels the old selection in step 521 (as shown by a broken-line box). If the tap was on an active object as determined by step 519, the system performs an action in step 522. The action may be definable by the user or relate to any desirable function. In one embodiment, the action may be to operate a selection handle or bungee space tool as described in Ser. No. 60/247,973 (Attorney docket 3797.00069), “Selection Handles in Editing Electronic Documents,” filed concurrently with the present application and expressly incorporated by reference. Also, the drag may operate a bungee tool as described in Ser. No. 60/247,842 (Atty. docket 3797.00070), entitled “Insertion Point Bungee Space Tool”, and filed concurrently with the present application, and expressly incorporated herein. Other operations are known in the art and incorporated herein.
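The branching of FIGS. 4 through 5B amounts to two dispatch routines. The sketch below is a loose paraphrase under assumed names (`Target`, `handle_stroke`, `handle_tap`); the returned strings merely stand in for the actions the flowcharts invoke, and the 0.25-inch (6.35 mm) drag threshold is one of the example values mentioned in the text.

```python
from dataclasses import dataclass

@dataclass
class Target:
    """Properties of whatever lies under the contact point (step 302)."""
    draggable: bool = False
    inkable: bool = False
    selectable: bool = False
    selected: bool = False       # previously selected (step 507)
    inline_space: bool = False   # tap fell between text (step 513)
    ink_nearby: bool = False     # step 518
    active: bool = False         # active object (step 519)
    wet_ink: bool = False        # ink added less than ~1 s ago (step 504)

def handle_stroke(target, drag_distance, drag_threshold=6.35):
    """FIG. 4: a stroke on a draggable object that exceeds the drag
    threshold becomes a drag; otherwise the stroke inks if the area
    accepts ink, else the current state is maintained (step 407)."""
    if target.draggable:
        return "drag" if drag_distance > drag_threshold else "maintain state"
    return "ink" if target.inkable else "maintain state"

def handle_tap(target):
    """FIGS. 5A-5B: wet ink absorbs the tap as a dot; otherwise taps
    select, act on, or place an insertion point near the tap."""
    if target.inkable:
        if target.wet_ink:
            return "add dot"                          # step 505
        if target.selectable:
            return ("perform object action" if target.selected  # step 510
                    else "select object")             # steps 508-509
    # point BB (FIG. 5B)
    if target.inline_space:
        return "place insertion point"                # step 514
    if target.ink_nearby:
        return "add dot"                              # step 516
    if target.active:
        return "perform action"                       # step 522
    return "place insertion point"                    # step 520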
-
FIG. 6 relates to holding a stylus beyond a time threshold. Starting from point C 601, the system classifies the user input as a hold operation in step 602. Next, the system simulates a right mouse button click or other definable event in step 603. The functions associated with step 603 are described in greater detail in U.S. application Ser. No. 60/247,844 (Atty. docket 3797.00072), entitled “Simulating Gestures of a Mouse Using a Stylus and Providing Feedback Thereto”, filed Nov. 10, 2000, whose contents are expressly incorporated herein by reference. -
FIG. 7 relates to holding a stylus beyond a time threshold and moving the stylus. Starting from point D 701, the system classifies the user input as a hold and drag operation in step 702. Next, in step 703 the system drags the selected object as directed by the user. - There are a number of alternatives associated with dragging. If the hold and drag relates to an inline space, the system may use this hold and drag function to select text. Similarly, one may use this function to select a drawing encountered by the dragged stylus. Further, one may select both text and drawings in this manner. Also, the cursor's point may become a selection tool that leaves a trail behind it, allowing the user to loop a number of objects, drawings, or text. Looping the objects may result in selecting them.
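Taken together, FIGS. 5-7 amount to an initial classification of stylus input into four gestures (points B, C, and D). The following is a hypothetical sketch of that dispatcher; the parameter names and exact threshold semantics are assumptions:

```python
def classify_pen_input(moved_beyond_threshold, held_beyond_time,
                       moved_after_hold=False):
    """Classify raw stylus input into the gestures of FIGS. 5-7.

    moved_beyond_threshold: input crossed the movement threshold early on
    held_beyond_time: stylus stayed down past the time threshold
    moved_after_hold: stylus moved after the hold was established
    """
    if held_beyond_time:
        if moved_after_hold:
            return "hold_and_drag"   # point D, FIG. 7: drag the selected object
        return "hold"                # point C, FIG. 6: simulate a right-click
    if moved_beyond_threshold:
        return "stroke"              # drag or ink, handled separately
    return "tap"                     # point B, FIG. 5A
```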
- An alternate embodiment of the present invention relates to modifying ink drawings or annotations. For example, if one added an annotation (from step 409) to text, one may manipulate the text (for example, by inserting new text) and have the annotation track the manipulation. So, if one circled text and then added text to the circled text, the annotation would expand to include the added text as well. This is described in U.S. Ser. No. 60/212,825, filed Jun. 21, 2000, entitled “Methods for Classifying, Anchoring, and Transforming Ink Annotations” and incorporated by reference.
- While exemplary systems and methods embodying the present invention are shown by way of example, it will be understood, of course, that the invention is not limited to these embodiments. Modifications may be made by those skilled in the art, particularly in light of the foregoing teachings. For example, each of the elements of the aforementioned embodiments may be utilized alone or in combination with elements of the other embodiments.
Claims (20)
1. One or more computer-readable media storing computer-executable instructions that when executed perform operations comprising:
receiving user input at a digitizer;
determining whether the user input moves beyond a first threshold;
determining whether the user input ends before an amount of time; and
responsive to the user input failing to exceed the first threshold and ending before the amount of time, classifying the user input as a tap.
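Claim 1's tap test, with distance as the first threshold, might look like the sketch below. The sample format and default threshold values are hypothetical; per claim 3, the first threshold could equally be a rate or acceleration of movement:

```python
import math

def is_tap(samples, distance_threshold=5.0, time_limit=0.5):
    """Classify digitizer input as a tap per claim 1: the input never
    moves beyond the first (distance) threshold and ends before the
    time limit. `samples` is a list of (t, x, y) digitizer points."""
    if not samples:
        return False
    t0, x0, y0 = samples[0]
    duration = samples[-1][0] - t0
    # farthest excursion from the initial contact point
    max_dist = max(math.hypot(x - x0, y - y0) for _, x, y in samples)
    return max_dist <= distance_threshold and duration < time_limit
```

An input that wanders only a little and lifts quickly is a tap; one that exceeds the distance threshold would instead be classified as a stroke (claim 9).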
2. The one or more computer-readable media of claim 1 wherein the user input is caused by the digitizer detecting a user's finger in contact with the digitizer, and the user input ends when the digitizer no longer detects the user's finger in contact with the digitizer.
3. The one or more computer-readable media of claim 1 wherein the first threshold includes at least one of: a distance of movement, a rate of movement, an acceleration of movement, or any combination thereof.
4. The one or more computer-readable media of claim 1 wherein the first threshold is not changed based on an object associated with the user input.
5. The one or more computer-readable media of claim 1 wherein the first threshold depends on an object associated with the user input.
6. The one or more computer-readable media of claim 1 wherein the computer-executable instructions perform operations further comprising:
selecting an object within proximity of the tap when the object has not already been selected.
7. The one or more computer-readable media of claim 1 wherein the computer-executable instructions perform operations further comprising:
de-selecting an object within proximity of the tap when the object has already been selected.
8. The one or more computer-readable media of claim 1 wherein the computer-executable instructions perform operations further comprising:
placing an insertion point at a location within proximity of the tap when the location of the tap was within text.
9. One or more computer-readable media storing computer-executable instructions that when executed perform operations comprising:
receiving user input at a digitizer;
determining whether the user input moves beyond a first threshold; and
responsive to the user input moving beyond the first threshold, classifying the user input as a stroke.
10. The one or more computer-readable media of claim 9 wherein the user input is caused by the digitizer detecting a user's finger in contact with the digitizer.
11. The one or more computer-readable media of claim 9 wherein the first threshold includes at least one of: a distance of movement, a rate of movement, an acceleration of movement, or any combination thereof.
12. The one or more computer-readable media of claim 9 wherein the computer-executable instructions perform operations further comprising:
when the stroke started within proximity to a draggable object and a drag threshold had been exceeded, performing a function with the draggable object based on the stroke; and
when the drag threshold has not been exceeded, maintaining the draggable object at a current state.
13. The one or more computer-readable media of claim 9 wherein the computer-executable instructions perform operations further comprising:
when the stroke did not start within proximity to a draggable object, determining whether an area under the stroke is inkable; and
when the area under the stroke is inkable, performing inking based on the stroke.
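The stroke handling of claims 12 and 13 reduces to a short dispatch. This is a hypothetical sketch with assumed names; the claims are silent on the non-draggable, non-inkable case, so the fallback label is an assumption:

```python
def handle_stroke(near_draggable, drag_threshold_exceeded, area_inkable):
    """Dispatch a classified stroke per claims 12-13 (names hypothetical)."""
    if near_draggable:
        # claim 12: drag the object only once the drag threshold is exceeded;
        # otherwise the draggable object stays in its current state
        return "drag_object" if drag_threshold_exceeded else "no_change"
    # claim 13: stroke did not start near a draggable object;
    # ink only if the area under the stroke is inkable
    return "ink_stroke" if area_inkable else "no_action"
```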
14. A computing device, comprising:
one or more processors; and
one or more computer-readable media storing computer-executable instructions that when executed by the one or more processors perform operations comprising:
receiving user input at a digitizer;
determining whether the user input moves beyond a first threshold;
responsive to the user input moving beyond the first threshold, classifying the user input as a stroke;
determining whether the user input ends before an amount of time; and
responsive to the user input failing to exceed the first threshold within the amount of time and ending before the amount of time, classifying the user input as a tap.
15. The computing device of claim 14 wherein the user input is caused by the digitizer detecting a user's finger in contact with the digitizer, and the user input ends when the digitizer no longer detects the user's finger in contact with the digitizer.
16. The computing device of claim 14 wherein the computer-executable instructions perform operations further comprising:
selecting a first object within proximity of the tap when the first object has not already been selected; and
de-selecting a second object within proximity of the tap when the second object has already been selected.
17. The computing device of claim 14 wherein the computer-executable instructions perform operations further comprising:
placing an insertion point at a location within proximity of the tap when the location of the tap was within text.
18. The computing device of claim 14 wherein the computer-executable instructions perform operations further comprising:
when the stroke started within proximity to a draggable object and a drag threshold had been exceeded, performing a function with the draggable object based on the stroke; and
when the drag threshold has not been exceeded, maintaining the draggable object at a current state.
19. The computing device of claim 14 wherein the computer-executable instructions perform operations further comprising:
when the stroke did not start within proximity to a draggable object, determining whether an area under the stroke is inkable; and
when the area under the stroke is inkable, performing inking based on the stroke.
20. The computing device of claim 14 wherein the first threshold includes at least one of: a distance of movement, a rate of movement, an acceleration of movement, or any combination thereof.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US12/627,275 US20100066705A1 (en) | 2000-11-10 | 2009-11-30 | Highlevel active pen matrix |
US13/917,413 US20130293500A1 (en) | 2000-11-10 | 2013-06-13 | Highlevel Active Pen Matrix |
Applications Claiming Priority (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US24740000P | 2000-11-10 | 2000-11-10 | |
US09/736,170 US6897853B2 (en) | 2000-11-10 | 2000-12-15 | Highlevel active pen matrix |
US10/993,357 US7081889B2 (en) | 2000-11-10 | 2004-11-22 | Highlevel active pen matrix |
US11/202,034 US7626580B2 (en) | 2000-11-10 | 2005-08-12 | Highlevel active pen matrix |
US12/627,275 US20100066705A1 (en) | 2000-11-10 | 2009-11-30 | Highlevel active pen matrix |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US11/202,034 Continuation US7626580B2 (en) | 2000-11-10 | 2005-08-12 | Highlevel active pen matrix |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/917,413 Continuation US20130293500A1 (en) | 2000-11-10 | 2013-06-13 | Highlevel Active Pen Matrix |
Publications (1)
Publication Number | Publication Date |
---|---|
US20100066705A1 true US20100066705A1 (en) | 2010-03-18 |
Family
ID=26938659
Family Applications (6)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US09/736,170 Expired - Lifetime US6897853B2 (en) | 2000-11-10 | 2000-12-15 | Highlevel active pen matrix |
US10/993,353 Expired - Fee Related US7277089B2 (en) | 2000-11-10 | 2004-11-22 | Highlevel active pen matrix |
US10/993,357 Expired - Fee Related US7081889B2 (en) | 2000-11-10 | 2004-11-22 | Highlevel active pen matrix |
US11/202,034 Expired - Lifetime US7626580B2 (en) | 2000-11-10 | 2005-08-12 | Highlevel active pen matrix |
US12/627,275 Abandoned US20100066705A1 (en) | 2000-11-10 | 2009-11-30 | Highlevel active pen matrix |
US13/917,413 Abandoned US20130293500A1 (en) | 2000-11-10 | 2013-06-13 | Highlevel Active Pen Matrix |
Family Applications Before (4)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US09/736,170 Expired - Lifetime US6897853B2 (en) | 2000-11-10 | 2000-12-15 | Highlevel active pen matrix |
US10/993,353 Expired - Fee Related US7277089B2 (en) | 2000-11-10 | 2004-11-22 | Highlevel active pen matrix |
US10/993,357 Expired - Fee Related US7081889B2 (en) | 2000-11-10 | 2004-11-22 | Highlevel active pen matrix |
US11/202,034 Expired - Lifetime US7626580B2 (en) | 2000-11-10 | 2005-08-12 | Highlevel active pen matrix |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/917,413 Abandoned US20130293500A1 (en) | 2000-11-10 | 2013-06-13 | Highlevel Active Pen Matrix |
Country Status (4)
Country | Link |
---|---|
US (6) | US6897853B2 (en) |
EP (1) | EP1205836A3 (en) |
JP (2) | JP4809558B2 (en) |
CN (1) | CN1262910C (en) |
Cited By (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20080313568A1 (en) * | 2007-06-12 | 2008-12-18 | Samsung Electronics Co., Ltd. | Digital multimedia playback apparatus and control method thereof |
WO2013063241A1 (en) * | 2011-10-25 | 2013-05-02 | Barnesandnoble.Com Llc | Pen interface for a touch screen device |
US20130167058A1 (en) * | 2011-12-22 | 2013-06-27 | Microsoft Corporation | Closing applications |
US20130222301A1 (en) * | 2012-02-24 | 2013-08-29 | Samsung Electronics Co., Ltd. | Method and apparatus for moving contents in terminal |
US20130342485A1 (en) * | 2012-06-22 | 2013-12-26 | Samsung Electronics Co., Ltd. | Method for improving touch recognition and electronic device thereof |
US9015606B2 (en) | 2010-12-23 | 2015-04-21 | Microsoft Technology Licensing, Llc | Presenting an application change through a tile |
US9696888B2 (en) | 2010-12-20 | 2017-07-04 | Microsoft Technology Licensing, Llc | Application-launching interface for multiple modes |
US9841874B2 (en) | 2014-04-04 | 2017-12-12 | Microsoft Technology Licensing, Llc | Expandable application representation |
US10254955B2 (en) | 2011-09-10 | 2019-04-09 | Microsoft Technology Licensing, Llc | Progressively indicating new content in an application-selectable user interface |
US10289660B2 (en) | 2012-02-15 | 2019-05-14 | Apple Inc. | Device, method, and graphical user interface for sharing a content object in a document |
US10303325B2 (en) | 2011-05-27 | 2019-05-28 | Microsoft Technology Licensing, Llc | Multi-application environment |
US10579250B2 (en) | 2011-09-01 | 2020-03-03 | Microsoft Technology Licensing, Llc | Arranging tiles |
US10969944B2 (en) | 2010-12-23 | 2021-04-06 | Microsoft Technology Licensing, Llc | Application reporting in an application-selectable user interface |
US11272017B2 (en) | 2011-05-27 | 2022-03-08 | Microsoft Technology Licensing, Llc | Application notifications manifest |
Families Citing this family (158)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
GB9722766D0 (en) | 1997-10-28 | 1997-12-24 | British Telecomm | Portable computers |
US8510668B1 (en) | 2000-04-03 | 2013-08-13 | Google Inc. | Indicating potential focus in a user interface |
US9189069B2 (en) * | 2000-07-17 | 2015-11-17 | Microsoft Technology Licensing, Llc | Throwing gestures for mobile devices |
US7688306B2 (en) * | 2000-10-02 | 2010-03-30 | Apple Inc. | Methods and apparatuses for operating a portable device based on an accelerometer |
US6897853B2 (en) * | 2000-11-10 | 2005-05-24 | Microsoft Corp. | Highlevel active pen matrix |
US7345671B2 (en) | 2001-10-22 | 2008-03-18 | Apple Inc. | Method and apparatus for use of rotational user inputs |
US7312785B2 (en) | 2001-10-22 | 2007-12-25 | Apple Inc. | Method and apparatus for accelerated scrolling |
JP2003173226A (en) * | 2001-11-27 | 2003-06-20 | Internatl Business Mach Corp <Ibm> | Information processor, program and coordinate input method |
US6938221B2 (en) * | 2001-11-30 | 2005-08-30 | Microsoft Corporation | User interface for stylus-based user input |
US7333092B2 (en) | 2002-02-25 | 2008-02-19 | Apple Computer, Inc. | Touch pad for handheld device |
AU2003303837A1 (en) * | 2003-01-30 | 2004-08-23 | Fujitsu Limited | Handwriting-input device and method |
US8508508B2 (en) | 2003-02-14 | 2013-08-13 | Next Holdings Limited | Touch screen signal processing with single-point calibration |
US8456447B2 (en) | 2003-02-14 | 2013-06-04 | Next Holdings Limited | Touch screen signal processing |
US7629967B2 (en) | 2003-02-14 | 2009-12-08 | Next Holdings Limited | Touch screen signal processing |
US7426329B2 (en) | 2003-03-06 | 2008-09-16 | Microsoft Corporation | Systems and methods for receiving, storing, and rendering digital video, music, and pictures on a personal media player |
US7256773B2 (en) * | 2003-06-09 | 2007-08-14 | Microsoft Corporation | Detection of a dwell gesture by examining parameters associated with pen motion |
JP2007515691A (en) * | 2003-08-14 | 2007-06-14 | 株式会社エヌ・ティ・ティ・ドコモ | Direct data entry |
US20070152977A1 (en) | 2005-12-30 | 2007-07-05 | Apple Computer, Inc. | Illuminated touchpad |
US7499040B2 (en) | 2003-08-18 | 2009-03-03 | Apple Inc. | Movable touch pad with added functionality |
US7495659B2 (en) * | 2003-11-25 | 2009-02-24 | Apple Inc. | Touch pad for handheld device |
US8059099B2 (en) | 2006-06-02 | 2011-11-15 | Apple Inc. | Techniques for interactive input to portable electronic devices |
US7538759B2 (en) | 2004-05-07 | 2009-05-26 | Next Holdings Limited | Touch panel display system with illumination and detection provided from a single edge |
US20060007174A1 (en) * | 2004-07-06 | 2006-01-12 | Chung-Yi Shen | Touch control method for a drag gesture and control module thereof |
CN100555200C (en) | 2004-08-16 | 2009-10-28 | 苹果公司 | The method of the spatial resolution of touch sensitive devices and raising touch sensitive devices |
US7761814B2 (en) * | 2004-09-13 | 2010-07-20 | Microsoft Corporation | Flick gesture |
JP4583893B2 (en) * | 2004-11-19 | 2010-11-17 | 任天堂株式会社 | GAME PROGRAM AND GAME DEVICE |
US8787706B2 (en) | 2005-03-18 | 2014-07-22 | The Invention Science Fund I, Llc | Acquisition of a user expression and an environment of the expression |
US8823636B2 (en) | 2005-03-18 | 2014-09-02 | The Invention Science Fund I, Llc | Including environmental information in a manual expression |
US7809215B2 (en) | 2006-10-11 | 2010-10-05 | The Invention Science Fund I, Llc | Contextual information encoded in a formed expression |
US7873243B2 (en) | 2005-03-18 | 2011-01-18 | The Invention Science Fund I, Llc | Decoding digital information included in a hand-formed expression |
US7672512B2 (en) * | 2005-03-18 | 2010-03-02 | Searete Llc | Forms for completion with an electronic writing device |
US8232979B2 (en) | 2005-05-25 | 2012-07-31 | The Invention Science Fund I, Llc | Performing an action with respect to hand-formed expression |
US8290313B2 (en) * | 2005-03-18 | 2012-10-16 | The Invention Science Fund I, Llc | Electronic acquisition of a hand formed expression and a context of the expression |
US8340476B2 (en) * | 2005-03-18 | 2012-12-25 | The Invention Science Fund I, Llc | Electronic acquisition of a hand formed expression and a context of the expression |
US8229252B2 (en) * | 2005-03-18 | 2012-07-24 | The Invention Science Fund I, Llc | Electronic association of a user expression and a context of the expression |
US20060212430A1 (en) * | 2005-03-18 | 2006-09-21 | Searete Llc, A Limited Liability Corporation Of The State Of Delaware | Outputting a saved hand-formed expression |
US7486274B2 (en) * | 2005-08-18 | 2009-02-03 | Mitsubishi Electric Research Laboratories, Inc. | Method for stabilizing and precisely locating pointers generated by handheld direct pointing devices |
US7671837B2 (en) | 2005-09-06 | 2010-03-02 | Apple Inc. | Scrolling input arrangements using capacitive sensors on a flexible membrane |
US7880729B2 (en) | 2005-10-11 | 2011-02-01 | Apple Inc. | Center button isolation ring |
US20070152983A1 (en) | 2005-12-30 | 2007-07-05 | Apple Computer, Inc. | Touch pad with symbols based on mode |
US7509588B2 (en) | 2005-12-30 | 2009-03-24 | Apple Inc. | Portable electronic device with interface reconfiguration mode |
JP4700539B2 (en) * | 2006-03-22 | 2011-06-15 | パナソニック株式会社 | Display device |
US7773075B2 (en) | 2006-03-22 | 2010-08-10 | Panasonic Corporation | Display apparatus |
US8743060B2 (en) | 2006-07-06 | 2014-06-03 | Apple Inc. | Mutual capacitance touch sensing device |
US8022935B2 (en) | 2006-07-06 | 2011-09-20 | Apple Inc. | Capacitance sensing electrode with integrated I/O mechanism |
US9360967B2 (en) | 2006-07-06 | 2016-06-07 | Apple Inc. | Mutual capacitance touch sensing device |
US8564544B2 (en) | 2006-09-06 | 2013-10-22 | Apple Inc. | Touch screen device, method, and graphical user interface for customizing display of content category icons |
US10313505B2 (en) | 2006-09-06 | 2019-06-04 | Apple Inc. | Portable multifunction device, method, and graphical user interface for configuring and displaying widgets |
US9304675B2 (en) | 2006-09-06 | 2016-04-05 | Apple Inc. | Portable electronic device for instant messaging |
US7795553B2 (en) | 2006-09-11 | 2010-09-14 | Apple Inc. | Hybrid button |
US20080065722A1 (en) * | 2006-09-11 | 2008-03-13 | Apple Computer, Inc. | Media device playlists |
US20080066135A1 (en) * | 2006-09-11 | 2008-03-13 | Apple Computer, Inc. | Search user interface for media device |
US20080062137A1 (en) * | 2006-09-11 | 2008-03-13 | Apple Computer, Inc. | Touch actuation controller for multi-state media presentation |
US8243017B2 (en) | 2006-09-11 | 2012-08-14 | Apple Inc. | Menu overlay including context dependent menu icon |
US9565387B2 (en) * | 2006-09-11 | 2017-02-07 | Apple Inc. | Perspective scale video with navigation menu |
US20080154573A1 (en) * | 2006-10-02 | 2008-06-26 | Microsoft Corporation | Simulating new input devices using old input devices |
US8274479B2 (en) | 2006-10-11 | 2012-09-25 | Apple Inc. | Gimballed scroll wheel |
US8482530B2 (en) | 2006-11-13 | 2013-07-09 | Apple Inc. | Method of capacitively sensing finger position |
US7844915B2 (en) | 2007-01-07 | 2010-11-30 | Apple Inc. | Application programming interfaces for scrolling operations |
US8689132B2 (en) | 2007-01-07 | 2014-04-01 | Apple Inc. | Portable electronic device, method, and graphical user interface for displaying electronic documents and lists |
US8451232B2 (en) | 2007-01-07 | 2013-05-28 | Apple Inc. | Portable multifunction device, method, and graphical user interface for interacting with user input elements in displayed content |
US8519964B2 (en) | 2007-01-07 | 2013-08-27 | Apple Inc. | Portable multifunction device, method, and graphical user interface supporting user navigations of graphical objects on a touch screen display |
WO2008087638A1 (en) * | 2007-01-16 | 2008-07-24 | N-Trig Ltd. | System and method for calibration of a capacitive touch digitizer system |
EP2135155B1 (en) * | 2007-04-11 | 2013-09-18 | Next Holdings, Inc. | Touch screen system with hover and click input methods |
TWI367436B (en) * | 2007-05-15 | 2012-07-01 | Htc Corp | Method for operating user interfaces of handheld device |
US20080307367A1 (en) * | 2007-06-07 | 2008-12-11 | John Michael Garrison | Method and apparatus for a drag and drop operation implementing a hierarchical path name |
US7932896B2 (en) * | 2007-06-13 | 2011-04-26 | Apple Inc. | Techniques for reducing jitter for taps |
US9772751B2 (en) | 2007-06-29 | 2017-09-26 | Apple Inc. | Using gestures to slide between user interfaces |
US9654104B2 (en) | 2007-07-17 | 2017-05-16 | Apple Inc. | Resistive force sensor with capacitive discrimination |
WO2009029764A1 (en) | 2007-08-30 | 2009-03-05 | Next Holdings, Inc. | Low profile touch panel systems |
US8432377B2 (en) * | 2007-08-30 | 2013-04-30 | Next Holdings Limited | Optical touchscreen with improved illumination |
US11126321B2 (en) | 2007-09-04 | 2021-09-21 | Apple Inc. | Application menu user interface |
US8619038B2 (en) | 2007-09-04 | 2013-12-31 | Apple Inc. | Editing interface |
AU2016213886B2 (en) * | 2007-09-04 | 2018-02-22 | Apple Inc. | Editing interface |
US7910843B2 (en) | 2007-09-04 | 2011-03-22 | Apple Inc. | Compact input device |
US8683378B2 (en) | 2007-09-04 | 2014-03-25 | Apple Inc. | Scrolling techniques for user interfaces |
US9619143B2 (en) | 2008-01-06 | 2017-04-11 | Apple Inc. | Device, method, and graphical user interface for viewing application launch icons |
KR20090029138A (en) * | 2007-09-17 | 2009-03-20 | 삼성전자주식회사 | The method of inputting user command by gesture and the multimedia apparatus thereof |
US8416198B2 (en) | 2007-12-03 | 2013-04-09 | Apple Inc. | Multi-dimensional scroll wheel |
US8116569B2 (en) * | 2007-12-21 | 2012-02-14 | Microsoft Corporation | Inline handwriting recognition and correction |
US8255822B2 (en) * | 2007-12-21 | 2012-08-28 | Microsoft Corporation | Incorporated handwriting input experience for textboxes |
JP5239328B2 (en) * | 2007-12-21 | 2013-07-17 | ソニー株式会社 | Information processing apparatus and touch motion recognition method |
US8064702B2 (en) * | 2007-12-21 | 2011-11-22 | Microsoft Corporation | Handwriting templates |
US20090207144A1 (en) * | 2008-01-07 | 2009-08-20 | Next Holdings Limited | Position Sensing System With Edge Positioning Enhancement |
US8405636B2 (en) * | 2008-01-07 | 2013-03-26 | Next Holdings Limited | Optical position sensing system and optical position sensor assembly |
US20090213093A1 (en) * | 2008-01-07 | 2009-08-27 | Next Holdings Limited | Optical position sensor using retroreflection |
US8125461B2 (en) | 2008-01-11 | 2012-02-28 | Apple Inc. | Dynamic input graphic display |
US8820133B2 (en) | 2008-02-01 | 2014-09-02 | Apple Inc. | Co-extruded materials and methods |
US9454256B2 (en) | 2008-03-14 | 2016-09-27 | Apple Inc. | Sensor configurations of an input device that are switchable based on mode |
US8296670B2 (en) * | 2008-05-19 | 2012-10-23 | Microsoft Corporation | Accessing a menu utilizing a drag-operation |
US20090327886A1 (en) * | 2008-06-27 | 2009-12-31 | Microsoft Corporation | Use of secondary factors to analyze user intention in gui element activation |
US8816967B2 (en) | 2008-09-25 | 2014-08-26 | Apple Inc. | Capacitive sensor having electrodes arranged on the substrate and the flex circuit |
KR20110066198A (en) * | 2008-10-02 | 2011-06-16 | 넥스트 홀딩즈 리미티드 | Stereo optical sensors for resolving multi-touch in a touch detection system |
JP5299892B2 (en) * | 2008-10-15 | 2013-09-25 | 任天堂株式会社 | Display control program and information processing apparatus |
JP5176870B2 (en) * | 2008-10-28 | 2013-04-03 | 富士通株式会社 | Information processing apparatus and input control method |
US8395590B2 (en) | 2008-12-17 | 2013-03-12 | Apple Inc. | Integrated contact switch and touch sensor elements |
US9354751B2 (en) | 2009-05-15 | 2016-05-31 | Apple Inc. | Input device with optimized capacitive sensing |
US8352884B2 (en) | 2009-05-21 | 2013-01-08 | Sony Computer Entertainment Inc. | Dynamic reconfiguration of GUI display decomposition based on predictive model |
US20120327009A1 (en) * | 2009-06-07 | 2012-12-27 | Apple Inc. | Devices, methods, and graphical user interfaces for accessibility using a touch-sensitive surface |
US8681106B2 (en) | 2009-06-07 | 2014-03-25 | Apple Inc. | Devices, methods, and graphical user interfaces for accessibility using a touch-sensitive surface |
US8872771B2 (en) | 2009-07-07 | 2014-10-28 | Apple Inc. | Touch sensing device having conductive nodes |
JP2011028635A (en) * | 2009-07-28 | 2011-02-10 | Sony Corp | Display control apparatus, display control method and computer program |
JP5127792B2 (en) | 2009-08-18 | 2013-01-23 | キヤノン株式会社 | Information processing apparatus, control method therefor, program, and recording medium |
US9310907B2 (en) | 2009-09-25 | 2016-04-12 | Apple Inc. | Device, method, and graphical user interface for manipulating user interface objects |
WO2011037558A1 (en) * | 2009-09-22 | 2011-03-31 | Apple Inc. | Device, method, and graphical user interface for manipulating user interface objects |
US8766928B2 (en) * | 2009-09-25 | 2014-07-01 | Apple Inc. | Device, method, and graphical user interface for manipulating user interface objects |
US8832585B2 (en) | 2009-09-25 | 2014-09-09 | Apple Inc. | Device, method, and graphical user interface for manipulating workspace views |
US8799826B2 (en) * | 2009-09-25 | 2014-08-05 | Apple Inc. | Device, method, and graphical user interface for moving a calendar entry in a calendar application |
DE102009043719A1 (en) * | 2009-10-01 | 2011-04-07 | Deutsche Telekom Ag | Method for entering commands on a touch-sensitive surface |
US20120212440A1 (en) * | 2009-10-19 | 2012-08-23 | Sharp Kabushiki Kaisha | Input motion analysis method and information processing device |
JP5433375B2 (en) * | 2009-10-23 | 2014-03-05 | 楽天株式会社 | Terminal device, function execution method, function execution program, and information processing system |
WO2011066343A2 (en) * | 2009-11-24 | 2011-06-03 | Next Holdings Limited | Methods and apparatus for gesture recognition mode control |
US20110199387A1 (en) * | 2009-11-24 | 2011-08-18 | John David Newton | Activating Features on an Imaging Device Based on Manipulations |
EP2507692A2 (en) * | 2009-12-04 | 2012-10-10 | Next Holdings Limited | Imaging methods and systems for position detection |
US8539386B2 (en) * | 2010-01-26 | 2013-09-17 | Apple Inc. | Device, method, and graphical user interface for selecting and moving objects |
US8539385B2 (en) * | 2010-01-26 | 2013-09-17 | Apple Inc. | Device, method, and graphical user interface for precise positioning of objects |
US8612884B2 (en) | 2010-01-26 | 2013-12-17 | Apple Inc. | Device, method, and graphical user interface for resizing objects |
US20110234542A1 (en) * | 2010-03-26 | 2011-09-29 | Paul Marson | Methods and Systems Utilizing Multiple Wavelengths for Position Detection |
US10788976B2 (en) | 2010-04-07 | 2020-09-29 | Apple Inc. | Device, method, and graphical user interface for managing folders with multiple pages |
US8881060B2 (en) | 2010-04-07 | 2014-11-04 | Apple Inc. | Device, method, and graphical user interface for managing folders |
US9098182B2 (en) | 2010-07-30 | 2015-08-04 | Apple Inc. | Device, method, and graphical user interface for copying user interface objects between content regions |
US9081494B2 (en) | 2010-07-30 | 2015-07-14 | Apple Inc. | Device, method, and graphical user interface for copying formatting attributes |
US8972879B2 (en) * | 2010-07-30 | 2015-03-03 | Apple Inc. | Device, method, and graphical user interface for reordering the front-to-back positions of objects |
JP5664164B2 (en) * | 2010-11-18 | 2015-02-04 | 株式会社リコー | Electronic information board device, information display method, program |
US8797283B2 (en) | 2010-11-22 | 2014-08-05 | Sony Computer Entertainment America Llc | Method and apparatus for performing user-defined macros |
TWI530847B (en) * | 2010-12-27 | 2016-04-21 | 聯詠科技股份有限公司 | Click gesture determination method, touch control chip, touch control system and computer system |
US8907903B2 (en) | 2011-01-13 | 2014-12-09 | Sony Computer Entertainment America Llc | Handing control of an object from one touch input to another touch input |
EP2487570B1 (en) * | 2011-02-11 | 2019-10-16 | BlackBerry Limited | Electronic device and method of controlling same |
US8810529B2 (en) | 2011-02-11 | 2014-08-19 | Blackberry Limited | Electronic device and method of controlling same |
JP5618926B2 (en) * | 2011-07-11 | 2014-11-05 | 株式会社セルシス | Multipointing device control method and program |
CN102890574A (en) * | 2011-07-21 | 2013-01-23 | 鸿富锦精密工业(深圳)有限公司 | Touch device and mouse using same |
US9612670B2 (en) * | 2011-09-12 | 2017-04-04 | Microsoft Technology Licensing, Llc | Explicit touch selection and cursor placement |
EP2607999B1 (en) * | 2011-12-23 | 2017-10-18 | Deutsche Telekom AG | Method for controlling an electric device, in particular a portable electric device, electric device, computer program and computer program product |
JP5991509B2 (en) * | 2012-03-02 | 2016-09-14 | コニカミノルタ株式会社 | Information processing apparatus and program |
US20130246975A1 (en) * | 2012-03-15 | 2013-09-19 | Chandar Kumar Oddiraju | Gesture group selection |
JP5945926B2 (en) * | 2012-03-26 | 2016-07-05 | コニカミノルタ株式会社 | Operation display device |
US9575652B2 (en) | 2012-03-31 | 2017-02-21 | Microsoft Technology Licensing, Llc | Instantiable gesture objects |
US8881269B2 (en) | 2012-03-31 | 2014-11-04 | Apple Inc. | Device, method, and graphical user interface for integrating recognition of handwriting gestures with a screen reader |
JP6273671B2 (en) * | 2013-01-22 | 2018-02-07 | セイコーエプソン株式会社 | Projector, display system, and projector control method |
US9134814B2 (en) | 2012-04-05 | 2015-09-15 | Seiko Epson Corporation | Input device, display system and input method |
CN103383658B (en) * | 2012-05-04 | 2017-09-29 | 腾讯科技(深圳)有限公司 | Remember the method and server of mobile terminal operation |
US10222975B2 (en) * | 2012-08-27 | 2019-03-05 | Apple Inc. | Single contact scaling gesture |
JP2014044605A (en) * | 2012-08-28 | 2014-03-13 | Fujifilm Corp | Input control device and method in touch-sensitive display, and program |
US8949735B2 (en) | 2012-11-02 | 2015-02-03 | Google Inc. | Determining scroll direction intent |
US9335913B2 (en) | 2012-11-12 | 2016-05-10 | Microsoft Technology Licensing, Llc | Cross slide gesture |
KR102405189B1 (en) | 2013-10-30 | 2022-06-07 | 애플 인크. | Displaying relevant user interface objects |
DE102013224979A1 (en) * | 2013-12-05 | 2015-06-11 | Volkswagen Aktiengesellschaft | Motor vehicle operating device with touch-sensitive input surface |
US10146424B2 (en) | 2014-02-28 | 2018-12-04 | Dell Products, Lp | Display of objects on a touch screen and their selection |
US11955236B2 (en) | 2015-04-20 | 2024-04-09 | Murj, Inc. | Systems and methods for managing patient medical devices |
CN105468278B (en) * | 2015-11-06 | 2019-07-19 | 网易(杭州)网络有限公司 | Contact action identification, response, game control method and the device of virtual key |
CN105468279B (en) * | 2015-11-06 | 2019-08-23 | 网易(杭州)网络有限公司 | Contact action identification and response method, device and game control method, device |
DK201670595A1 (en) | 2016-06-11 | 2018-01-22 | Apple Inc | Configuring context-specific user interfaces |
US11816325B2 (en) | 2016-06-12 | 2023-11-14 | Apple Inc. | Application shortcuts for carplay |
US10452197B2 (en) | 2016-06-23 | 2019-10-22 | Wacom Co., Ltd. | Threshold based coordinate data generation providing tap assist |
US11675476B2 (en) | 2019-05-05 | 2023-06-13 | Apple Inc. | User interfaces for widgets |
US11194467B2 (en) | 2019-06-01 | 2021-12-07 | Apple Inc. | Keyboard management user interfaces |
CN114356121B (en) * | 2021-12-06 | 2023-07-14 | 深圳市千分一智能技术有限公司 | Active pen use discriminating method and apparatus, active pen and computer storage medium |
US11456072B1 (en) | 2022-03-15 | 2022-09-27 | Murj, Inc. | Systems and methods to distribute cardiac device advisory data |
Family Cites Families (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPS62224820A (en) * | 1986-03-26 | 1987-10-02 | Hitachi Ltd | Switching device for coordinate input mode |
JPH06175775A (en) * | 1992-12-09 | 1994-06-24 | Hitachi Ltd | Information processor |
US5572651A (en) * | 1993-10-15 | 1996-11-05 | Xerox Corporation | Table-based user interface for retrieving and manipulating indices between data structures |
US5504439A (en) * | 1994-04-01 | 1996-04-02 | Xilinx, Inc. | I/O interface cell for use with optional pad |
JPH07302306A (en) * | 1994-05-09 | 1995-11-14 | Hitachi Ltd | Character inputting device |
JP3181181B2 (en) * | 1994-11-11 | 2001-07-03 | シャープ株式会社 | Document information processing device |
JPH09319502A (en) * | 1996-05-28 | 1997-12-12 | Toshiba Corp | Information equipment provided with display integrated type coordinate input device |
US6128007A (en) * | 1996-07-29 | 2000-10-03 | Motorola, Inc. | Method and apparatus for multi-mode handwritten input and hand directed control of a computing device |
GB2317090B (en) | 1996-09-06 | 2001-04-04 | Quantel Ltd | An electronic graphic system |
JPH10187339A (en) * | 1996-12-25 | 1998-07-14 | Nec Corp | Slide pad with keyboard function |
US5986655A (en) * | 1997-10-28 | 1999-11-16 | Xerox Corporation | Method and system for indexing and controlling the playback of multimedia documents |
US6018346A (en) * | 1998-01-12 | 2000-01-25 | Xerox Corporation | Freeform graphics system having meeting objects for supporting meeting objectives |
JP3385965B2 (en) * | 1998-04-20 | 2003-03-10 | セイコーエプソン株式会社 | Input device and input method |
JP3519007B2 (en) * | 1999-01-29 | 2004-04-12 | シャープ株式会社 | Information device having map information display function, map information display method, and recording medium recording map information display program |
- 2000-12-15 US US09/736,170 patent/US6897853B2/en not_active Expired - Lifetime
- 2001-10-10 JP JP2001312524A patent/JP4809558B2/en not_active Expired - Fee Related
- 2001-11-07 EP EP01126625A patent/EP1205836A3/en not_active Withdrawn
- 2001-11-09 CN CNB011378743A patent/CN1262910C/en not_active Expired - Fee Related
- 2004-11-22 US US10/993,353 patent/US7277089B2/en not_active Expired - Fee Related
- 2004-11-22 US US10/993,357 patent/US7081889B2/en not_active Expired - Fee Related
- 2005-08-12 US US11/202,034 patent/US7626580B2/en not_active Expired - Lifetime
- 2009-11-30 US US12/627,275 patent/US20100066705A1/en not_active Abandoned
- 2011-07-14 JP JP2011156074A patent/JP5211211B2/en not_active Expired - Fee Related
- 2013-06-13 US US13/917,413 patent/US20130293500A1/en not_active Abandoned
Patent Citations (101)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US2143875A (en) * | 1934-12-11 | 1939-01-17 | Rca Corp | Multiplex facsimile printer system |
US4534060A (en) * | 1983-08-09 | 1985-08-06 | Pencept, Inc. | Method and apparatus for removing noise at the ends of a stroke |
US4608658A (en) * | 1984-04-13 | 1986-08-26 | Pencept, Inc. | Method and apparatus for removing noise at the ends of a stroke caused by retracing |
US4686332A (en) * | 1986-06-26 | 1987-08-11 | International Business Machines Corporation | Combined finger touch and stylus detection system for use on the viewing surface of a visual display device |
US4899138A (en) * | 1987-01-10 | 1990-02-06 | Pioneer Electronic Corporation | Touch panel control device with touch time and finger direction discrimination |
US4988981A (en) * | 1987-03-17 | 1991-01-29 | Vpl Research, Inc. | Computer data entry and manipulation apparatus and method |
US4988981B1 (en) * | 1987-03-17 | 1999-05-18 | Vpl Newco Inc | Computer data entry and manipulation apparatus and method |
US4982618A (en) * | 1987-11-03 | 1991-01-08 | Culver Craig F | Multifunction tactile manipulatable control |
US4954817A (en) * | 1988-05-02 | 1990-09-04 | Levine Neil A | Finger worn graphic interface device |
US5442795A (en) * | 1988-05-27 | 1995-08-15 | Wang Laboratories, Inc. | System and method for viewing icon contents on a video display |
US6115043A (en) * | 1988-05-27 | 2000-09-05 | Kodak Limited | Data processing system with folder means for associating a plurality of reduced size images in a stacked arrangement |
US4933670A (en) * | 1988-07-21 | 1990-06-12 | Picker International, Inc. | Multi-axis trackball |
US5060135A (en) * | 1988-09-16 | 1991-10-22 | Wang Laboratories, Inc. | Apparatus for manipulating documents in a data processing system utilizing reduced images of sheets of information which are movable |
US5231578A (en) * | 1988-11-01 | 1993-07-27 | Wang Laboratories, Inc. | Apparatus for document annotation and manipulation using images from a window source |
US5147155A (en) * | 1988-11-15 | 1992-09-15 | Molnlycke Ab | Device for achieving uniform distribution of airborne fibres, e.g. cellulose-fibres |
US5327161A (en) * | 1989-08-09 | 1994-07-05 | Microtouch Systems, Inc. | System and method for emulating a mouse input device with a touchpad input device |
US5347295A (en) * | 1990-10-31 | 1994-09-13 | Go Corporation | Control of a computer through a position-sensed stylus |
US5491495A (en) * | 1990-11-13 | 1996-02-13 | Wang Laboratories, Inc. | User interface having simulated devices |
US5666113A (en) * | 1991-07-31 | 1997-09-09 | Microtouch Systems, Inc. | System for using a touchpad input device for cursor control and keyboard emulation |
US5485171A (en) * | 1991-10-04 | 1996-01-16 | Micromed Systems, Inc. | Hand held computer input apparatus and method |
US5404458A (en) * | 1991-10-10 | 1995-04-04 | International Business Machines Corporation | Recognizing the cessation of motion of a pointing device on a display by comparing a group of signals to an anchor point |
US5294792A (en) * | 1991-12-31 | 1994-03-15 | Texas Instruments Incorporated | Writing tip position sensing and processing apparatus |
US5751260A (en) * | 1992-01-10 | 1998-05-12 | The United States Of America As Represented By The Secretary Of The Navy | Sensory integrated data interface |
US5539427A (en) * | 1992-02-10 | 1996-07-23 | Compaq Computer Corporation | Graphic indexing system |
US5483261A (en) * | 1992-02-14 | 1996-01-09 | Itu Research, Inc. | Graphical input controller and method with rear screen image detection |
US5548705A (en) * | 1992-04-15 | 1996-08-20 | Xerox Corporation | Wiping metaphor as a user interface for operating on graphical objects on an interactive graphical display |
US5404439A (en) * | 1992-04-15 | 1995-04-04 | Xerox Corporation | Time-space object containment for graphical user interface |
US5523775A (en) * | 1992-05-26 | 1996-06-04 | Apple Computer, Inc. | Method for selecting objects on a computer display |
US5602570A (en) * | 1992-05-26 | 1997-02-11 | Capps; Stephen P. | Method for deleting objects on a computer display |
US5625377A (en) * | 1992-05-27 | 1997-04-29 | Apple Computer, Inc. | Method for controlling a computerized organizer |
US5592566A (en) * | 1992-05-27 | 1997-01-07 | Apple Computer, Incorporated | Method and apparatus for computerized recognition |
US5463696A (en) * | 1992-05-27 | 1995-10-31 | Apple Computer, Inc. | Recognition system and method for user inputs to a computer system |
US5621817A (en) * | 1992-05-27 | 1997-04-15 | Apple Computer, Inc. | Pointer-based computer system capable of aligning geometric figures |
US5544295A (en) * | 1992-05-27 | 1996-08-06 | Apple Computer, Inc. | Method and apparatus for indicating a change in status of an object and its disposition using animation |
US5596694A (en) * | 1992-05-27 | 1997-01-21 | Apple Computer, Inc. | Method and apparatus for indicating a change in status of an object and its disposition using animation |
US6610936B2 (en) * | 1992-06-08 | 2003-08-26 | Synaptics, Inc. | Object position detector with edge motion feature and gesture recognition |
US5861583A (en) * | 1992-06-08 | 1999-01-19 | Synaptics, Incorporated | Object position detector |
US5543590A (en) * | 1992-06-08 | 1996-08-06 | Synaptics, Incorporated | Object position detector with edge motion feature |
US5488204A (en) * | 1992-06-08 | 1996-01-30 | Synaptics, Incorporated | Paintbrush stylus for capacitive touch sensor pad |
US5880411A (en) * | 1992-06-08 | 1999-03-09 | Synaptics, Incorporated | Object position detector with edge motion feature and gesture recognition |
US5543591A (en) * | 1992-06-08 | 1996-08-06 | Synaptics, Incorporated | Object position detector with edge motion feature and gesture recognition |
US5280276A (en) * | 1992-07-10 | 1994-01-18 | Quickshot (Bvi) Ltd. | Combination mouse/trackball input device |
US5612719A (en) * | 1992-12-03 | 1997-03-18 | Apple Computer, Inc. | Gesture sensitive buttons for graphical user interfaces |
US5596698A (en) * | 1992-12-22 | 1997-01-21 | Morgan; Michael W. | Method and apparatus for recognizing handwritten inputs in a computerized teaching system |
US5513309A (en) * | 1993-01-05 | 1996-04-30 | Apple Computer, Inc. | Graphic editor user interface for a pointer-based computer system |
US5920694A (en) * | 1993-03-19 | 1999-07-06 | Ncr Corporation | Annotation of computer video displays |
US5613019A (en) * | 1993-05-20 | 1997-03-18 | Microsoft Corporation | System and methods for spacing, storing and recognizing electronic representations of handwriting, printing and drawings |
US5555363A (en) * | 1993-09-30 | 1996-09-10 | Apple Computer, Inc. | Resetting the case of text on a computer display |
US5594810A (en) * | 1993-09-30 | 1997-01-14 | Apple Computer, Inc. | Method and apparatus for recognizing gestures on a computer system |
US5534893A (en) * | 1993-12-15 | 1996-07-09 | Apple Computer, Inc. | Method and apparatus for using stylus-tablet input in a computer system |
US6094197A (en) * | 1993-12-21 | 2000-07-25 | Xerox Corporation | Graphical keyboard |
US5488392A (en) * | 1994-04-28 | 1996-01-30 | Harris; Thomas S. | Precision, absolute mapping computer pointing device and versatile accessories |
US5546527A (en) * | 1994-05-23 | 1996-08-13 | International Business Machines Corporation | Overriding action defaults in direct manipulation of objects on a user interface by hovering a source object |
US5559943A (en) * | 1994-06-27 | 1996-09-24 | Microsoft Corporation | Method and apparatus customizing a dual actuation setting of a computer input device switch |
US5666438A (en) * | 1994-07-29 | 1997-09-09 | Apple Computer, Inc. | Method and apparatus for recognizing handwriting of different users of a pen-based computer system |
US6262719B1 (en) * | 1994-09-02 | 2001-07-17 | Packard Bell Nec, Inc. | Mouse emulation with a passive pen |
US5640178A (en) * | 1994-09-16 | 1997-06-17 | Fujitsu Limited | Pointing device |
US5945979A (en) * | 1994-11-17 | 1999-08-31 | International Business Machines Corporation | Combined digital and analog cursor control |
US5805144A (en) * | 1994-12-14 | 1998-09-08 | Dell Usa, L.P. | Mouse pointing device having integrated touchpad |
US5760773A (en) * | 1995-01-06 | 1998-06-02 | Microsoft Corporation | Methods and apparatus for interacting with data objects using action handles |
US5670955A (en) * | 1995-01-31 | 1997-09-23 | Microsoft Corporation | Method and apparatus for generating directional and force vector in an input device |
US5764218A (en) * | 1995-01-31 | 1998-06-09 | Apple Computer, Inc. | Method and apparatus for contacting a touch-sensitive cursor-controlling input device to generate button values |
US5926567A (en) * | 1995-03-01 | 1999-07-20 | Compaq Computer Corporation | Method and apparatus for storing and rapidly displaying graphic data |
US5590567A (en) * | 1995-03-14 | 1997-01-07 | Delco Electronics Corporation | Snap retainer and retainer system |
US5757368A (en) * | 1995-03-27 | 1998-05-26 | Cirque Corporation | System and method for extending the drag function of a computer pointing device |
US5748926A (en) * | 1995-04-18 | 1998-05-05 | Canon Kabushiki Kaisha | Data processing method and apparatus |
US5781181A (en) * | 1995-07-25 | 1998-07-14 | Alps Electric Co., Ltd. | Apparatus and method for changing an operation mode of a coordinate input apparatus |
US5666499A (en) * | 1995-08-04 | 1997-09-09 | Silicon Graphics, Inc. | Clickaround tool-based graphical interface with two cursors |
US5856822A (en) * | 1995-10-27 | 1999-01-05 | O2 Micro, Inc. | Touch-pad digital computer pointing-device |
US5943043A (en) * | 1995-11-09 | 1999-08-24 | International Business Machines Corporation | Touch panel "double-touch" input method and detection apparatus |
US5757361A (en) * | 1996-03-20 | 1998-05-26 | International Business Machines Corporation | Method and apparatus in computer systems to selectively map tablet input devices using a virtual boundary |
US6118427A (en) * | 1996-04-18 | 2000-09-12 | Silicon Graphics, Inc. | Graphical user interface with optimal transparency thresholds for maximizing user performance and system efficiency |
US6049329A (en) * | 1996-06-04 | 2000-04-11 | International Business Machines Corporation | Method of and system for facilitating user input into a small GUI window using a stylus |
US5864635A (en) * | 1996-06-14 | 1999-01-26 | International Business Machines Corporation | Distinguishing gestures from handwriting in a pen based computer by stroke analysis |
US5812118A (en) * | 1996-06-25 | 1998-09-22 | International Business Machines Corporation | Method, apparatus, and memory for creating at least two virtual pointing devices |
US5861886A (en) * | 1996-06-26 | 1999-01-19 | Xerox Corporation | Method and apparatus for grouping graphic objects on a computer based system having a graphical user interface |
US5943044A (en) * | 1996-08-05 | 1999-08-24 | Interlink Electronics | Force sensing semiconductive touchpad |
US6208329B1 (en) * | 1996-08-13 | 2001-03-27 | Lsi Logic Corporation | Supplemental mouse button emulation system, method and apparatus for a coordinate based data input device |
US5907327A (en) * | 1996-08-28 | 1999-05-25 | Alps Electric Co., Ltd. | Apparatus and method regarding drag locking with notification |
US5926179A (en) * | 1996-09-30 | 1999-07-20 | Sony Corporation | Three-dimensional virtual reality space display processing apparatus, a three-dimensional virtual reality space display processing method, and an information providing medium |
US5898424A (en) * | 1996-09-30 | 1999-04-27 | Gateway 2000, Inc. | Pointing device with differing actuation forces for primary and secondary buttons |
US6061051A (en) * | 1997-01-17 | 2000-05-09 | Tritech Microelectronics | Command set for touchpad pen-input mouse |
US6057830A (en) * | 1997-01-17 | 2000-05-02 | Tritech Microelectronics International Ltd. | Touchpad mouse controller |
US5883622A (en) * | 1997-01-17 | 1999-03-16 | Tritech Microelectronics International Ltd. | Touchpad pen-input controller |
US5880717A (en) * | 1997-03-14 | 1999-03-09 | Tritech Microelectronics International, Ltd. | Automatic cursor motion control for a touchpad mouse |
US5910800A (en) * | 1997-06-11 | 1999-06-08 | Microsoft Corporation | Usage tips for on-screen touch-sensitive controls |
US6266050B1 (en) * | 1997-08-08 | 2001-07-24 | Samsung Electronics Co., Ltd. | Portable computer having touch pad input control function |
US5912659A (en) * | 1997-09-03 | 1999-06-15 | International Business Machines Corporation | Graphics display pointer with integrated selection |
US6173287B1 (en) * | 1998-03-11 | 2001-01-09 | Digital Equipment Corporation | Technique for ranking multimedia annotations of interest |
US6677930B2 (en) * | 1998-04-01 | 2004-01-13 | Fujitsu Takamisawa Component Ltd | Mouse |
US6204837B1 (en) * | 1998-07-13 | 2001-03-20 | Hewlett-Packard Company | Computing apparatus having multiple pointing devices |
US6414700B1 (en) * | 1998-07-21 | 2002-07-02 | Silicon Graphics, Inc. | System for accessing a large number of menu items using a zoned menu bar |
US6339431B1 (en) * | 1998-09-30 | 2002-01-15 | Kabushiki Kaisha Toshiba | Information presentation apparatus and method |
US6930672B1 (en) * | 1998-10-19 | 2005-08-16 | Fujitsu Limited | Input processing method and input control apparatus |
US6342906B1 (en) * | 1999-02-02 | 2002-01-29 | International Business Machines Corporation | Annotation layer for synchronous collaboration |
US6557042B1 (en) * | 1999-03-19 | 2003-04-29 | Microsoft Corporation | Multimedia summary generation employing user feedback |
US6847350B2 (en) * | 1999-12-16 | 2005-01-25 | Hewlett-Packard Development Company, L.P. | Optical pointing device |
US20020015064A1 (en) * | 2000-08-07 | 2002-02-07 | Robotham John S. | Gesture-based user interface to multi-level and multi-modal sets of bit-maps |
US6897853B2 (en) * | 2000-11-10 | 2005-05-24 | Microsoft Corp. | Highlevel active pen matrix |
US7081889B2 (en) * | 2000-11-10 | 2006-07-25 | Microsoft Corporation | Highlevel active pen matrix |
US7626580B2 (en) * | 2000-11-10 | 2009-12-01 | Microsoft Corporation | Highlevel active pen matrix |
Cited By (28)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20120140102A1 (en) * | 2007-06-12 | 2012-06-07 | Samsung Electronics Co., Ltd. | Digital multimedia playback apparatus and control method thereof |
US20080313568A1 (en) * | 2007-06-12 | 2008-12-18 | Samsung Electronics Co., Ltd. | Digital multimedia playback apparatus and control method thereof |
US9696888B2 (en) | 2010-12-20 | 2017-07-04 | Microsoft Technology Licensing, Llc | Application-launching interface for multiple modes |
US10969944B2 (en) | 2010-12-23 | 2021-04-06 | Microsoft Technology Licensing, Llc | Application reporting in an application-selectable user interface |
US11126333B2 (en) | 2010-12-23 | 2021-09-21 | Microsoft Technology Licensing, Llc | Application reporting in an application-selectable user interface |
US9015606B2 (en) | 2010-12-23 | 2015-04-21 | Microsoft Technology Licensing, Llc | Presenting an application change through a tile |
US9229918B2 (en) | 2010-12-23 | 2016-01-05 | Microsoft Technology Licensing, Llc | Presenting an application change through a tile |
US11272017B2 (en) | 2011-05-27 | 2022-03-08 | Microsoft Technology Licensing, Llc | Application notifications manifest |
US10303325B2 (en) | 2011-05-27 | 2019-05-28 | Microsoft Technology Licensing, Llc | Multi-application environment |
US10579250B2 (en) | 2011-09-01 | 2020-03-03 | Microsoft Technology Licensing, Llc | Arranging tiles |
US10254955B2 (en) | 2011-09-10 | 2019-04-09 | Microsoft Technology Licensing, Llc | Progressively indicating new content in an application-selectable user interface |
US9134849B2 (en) | 2011-10-25 | 2015-09-15 | Nook Digital, Llc | Pen interface for a touch screen device |
WO2013063241A1 (en) * | 2011-10-25 | 2013-05-02 | Barnesandnoble.Com Llc | Pen interface for a touch screen device |
US9223472B2 (en) * | 2011-12-22 | 2015-12-29 | Microsoft Technology Licensing, Llc | Closing applications |
US20130167058A1 (en) * | 2011-12-22 | 2013-06-27 | Microsoft Corporation | Closing applications |
US10191633B2 (en) | 2011-12-22 | 2019-01-29 | Microsoft Technology Licensing, Llc | Closing applications |
US10803235B2 (en) | 2012-02-15 | 2020-10-13 | Apple Inc. | Device, method, and graphical user interface for sharing a content object in a document |
US11783117B2 (en) | 2012-02-15 | 2023-10-10 | Apple Inc. | Device, method, and graphical user interface for sharing a content object in a document |
US10289660B2 (en) | 2012-02-15 | 2019-05-14 | Apple Inc. | Device, method, and graphical user interface for sharing a content object in a document |
AU2013223015B2 (en) * | 2012-02-24 | 2018-02-22 | Samsung Electronics Co., Ltd. | Method and apparatus for moving contents in terminal |
US10437360B2 (en) * | 2012-02-24 | 2019-10-08 | Samsung Electronics Co., Ltd. | Method and apparatus for moving contents in terminal |
US20130222301A1 (en) * | 2012-02-24 | 2013-08-29 | Samsung Electronics Co., Ltd. | Method and apparatus for moving contents in terminal |
RU2649945C2 (en) * | 2012-06-22 | 2018-04-05 | Самсунг Электроникс Ко., Лтд. | Method for improving touch recognition and electronic device thereof |
US9588607B2 (en) * | 2012-06-22 | 2017-03-07 | Samsung Electronics Co., Ltd. | Method for improving touch recognition and electronic device thereof |
CN103513822A (en) * | 2012-06-22 | 2014-01-15 | 三星电子株式会社 | Method for improving touch recognition and electronic device thereof |
US20130342485A1 (en) * | 2012-06-22 | 2013-12-26 | Samsung Electronics Co., Ltd. | Method for improving touch recognition and electronic device thereof |
US10459607B2 (en) | 2014-04-04 | 2019-10-29 | Microsoft Technology Licensing, Llc | Expandable application representation |
US9841874B2 (en) | 2014-04-04 | 2017-12-12 | Microsoft Technology Licensing, Llc | Expandable application representation |
Also Published As
Publication number | Publication date |
---|---|
US20130293500A1 (en) | 2013-11-07 |
EP1205836A2 (en) | 2002-05-15 |
EP1205836A3 (en) | 2009-07-22 |
JP2011204282A (en) | 2011-10-13 |
US20050088423A1 (en) | 2005-04-28 |
US20060033751A1 (en) | 2006-02-16 |
CN1360249A (en) | 2002-07-24 |
US6897853B2 (en) | 2005-05-24 |
US20020056575A1 (en) | 2002-05-16 |
US20050088422A1 (en) | 2005-04-28 |
JP4809558B2 (en) | 2011-11-09 |
US7277089B2 (en) | 2007-10-02 |
US7081889B2 (en) | 2006-07-25 |
US7626580B2 (en) | 2009-12-01 |
JP2002189567A (en) | 2002-07-05 |
CN1262910C (en) | 2006-07-05 |
JP5211211B2 (en) | 2013-06-12 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US7626580B2 (en) | | Highlevel active pen matrix |
US6903730B2 (en) | | In-air gestures for electromagnetic coordinate digitizers |
US7319454B2 (en) | | Two-button mouse input using a stylus |
US7499035B2 (en) | | Focus management using in-air points |
US7106312B2 (en) | | Text input window with auto-growth |
US6791536B2 (en) | | Simulating gestures of a pointing device using a stylus and providing feedback thereto |
US7810042B2 (en) | | Page bar control |
US7174042B1 (en) | | System and method for automatically recognizing electronic handwriting in an electronic document and converting to text |
US7002558B2 (en) | | Mode hinting and switching |
EP1538549A1 (en) | | Scaled text replacement of digital ink |
EP1686449A2 (en) | | System Control By Stylus Location |
RU2328030C2 (en) | | Focusing control involving points corresponding to stylus position over digitiser surface |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| AS | Assignment | Owner name: MICROSOFT TECHNOLOGY LICENSING, LLC, WASHINGTON. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNOR: MICROSOFT CORPORATION; REEL/FRAME: 034564/0001. Effective date: 20141014 |
| STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |