
US20120326965A1 - Methods and apparatus for processing combinations of kinematical inputs


Info

Publication number
US20120326965A1
Authority
US
Grant status
Application
Prior art keywords
gesture
input
device
force
embodiment
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US13447909
Inventor
Omar S. Leung
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Apple Inc
Original Assignee
Apple Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 9/00 - Measuring inclination, e.g. by clinometers, by levels
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING; COUNTING
    • G06F - ELECTRICAL DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/017 - Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING; COUNTING
    • G06F - ELECTRICAL DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 - Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/033 - Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F 3/0354 - Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
    • G06F 3/03543 - Mice or pucks

Abstract

Methods and apparatus for processing combinations of force and velocity data generated over a given period of time. In one embodiment, an input device comprising one or more force sensors and one or more motion sensors is manipulated relative to a surface. A receiving system is adapted to receive input sequences or “gestures” which are triggered upon the occurrence of one or more conditions detected by the input device. An application executing in the receiving system may be implemented such that the system responds differently to each specific gesture provided by the user.

Description

    FIELD OF THE INVENTION
  • [0001]
    The present invention relates generally to the field of data input. More particularly, the present invention is directed in one exemplary aspect to an input device adapted to process combinations of force and velocity data.
  • SUMMARY OF THE INVENTION
  • [0002]
    The present invention is directed in one exemplary aspect to an input device capable of detecting its present velocity as well as various forces exerted upon it. Force and velocity data are generated after receiving input from one or more sensory modules comprised within the input device. If the force and velocity data from a target interval match one or more predetermined gesture profiles, a receiving device can then process the identified gesture accordingly.
  • [0003]
    Some embodiments of the present invention therefore enable a user to provide a series of gestures as input to the receiving device. Such gestures may include, for example, brushing motions, scooping motions, nudges, tilt and slides, and tilt and taps. The application can then respond to each gesture (or gesture combination) in any number of ways.
  • [0004]
    Embodiments of the present invention may therefore have applicability to any electronic system or application capable of receiving input. For example, embodiments of the present invention may be useful with video games, file browsing, interactive navigation, communication systems, control systems, military systems, medical devices, and industrial applications.
  • [0005]
    In a first aspect of the invention, an input system is disclosed. In one embodiment, the input system comprises: a force sensor; a velocity sensor; and a processor in communication with the force and velocity sensors and operable to determine whether at least one of a plurality of gestures has been performed using an input device.
  • [0006]
    In a second aspect of the invention, a method is disclosed. In one embodiment, the method comprises: estimating a force applied to an apparatus during a first period; estimating a velocity associated with the apparatus during the first period; and selecting one of a plurality of predetermined gestures based at least in part upon the force and the velocity.
  • [0007]
    In a third aspect of the invention, a computer readable medium is disclosed. In one embodiment, the computer readable medium comprises instructions which, when executed by a computer, perform a process comprising: estimating a velocity attained by an input device; estimating a force exerted upon the input device; and determining whether one of a plurality of predetermined gestures has been performed using the input device based at least in part on the estimated force and velocity.
  • [0008]
    In a fourth aspect of the invention, a system is disclosed. In one embodiment, the system comprises: an input device, comprising one or more force modules and one or more velocity modules; a processor in communication with the force and velocity modules and operable to determine whether at least one of a plurality of gestures has been performed using the input device; a computing device in communication with the processor and operable to receive a signal from the processor, wherein the signal is indicative of the determined at least one gesture; and a display device in communication with the computing device for displaying at least one object response in accordance with the at least one gesture.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • [0009]
    FIG. 1 is a block diagram illustrating a typical environment in which an input device may be used according to one embodiment of the present invention.
  • [0010]
    FIG. 2 is a system diagram of a modular arrangement of an input device according to one embodiment of the present invention.
  • [0011]
    FIG. 3 is a flow diagram illustrating a method of indicating a gesture according to one embodiment of the present invention.
  • [0012]
    FIG. 4 is a block diagram illustrating an input device with an applied velocity according to one embodiment of the present invention.
  • [0013]
    FIG. 5a is a block diagram illustrating a plurality of forces exerted upon an input device according to one embodiment of the present invention.
  • [0014]
    FIG. 5b is a block diagram illustrating a surface reacting to a downward force applied to an input device according to one embodiment of the present invention.
  • [0015]
    FIG. 5c is a block diagram illustrating a surface reacting to a lateral force applied to an input device according to one embodiment of the present invention.
  • [0016]
    FIG. 6a is a set of graphs depicting a first gesture profile according to one embodiment of the present invention.
  • [0017]
    FIG. 6b is a block diagram illustrating the gesture depicted by FIG. 6a.
  • [0018]
    FIG. 7 is a flow diagram illustrating a method of implementing the gesture illustrated in FIGS. 6a and 6b.
  • [0019]
    FIG. 8a is a set of graphs depicting a second gesture profile according to one embodiment of the present invention.
  • [0020]
    FIG. 8b is a block diagram illustrating the gesture depicted by FIG. 8a.
  • [0021]
    FIG. 9 is a flow diagram illustrating a method of implementing the gesture illustrated in FIGS. 8a and 8b.
  • [0022]
    FIG. 10a is a set of graphs depicting a third gesture profile according to one embodiment of the present invention.
  • [0023]
    FIG. 10b is a block diagram illustrating the gesture depicted by FIG. 10a.
  • [0024]
    FIG. 11 is a flow diagram illustrating a method of implementing the gesture illustrated in FIGS. 10a and 10b.
  • [0025]
    FIG. 12 is a block diagram depicting a fourth gesture profile according to one embodiment of the present invention.
  • [0026]
    FIG. 13 is a block diagram depicting a fifth gesture profile according to one embodiment of the present invention.
  • [0027]
    FIG. 14 is a flow diagram illustrating a method of implementing the gestures illustrated in FIGS. 12 and 13.
  • [0028]
    FIG. 15a is a block diagram depicting a method of enabling a sixth gesture according to one embodiment of the present invention.
  • [0029]
    FIG. 15b is a block diagram depicting a method of triggering the gesture depicted in FIG. 15a.
  • [0030]
    FIG. 16 is a flow diagram illustrating a method of implementing the gesture profile depicted in FIGS. 15a and 15b.
  • DETAILED DESCRIPTION OF THE EXEMPLARY EMBODIMENTS
  • [0031]
    In the following description of exemplary embodiments, reference is made to the accompanying drawings which form a part hereof, and in which it is shown by way of illustration specific embodiments in which the invention may be practiced. It is to be understood that other embodiments may be utilized and structural changes may be made without departing from the scope of the present invention.
  • [0032]
    As used herein, the term “application” includes without limitation any unit of executable software that implements a specific functionality or theme. The unit of executable software may run in a predetermined environment; for example, a downloadable Java Xlet™ which runs within the JavaTV™ environment.
  • [0033]
    As used herein, the terms “computer program” and “software” include without limitation any sequence of human or machine cognizable steps that are adapted to be processed by a computer. Such may be rendered in any programming language or environment including, for example, C/C++, Fortran, COBOL, PASCAL, Perl, Prolog, assembly language, scripting languages, markup languages (e.g., HTML, SGML, XML, VoXML), functional languages (e.g., APL, Erlang, Haskell, Lisp, ML, F# and Scheme), as well as object-oriented environments such as the Common Object Request Broker Architecture (CORBA), Java™ (including J2ME, Java Beans, etc.).
  • [0034]
    As used herein, the term “display” includes any type of device adapted to display information, including without limitation cathode ray tube displays (CRTs), liquid crystal displays (LCDs), thin film transistor displays (TFTs), digital light processor displays (DLPs), plasma displays, light emitting diodes (LEDs) or diode arrays, incandescent devices, and fluorescent devices. Display devices may also include less dynamic devices such as printers, e-ink devices, and other similar structures.
  • [0035]
    As used herein, the term “interface” refers to any signal or data interface with a component or network including, without limitation, those compliant with USB (e.g., USB2), FireWire (e.g., IEEE 1394b), Ethernet (e.g., 10/100, 10/100/1000 Gigabit Ethernet, 10-Gig-E, etc.), MoCA, Serial ATA (e.g., SATA, e-SATA, SATAII), Ultra-ATA/DMA, Coaxsys (e.g., TVnet™), radio frequency tuner (e.g., in-band or OOB, cable modem, etc.), modem, WiFi (802.11a,b,g,n), WiMAX (802.16), PAN (802.15), or IrDA families.
  • [0036]
    As used herein, the term “memory” includes any type of integrated circuit or other storage device adapted for storing digital data including, without limitation, ROM, PROM, EEPROM, DRAM, SDRAM, DDR/2 SDRAM, EDO/FPMS, RLDRAM, SRAM, “flash” memory (e.g., NAND/NOR), and PSRAM.
  • [0037]
    As used herein, the term “module” refers to any unit or combination of units incorporating software, firmware, hardware, or any combination thereof that is designed and configured to perform a desired function.
  • [0038]
    As used herein, the terms “processor,” “microprocessor,” and “digital processor” refer to all types of digital processing devices including, without limitation, digital signal processors (DSPs), reduced instruction set computers (RISC), general-purpose (CISC) processors, microprocessors, gate arrays (e.g., FPGAs), programmable logic devices (PLDs), reconfigurable compute fabrics (RCFs), array processors, and application-specific integrated circuits (ASICs). Such processors may be contained on a single unitary IC die, or distributed across multiple components.
  • [0039]
    As used herein, the terms “receiving device” and “receiver” include without limitation video game consoles, set-top boxes, televisions, personal computers (whether desktop, laptop, or otherwise), digital video recorders, communications equipment, terminals, and display devices.
  • [0040]
    As used herein, the term “wireless” refers to any wireless signal, data, communication, or other interface including, without limitation, Wi-Fi, Bluetooth, 3G, HSDPA/HSUPA, TDMA, CDMA (e.g., IS-95A, WCDMA, etc.), FHSS, DSSS, GSM, PAN/802.15, WiMAX (802.16), 802.20, narrowband/FDMA, OFDM, PCS/DCS, analog cellular, CDPD, satellite systems, millimeter wave or microwave systems, acoustic, and infrared (i.e., IrDA).
  • [0041]
    FIG. 1 is a block diagram illustrating a typical environment in which an input device can be used according to one embodiment of the present invention. An input device 102 is initially positioned upon a surface 104 such as a desk or a tabletop. In order to generate input data, a user manipulates the input device 102 relative to the surface 104. In one embodiment, input data may be generated by the input device 102 without the use of buttons.
  • [0042]
    Note that in FIG. 1, the surface 104 is depicted as being flat or substantially flat. However, neither condition is necessary according to embodiments of the present invention. Also note that in some embodiments, the surface 104 need not necessarily be situated beneath the input device 102. For example, the surface 104 may be tilted, situated above the input device 102, inverted, or vertically oriented. Also note that in some embodiments, multiple surfaces 104 can be utilized.
  • [0043]
    A receiving device (such as the depicted computer 106) is adapted to receive input data generated from the input device 102. In one embodiment, the receiving device comprises at least one interface adapted to receive the generated data. The input device 102 can communicate with the receiving device over a wireless communication link (such as, for example, WiFi or Bluetooth) or over a wired communication link (such as a serial bus cable or other physical connector).
  • [0044]
    The receiving device is adapted to display a navigational object (for example, a pointer, cursor, selector box, or other such indicator) upon its display screen 108. During operation, when the user manipulates the input device 102 relative to the surface 104, the input signals are transmitted to the computer 106 and the navigational object responds according to the user's input. It is understood that the receiving device can be any type of computing device having a display such as an iMac™ computer or a personal computer having a separate display monitor, for example. Other types of computing devices having a display or in communication with a display (e.g., by a wired or wireless communication link) would be readily apparent to those of ordinary skill in the art.
  • [0045]
    FIG. 2 is a system diagram of a modular arrangement of the input device 102 according to one embodiment of the present invention. The input device 102 houses a printed circuit board 204 enabling communication and data transfer between the connected modules.
  • [0046]
    A power supply 206 provides a source of power to modules electrically connected to the printed circuit board 204. In some embodiments, power is supplied externally by one or more conductive wires, for example, from a power cable or a serial bus cable. In other embodiments, a battery may be used as a source of power.
  • [0047]
    A memory 212 comprises any type of module adapted to enable digital information to be stored, retained, and retrieved. Additionally, the memory 212 may comprise any combination of volatile and non-volatile storage devices, including without limitation RAM, DRAM, SRAM, ROM, and/or flash memory. Note also that the memory 212 may be organized in any number of architectural configurations utilizing, for example, registers, memory caches, data buffers, main memory, mass storage, and/or removable media. In some embodiments, the memory 212 is adapted to store gesture profiles comprising force and velocity thresholds and/or applicable ranges for each type of gesture. In alternative embodiments, the gesture profiles are stored in a remote memory source (e.g., the hard drive of the computer 106), and the input device 102 transmits raw data to the computer 106 for storage and processing by one or more memories and processors (not shown) contained within the receiving device.
  • [0048]
    One or more processors 208 are adapted to execute sequences of instructions by loading and storing data to the memory 212. Possible instructions include, without limitation, instructions for data conversions, formatting operations, communication instructions, and/or storage and retrieval operations. Additionally, the processors 208 may comprise any type of digital processing devices including, for example, reduced instruction set computer processors, general-purpose processors, microprocessors, digital signal processors, gate arrays, programmable logic devices, reconfigurable compute fabrics, array processors, and/or application-specific integrated circuits. Note also that the processors 208 may be contained on a single unitary IC die or distributed across multiple components.
  • [0049]
    An interface module 216 enables data to be transmitted and/or received between two or more devices. In one embodiment, data transmitted to a receiving device is first packetized and processed according to one or more standardized network protocols. The interface module 216 may accommodate any wired or wireless protocol including, without limitation, USB, FireWire, Ethernet, Gigabit Ethernet, MoCA, radio frequency tuners, modems, WiFi, Bluetooth, WiMax, and/or Infrared Data Association.
  • [0050]
    One or more motion sensors 210 enable the input device 102 to determine velocity values during a given instant, or alternatively, over a given period of time. Various types of motion sensors, such as gyroscopes, accelerometers, optical sensors, etc. may be used with the present invention. In one embodiment, the motion sensors 210 comprise one or more accelerometers adapted to detect the current acceleration of the input device 102. The accelerometers can measure acceleration of the input device 102 by any number of means, including, for example, inclination sensing, vibration, and/or shock values. In one embodiment, the input device 102 comprises at least one sensor which can track movement of the input device 102 irrespective of any interaction or contact with the surface 104.
  • [0051]
    In some embodiments, velocity values are determined by logic adapted to integrate a detected acceleration quantity. In one embodiment, the motion sensors 210 are implemented as part of a micro electro-mechanical system (MEMS). Optionally, the micro electro-mechanical system may comprise a dedicated microprocessor adapted to interact with one or more microsensors responsible for receiving external data.
  • [0052]
    In other embodiments, the motion of the input device 102 is externally determined, such as by a tracking system or one or more transceivers located within communicative range of the input device 102. In one embodiment, the time it takes the transceivers to receive subsequent signals transmitted by the input device 102 is used as a basis to calculate the past or present positional data of the input device 102.
  • [0053]
    One or more force sensors 214 determine the forces applied to the input device 102. Forces such as a normal force, shear force, frictional force, angular force, or any combination thereof may be measured and utilized in accordance with one or more embodiments of the present invention. Any type of force sensor or force sensor combination may be used for accomplishing force detection, including, for example, switch sensors, contact sensors, and/or accelerometers. Additionally, the force sensors 214 can comprise lightsource/slit/photosensor combinations, or one or more capacitive sensors on a compliant support. In one embodiment, the force sensors 214 detect force independently from the motion sensors 210. Also, according to one embodiment, forces are sensed as surface reactions to one or more input forces (as described in more detail below; see FIGS. 5a-5c and accompanying text).
  • [0054]
    FIG. 3 is a flow diagram illustrating a method of indicating a gesture according to one embodiment of the present invention. At step 302, force values and velocity values are initially received. In some embodiments, force and velocity values are sensed directly from external environmental conditions. In other embodiments, the force and velocity values are derived after receiving data from one or more sensed inputs. For example, in one embodiment, velocity data is derived from integrating a sensed acceleration value or from differentiating a sensed positional value.
  • [0055]
    At decision block 304 it is determined whether a gesture has in fact been triggered. In one embodiment, a “gesture profile” (which may be stored either locally or remotely) comprises a set of force and velocity ranges and/or timing information which together define the corresponding gesture. Note that force and velocity data need not necessarily match their respective force and velocity ranges simultaneously in order for the gesture to be triggered. For example, in one embodiment, a “tilt and tap” gesture requires a force spike to be detected after a tilt has been detected and not released. In the tilt and tap gesture, the gesture is said to be “enabled” after the tilt occurs (i.e. the input device 102 enters a state where it is now possible for the gesture to be triggered, but the gesture will not be triggered until the occurrence of the last temporal condition required by the gesture profile).
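    For concreteness, a gesture profile of the kind described above might be represented as follows. This is a hypothetical sketch; the field names and types are assumptions rather than the patent's data format.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class GestureProfile:
    """Illustrative container for the force/velocity ranges and timing
    information said to define a gesture."""
    name: str
    velocity_range: Tuple[float, float]  # (min, max) velocity, m/s
    force_range: Tuple[float, float]     # (min, max) normal force, N
    max_enable_to_trigger_s: Optional[float] = None  # timing constraint, if any
    enabled: bool = False  # set once the enabling condition has occurred
```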
  • [0056]
    The gesture is then indicated at step 306. In one embodiment, the indication comprises a signal or a set of data transmitted from the input device 102 to a receiving device. In another embodiment, the indication may be generated externally. After the gesture has been indicated, control of the method resumes at step 302 so that additional gestures can be detected.
  • [0057]
    FIG. 4 is a block diagram indicating an input device 102 with an applied velocity 404 according to one embodiment of the present invention. Notice that the velocity 404 comprises both a magnitude (length of arrow) and a direction, thus forming a velocity vector. As mentioned above, in some embodiments, one or more motion sensors 210 (such as accelerometers) comprised within the input device 102 are responsible for determining the velocity of the input device 102. In one embodiment, contact, positioning, or movement about the surface 104 is not necessary for the present velocity 404 of the input device 102 to be determined.
  • [0058]
    In some embodiments, velocity data is not generated when the input device 102 is stationary. However, in one embodiment, gesture profiles may still be defined for a stationary input device 102 by using force sensor 214 inputs. For example, a “press” gesture may be defined as a downward force (above and beyond normal gravitational forces) which is exerted upon a stationary input device 102.
  • [0059]
    Alternatively, frictional forces associated with a contact surface 104 may overtake lateral forces exerted upon the input device 102 by the user, thus preventing the input device 102 from achieving a non-zero velocity, while still registering force data associated with a defined gesture profile. In other embodiments, a stationary input device 102 may attain a velocity after one or more dynamic forces have been applied to it.
  • [0060]
    FIG. 5a is a block diagram indicating an input device 102 having a plurality of forces applied to it according to one embodiment of the present invention. As depicted in the figure, a downward force 502 pushes the input device 102 against the surface 104, while a lateral force 506 pushes the input device 102 across the surface 104. In cases where multiple forces are simultaneously acting upon the input device 102 (i.e. forces comprising separate magnitudes and directions), standard techniques known in physics may be used to calculate a scalar quantity from a given set of force vectors (for example, by calculating a dot product).
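    As a worked example of reducing multiple force vectors to a scalar, one might sum the vectors and take the magnitude of the resultant (the square root of the resultant's dot product with itself). This sketch is illustrative only; the vector representation is an assumption.

```python
import math

def resultant(forces):
    """Sum (fx, fy, fz) force vectors into a single resultant vector."""
    return tuple(sum(components) for components in zip(*forces))

def magnitude(force):
    """Scalar magnitude: square root of the vector's dot product with itself."""
    return math.sqrt(sum(c * c for c in force))

# A downward force combined with a lateral force, as in FIG. 5a:
net = resultant([(0.0, 0.0, -2.0), (1.5, 0.0, 0.0)])
print(magnitude(net))  # prints 2.5, a scalar usable for threshold comparisons
```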
  • [0061]
    According to one embodiment, one or more force sensors 214 comprised within the input device 102 are adapted to detect surface reactions to an input force. For example, as shown in FIG. 5b, if the user applies a downward force 502 to the input device 102, the force sensors 214 will detect the force pushing up against the input device 102 (i.e., a normal force 504). As shown in FIG. 5c, if a user applies a lateral force 506 to the input device 102, the force sensors 214 will detect a shear force 508 acting against the direction of the acting lateral force 506.
  • [0062]
    FIGS. 6-16 depict a plurality of input sequences useful with some embodiments of the present invention. Note that the following input sequences are exemplary in nature and provided herein for illustrative purposes. However, many other gestures and/or input sequences are possible according to embodiments of the present invention.
  • [0063]
    FIG. 6a is a set of graphs depicting a first gesture profile according to one embodiment of the present invention. The set of graphs comprises a force graph and a velocity graph each plotted against a common time axis 600. The graphs together depict a “brush” gesture that is similar to a brushing motion. The brush gesture comprises an input sequence where the input device 102 is first moved laterally across a surface 104 and then is subsequently lifted 620 above it (as depicted in FIG. 6b).
  • [0064]
    The input device 102 is initially at rest, and therefore its velocity is 0 m/s. Then, as the user applies a lateral force to the input device 102, the input device 102 attains a non-zero velocity. The motion commences at start motion 622 which is also depicted in FIG. 6 b.
  • [0065]
    As stated above, a gesture may not be triggered unless the gesture is first enabled. In the present case, the input device 102 must attain a minimum velocity Vmin 604 before the gesture is enabled. Note that certain gestures may become disabled if the condition for enablement is no longer satisfied before the condition to trigger the gesture becomes satisfied. For example, in the present embodiment, if the input device 102 suddenly stopped after achieving the minimum velocity Vmin 604, the gesture would become disabled, and subsequently lifting the stationary input device 102 off of the surface 104 would not trigger a brush gesture.
  • [0066]
    After the minimum velocity Vmin 604 has been attained (624 in FIG. 6b), the brush gesture is then enabled. Even though the brush gesture is enabled, the gesture will not become triggered until the triggering condition is satisfied. Note that the minimum velocity Vmin 604 may be implemented for any number of reasons; among others, to prevent a lift of a stationary input device 102 from inadvertently being interpreted as a brush. The inadvertent detection of a brush gesture might occur as a result of infinitesimal lateral forces exerted upon the input device 102 as an expected part of the user lifting the input device 102 off of the surface 104. Alternatively, Vmin 604 may be implemented to ensure that lateral movement reaches a certain velocity before being interpreted as a brush. For example, in one embodiment, multiple brush gestures are defined such that each gesture comprises a separate velocity threshold. An application may be set to handle “fast brushes” differently than “slow brushes.” Alternatively, the application may be set to only accept “fast brushes.” Thus, Vmin 604 may be set according to the application designer's specific preferences.
  • [0067]
    After the gesture is enabled, a sudden decrease in normal force 504 will then trigger the brush gesture. Since the input device 102 no longer maintains contact with the surface 104 as the input device 102 is lifted above the surface 104, the detected normal force 504 will decrease to zero. In some embodiments, a change in force of a certain magnitude will indicate that the brush gesture has been triggered. Conditions for determining whether the brush gesture has been triggered may be implemented in any number of ways. In one embodiment, the gesture is triggered upon detecting a discontinuity or a falling edge 614 after the latest instant in time when the gesture was enabled. In other embodiments, the actual magnitude of the detected normal force 504 determines whether the gesture has been triggered. Thus, if the present magnitude of the normal force 504 comprises a zero magnitude, the gesture will become triggered.
  • [0068]
    In still other cases, both a direction and a predetermined magnitude are used to determine whether the gesture has been triggered. For example, a falling edge 614 may determine a decrease in force, while a zero magnitude normal force satisfies the predetermined magnitude of the brush triggering condition. Thus, the gesture becomes triggered when the magnitude of the normal force vector decreases to zero (as indicated by reference numeral 626 in FIG. 6b).
  • [0069]
    FIG. 7 is a flow diagram illustrating one method of implementing the brush gesture profile depicted in FIGS. 6a and 6b. The gesture is initially disabled at step 702. At decision block 704, if the determined velocity is greater than or equal to a designated minimum value Vmin 604, the gesture becomes enabled at step 706. Otherwise, the gesture remains disabled at step 702.
  • [0070]
    If the gesture is enabled at step 706, control then passes to decision block 708. If the present velocity decreases beneath the predetermined minimum value Vmin 604, the gesture becomes disabled at step 702. Otherwise, control passes to decision block 710 where it is then determined whether the magnitude of the normal force has decreased to zero. If it has, the gesture is triggered at step 712 and the process ends. Otherwise, the process repeats at step 706.
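    The flow of FIG. 7 can be summarized as the following state machine. This is a sketch of the described method only; the sample-delivery interface (an iterable of velocity/normal-force pairs) is an assumption made for illustration.

```python
def detect_brush(samples, v_min):
    """samples yields (velocity, normal_force) pairs over the target interval.
    Returns True once the brush gesture is triggered, per FIG. 7."""
    enabled = False  # step 702: initially disabled
    for velocity, normal_force in samples:
        if not enabled:
            enabled = velocity >= v_min      # block 704 -> step 706
        elif velocity < v_min:
            enabled = False                  # block 708: disable again
        elif normal_force <= 0.0:
            return True                      # block 710 -> step 712: triggered
    return False
```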
  • [0071]
    FIG. 8a is a set of graphs depicting a second gesture profile according to one embodiment of the present invention. As in FIG. 6a, the set of graphs comprises a force graph and a velocity graph plotted against a common time axis 800. The graphs together depict a “scoop” gesture that is similar to a scooping motion. The scoop gesture comprises an input sequence where the input device 102 first moves laterally above a surface 104 and then subsequently establishes contact with the surface 104. The input sequence is depicted in FIG. 8b.
  • [0072]
    The input device 102 is initially held stationary above the surface 104, and thus has a velocity of 0 m/s. Then, as the user applies a lateral force to the input device 102, the input device 102 attains a non-zero velocity. The user's motion commences at start motion 822, which is also depicted in FIG. 8b.
  • [0073]
    As before, the input device 102 must attain a certain minimum velocity Vmin 804 in order for the gesture to become enabled. Once the gesture becomes enabled, the gesture is triggered upon contact 820 with the surface 104. In one embodiment, determining whether the scoop gesture has been triggered is accomplished by detecting a discontinuity or a rising edge 814 occurring after the latest instant in time when the gesture was enabled. In other embodiments, the actual magnitude of the detected normal force 504 determines whether the gesture has been triggered. For example, the scoop gesture may be implemented such that the gesture becomes triggered when the present magnitude of the normal force 504 comprises a magnitude greater than or equal to a designated minimum value.
  • [0074]
    In other embodiments, both a direction and a predetermined magnitude are used in combination to determine whether the gesture has been triggered. For example, a rising edge 814 may determine an increase in force, while a normal force with a magnitude greater than zero satisfies the predetermined magnitude of the scoop triggering condition. Thus, the gesture becomes triggered when the magnitude of the normal force vector rises above zero (as indicated by reference numeral 826 in FIG. 8b).
  • [0075]
    FIG. 9 is a flow diagram illustrating one method of implementing the scoop gesture profile depicted in FIGS. 8a and 8b. The gesture is initially disabled at step 902. At decision block 904, if the determined velocity is greater than or equal to a designated minimum value 804 and the magnitude of the detected normal force is zero, the gesture becomes enabled at step 906. Otherwise, the gesture remains disabled at step 902.
  • [0076]
    If the gesture is enabled at step 906, control then passes to decision block 908. If the present velocity decreases beneath the predetermined minimum value Vmin 804, the gesture becomes disabled at step 902. Otherwise, control passes to decision block 910 where it is then determined whether the magnitude of the normal force has exceeded zero. If it has, the gesture is triggered at step 912 and the process ends. Otherwise, the process repeats at step 906.
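    A corresponding sketch of the FIG. 9 flow differs from the brush only in its enabling and triggering conditions; as before, the sample-delivery interface is an assumption.

```python
def detect_scoop(samples, v_min):
    """Scoop per FIG. 9: enabled while moving above the surface (zero normal
    force), triggered when contact produces a nonzero normal force."""
    enabled = False  # step 902: initially disabled
    for velocity, normal_force in samples:
        if not enabled:
            enabled = velocity >= v_min and normal_force == 0.0  # block 904
        elif velocity < v_min:
            enabled = False                                      # block 908
        elif normal_force > 0.0:
            return True                                          # block 910 -> 912
    return False
```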
  • [0077]
    FIG. 10a is a set of graphs depicting a third gesture profile according to one embodiment of the present invention. As above, the set of graphs comprises a force graph and a velocity graph plotted against a common time axis 1000. The graphs together depict a “nudge” gesture. The nudge gesture comprises an input sequence where a stationary input device receives an input force from a user. The input sequence is depicted in FIG. 10b.
  • [0078]
    Unlike the previously described gestures, the nudge gesture is enabled when the input device 102 is stationary (i.e., its velocity is 0 m/s). Thus, a stationary input device 102 is already in a state where the gesture may become triggered upon occurrence of the triggering condition. In the present case, the triggering condition comprises the input device 102 receiving an external, non-gravitational force 1004.
  • [0079]
    The nudge gesture may be implemented in any number of ways. For example, in one embodiment, once the input device 102 receives a force 1004 comprising a magnitude greater than or equal to a designated minimum value, the gesture becomes triggered. In one embodiment, the minimum value is set to be equal to the minimum amount of force it takes to overtake opposing frictional forces (i.e. for sliding across the surface 104 to occur). Some embodiments also require the angle of the force to fall within a designated range. In one embodiment, for example, the vector angle must range from 10° to 20° when measured from a vertical axis.
  • [0080]
    In some embodiments, the nudge profile is adapted to reduce lag/dead zone from the motion sensor. The lag can be a result of energy conservation modes or limited report intervals on the input device 102. In some embodiments, non-activity places the input device 102 into a sleep mode, where certain triggers serve to “wake-up” the input device 102 in order to resume gesture detection. The sleep/wake-up function may be implemented in any number of ways. For example, in one embodiment, a period of non-activity places the motion sensors 210 into a sleep mode, while the force sensors 214 remain active. If the force sensors 214 detect a force exceeding a certain threshold, the detected force may serve to indicate a “just about to move” situation which initiates the wake-up process of the motion sensors 210. In one embodiment, motion occurring before or during the wake-up process is simply ignored.
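    The wake-on-force behavior described above might look like the following. The threshold value and the motion-sensor interface (asleep, begin_wake_up) are hypothetical names introduced for illustration only.

```python
FORCE_WAKE_THRESHOLD_N = 0.5  # assumed threshold; not specified by the text

def on_force_sample(force_n, motion_sensors):
    """While the motion sensors sleep, the always-on force sensors watch for
    a spike indicating a "just about to move" situation."""
    if motion_sensors.asleep and force_n > FORCE_WAKE_THRESHOLD_N:
        motion_sensors.begin_wake_up()  # hypothetical wake-up call
        # per one embodiment, motion before/during wake-up is simply ignored
```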
  • [0081]
    FIG. 11 is a flow diagram illustrating one method of implementing the nudge gesture profile depicted in FIGS. 10a and 10b. The gesture is initially enabled at step 1106. At decision block 1104, if the determined velocity is greater than or equal to a designated minimum velocity value, the gesture becomes disabled at step 1102. As illustrated in decision block 1108, the gesture may become enabled at a later period if the present velocity decreases beneath the minimum velocity value.
  • [0082]
    On the other hand, if the present velocity is not greater than or equal to the minimum velocity value at decision block 1104, control then passes to decision block 1110. At decision block 1110, if the presently detected force is greater than a designated minimum force value, and the vector angle is within a designated range, then the gesture is triggered at step 1112, and the process ends. Otherwise, the process resumes at step 1106.
  • [0083]
    FIG. 12 is a block diagram depicting a fourth gesture profile according to one embodiment of the present invention. The figure depicts a “tilt and drag” gesture comprising an input sequence where the input device 102 is first tilted and then slid in a direction that is perpendicular to its tilt axis.
  • [0084]
    In one embodiment, the gesture is enabled upon detecting that the input device 102 has been tilted. Tilt detection may be implemented in any number of ways. For example, in one embodiment, the gesture is enabled if there is a detected decrease in z-direction acceleration in a first region of the input device 102 along with a detected increase in z-direction acceleration in a second region of the device (i.e., in the direction of tilt). In another embodiment, the input device 102 comprises a tilt sensor adapted to sense the inclination of the input device 102. The gesture is enabled when the detected inclination falls within a certain range when measured from a reference axis. In one embodiment, for example, the range comprises 20°-60° when measured from a horizontal axis.
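    Under the tilt-sensor embodiment, the enabling check reduces to a range test on the sensed inclination. The 20°-60° range is taken from the example above; the function and constant names are illustrative assumptions.

```python
TILT_ENABLE_RANGE_DEG = (20.0, 60.0)  # example range from the embodiment above

def tilt_gesture_enabled(inclination_deg):
    """Enable the tilt-and-drag gesture when the inclination, measured from
    the horizontal axis, falls within the designated range."""
    low, high = TILT_ENABLE_RANGE_DEG
    return low <= inclination_deg <= high
```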
  • [0085]
    Once the direction of tilt and/or tilt axis has been ascertained, the gesture will become triggered if a slide 1200 occurs in a direction that is perpendicular to the tilt axis. In some embodiments, force and/or vibration signals detected by the force sensors 214 are utilized as part of the slide detection process. In one embodiment, data generated from both the force sensors 214 and the motion sensors 210 are used to detect the magnitude and direction of the slide 1200.
  • [0086]
    FIG. 13 is a block diagram depicting a fifth gesture profile according to one embodiment of the present invention. FIG. 13 depicts a “tilt and drag” sequence that is very similar to the gesture already described in FIG. 12. However, instead of the slide 1300 occurring in a direction that is perpendicular to the tilt axis, as shown in FIG. 12, the slide 1300 in FIG. 13 occurs in a direction that is parallel with the tilt axis. In some embodiments, the tilt and drag gesture is implemented in the same manner as already described with respect to FIG. 12 while taking into account directional differences.
  • [0087]
    FIG. 14 is a flow diagram illustrating one method of implementing the tilt and drag gesture profiles depicted in FIGS. 12 and 13. The gesture is initially disabled at step 1402. At decision block 1404, if tilt has been detected, the gesture becomes enabled at step 1406. Otherwise, the gesture remains disabled at step 1402.
  • [0088]
    If the gesture is enabled at step 1406, control then passes to decision block 1408. If tilt has been released, the gesture becomes disabled at step 1402. Otherwise, control passes to decision block 1410 where it is then determined whether a slide has been detected. If a slide has been detected, the direction of the slide is then determined, and the corresponding gesture is triggered at step 1412. Otherwise, the process repeats at step 1406.
  • [0089]
    FIGS. 15a and 15b are block diagrams depicting a sixth gesture profile according to one embodiment of the present invention. These figures depict a “tilt and tap” gesture comprising an input sequence where the input device 102 is first tilted and then tapped against a surface 104.
  • [0090]
    FIG. 15a is a block diagram depicting a method of enabling the tilt and tap gesture according to one embodiment of the present invention. In some embodiments, the gesture is enabled in the same manner as described with respect to FIGS. 12-14. In one embodiment, the gesture is enabled if there is a detected decrease in z-direction acceleration in a first region of the input device 102 along with a detected increase in z-direction acceleration in a second region of the device (i.e., in the direction of tilt). In another embodiment, the input device 102 comprises a tilt sensor adapted to sense the inclination of the input device 102. The gesture is enabled when the detected inclination falls within a certain range when measured from a reference axis. In one embodiment, the range comprises 20°-60° when measured from a horizontal axis.
  • [0091]
    As shown in FIG. 15b, once the direction of tilt and/or tilt axis has been ascertained, the gesture will become triggered if a tap is subsequently detected. Tap detection may be implemented in any number of ways. For example, according to one embodiment, an impulse in acceleration corresponding to the first region of the device and/or one or more force signals indicate motion of the input device 102 followed by contact with a surface 104 (i.e., a single tap). Note also that in some embodiments, additional taps trigger separate gestures or map to separate gesture profiles (e.g., double-taps and triple-taps). In one embodiment, the tilt must be released before an additional tilt and tap sequence may be generated. In another embodiment, a temporal threshold indicates how much time is allowed for a subsequent tap to be received after a prior tap. If the temporal threshold is exceeded, any subsequent tap will indicate a single tap.
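    The temporal-threshold rule for distinguishing single taps from multi-taps might be realized as follows. The 0.3 s window is an assumed value, since the text leaves the threshold unspecified.

```python
DOUBLE_TAP_WINDOW_S = 0.3  # assumed temporal threshold (not given in the text)

def classify_tap(tap_time_s, prev_tap_time_s):
    """Return 'double' if this tap follows the prior tap within the temporal
    threshold; otherwise it counts as a fresh single tap."""
    if prev_tap_time_s is not None and tap_time_s - prev_tap_time_s <= DOUBLE_TAP_WINDOW_S:
        return "double"
    return "single"
```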
  • [0092]
    FIG. 16 is a flow diagram illustrating one method of implementing the tilt and tap gesture profile depicted in FIGS. 15a and 15b. The gesture is initially disabled at step 1602. At decision block 1604, if tilt has been detected, the gesture becomes enabled at step 1606. Otherwise, the gesture remains disabled at step 1602.
  • [0093]
    If the gesture is enabled at step 1606, control then passes to decision block 1608. If tilt has been released, the gesture becomes disabled at step 1602. Otherwise, control passes to decision block 1610 where it is then determined whether a tap has been detected. If a tap has been detected, the gesture is triggered at step 1612. Otherwise, the process repeats at step 1606.
  • [0094]
    Although the present invention has been fully described in connection with embodiments thereof with reference to the accompanying drawings, it is to be noted that various changes and modifications will become apparent to those skilled in the art. Such changes and modifications are to be understood as being included within the scope of the present invention as defined by the appended claims.
  • [0095]
    Terms and phrases used in this document, and variations thereof, unless otherwise expressly stated, should be construed as open ended as opposed to limiting. As examples of the foregoing: the term “including” should be read to mean “including, without limitation” or the like; the term “example” is used to provide exemplary instances of the item in discussion, not an exhaustive or limiting list thereof; and adjectives such as “conventional,” “traditional,” “normal,” “standard,” “known” and terms of similar meaning should not be construed as limiting the item described to a given time period or to an item available as of a given time, but instead should be read to encompass conventional, traditional, normal, or standard technologies that may be available or known now or at any time in the future. Likewise, a group of items linked with the conjunction “and” should not be read as requiring that each and every one of those items be present in the grouping, but rather should be read as “and/or” unless expressly stated otherwise. Similarly, a group of items linked with the conjunction “or” should not be read as requiring mutual exclusivity among that group, but rather should also be read as “and/or” unless expressly stated otherwise. Furthermore, although items, elements or components of the disclosure may be described or claimed in the singular, the plural is contemplated to be within the scope thereof unless limitation to the singular is explicitly stated. The presence of broadening words and phrases such as “one or more,” “at least,” “but not limited to” or other like phrases in some instances shall not be read to mean that the narrower case is intended or required in instances where such broadening phrases may be absent.

Claims (1)

  1. An input system comprising:
    a force sensor;
    a velocity sensor; and
    a processor in communication with the force and velocity sensors and operable to determine whether at least one of a plurality of gestures has been performed using an input device.
US13447909 2008-07-18 2012-04-16 Methods and apparatus for processing combinations of kinematical inputs Abandoned US20120326965A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US12176148 US8159455B2 (en) 2008-07-18 2008-07-18 Methods and apparatus for processing combinations of kinematical inputs
US13447909 US20120326965A1 (en) 2008-07-18 2012-04-16 Methods and apparatus for processing combinations of kinematical inputs

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US13447909 US20120326965A1 (en) 2008-07-18 2012-04-16 Methods and apparatus for processing combinations of kinematical inputs

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US12176148 Continuation US8159455B2 (en) 2008-07-18 2008-07-18 Methods and apparatus for processing combinations of kinematical inputs

Publications (1)

Publication Number Publication Date
US20120326965A1 (en) 2012-12-27

Family

ID=41529901

Family Applications (2)

Application Number Title Priority Date Filing Date
US12176148 Active 2030-12-12 US8159455B2 (en) 2008-07-18 2008-07-18 Methods and apparatus for processing combinations of kinematical inputs
US13447909 Abandoned US20120326965A1 (en) 2008-07-18 2012-04-16 Methods and apparatus for processing combinations of kinematical inputs

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US12176148 Active 2030-12-12 US8159455B2 (en) 2008-07-18 2008-07-18 Methods and apparatus for processing combinations of kinematical inputs

Country Status (1)

Country Link
US (2) US8159455B2 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100287513A1 (en) * 2009-05-05 2010-11-11 Microsoft Corporation Multi-device gesture interactivity
US9329750B2 (en) 2013-09-10 2016-05-03 Google Inc. Three-dimensional tilt and pan navigation using a single gesture

Families Citing this family (51)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8018440B2 (en) 2005-12-30 2011-09-13 Microsoft Corporation Unintentional touch rejection
US8111243B2 (en) * 2006-03-30 2012-02-07 Cypress Semiconductor Corporation Apparatus and method for recognizing a tap gesture on a touch sensing device
JP5079565B2 (en) * 2008-03-27 2012-11-21 株式会社ディーアンドエムホールディングス Reproducing apparatus and method
US9851813B2 (en) * 2008-09-18 2017-12-26 Apple Inc. Force sensing for fine tracking control of mouse cursor
US9041650B2 (en) * 2008-09-18 2015-05-26 Apple Inc. Using measurement of lateral force for a tracking input device
US9639187B2 (en) 2008-09-22 2017-05-02 Apple Inc. Using vibration to determine the motion of an input device
US8836648B2 (en) * 2009-05-27 2014-09-16 Microsoft Corporation Touch pull-in gesture
US8239785B2 (en) * 2010-01-27 2012-08-07 Microsoft Corporation Edge gestures
US9411504B2 (en) * 2010-01-28 2016-08-09 Microsoft Technology Licensing, Llc Copy and staple gestures
US20110185320A1 (en) * 2010-01-28 2011-07-28 Microsoft Corporation Cross-reference Gestures
US8261213B2 (en) * 2010-01-28 2012-09-04 Microsoft Corporation Brush, carbon-copy, and fill gestures
US20110185299A1 (en) * 2010-01-28 2011-07-28 Microsoft Corporation Stamp Gestures
US20110191704A1 (en) * 2010-02-04 2011-08-04 Microsoft Corporation Contextual multiplexing gestures
US9519356B2 (en) * 2010-02-04 2016-12-13 Microsoft Technology Licensing, Llc Link gestures
US20110191719A1 (en) * 2010-02-04 2011-08-04 Microsoft Corporation Cut, Punch-Out, and Rip Gestures
US20110209088A1 (en) * 2010-02-19 2011-08-25 Microsoft Corporation Multi-Finger Gestures
US9367205B2 (en) 2010-02-19 2016-06-14 Microsoft Technology Licensing, Llc Radial menus with bezel gestures
US8799827B2 (en) * 2010-02-19 2014-08-05 Microsoft Corporation Page manipulations using on and off-screen gestures
US9310994B2 (en) 2010-02-19 2016-04-12 Microsoft Technology Licensing, Llc Use of bezel as an input mechanism
US9274682B2 (en) * 2010-02-19 2016-03-01 Microsoft Technology Licensing, Llc Off-screen gestures to create on-screen input
US8539384B2 (en) 2010-02-25 2013-09-17 Microsoft Corporation Multi-screen pinch and expand gestures
US9454304B2 (en) * 2010-02-25 2016-09-27 Microsoft Technology Licensing, Llc Multi-screen dual tap gesture
US8751970B2 (en) * 2010-02-25 2014-06-10 Microsoft Corporation Multi-screen synchronous slide gesture
US20110209058A1 (en) * 2010-02-25 2011-08-25 Microsoft Corporation Multi-screen hold and tap gesture
US8473870B2 (en) 2010-02-25 2013-06-25 Microsoft Corporation Multi-screen hold and drag gesture
US9075522B2 (en) * 2010-02-25 2015-07-07 Microsoft Technology Licensing, Llc Multi-screen bookmark hold gesture
US8707174B2 (en) * 2010-02-25 2014-04-22 Microsoft Corporation Multi-screen hold and page-flip gesture
US20110209101A1 (en) * 2010-02-25 2011-08-25 Hinckley Kenneth P Multi-screen pinch-to-pocket gesture
US20110209089A1 (en) * 2010-02-25 2011-08-25 Hinckley Kenneth P Multi-screen object-hold and page-change gesture
US20110216004A1 (en) * 2010-03-08 2011-09-08 David Stephenson Tilt and position command system for input peripherals
US8267788B2 (en) * 2010-04-13 2012-09-18 Kulas Charles J Gamepiece controller using a movable position-sensing display device including a movement currency mode of movement
US8123614B2 (en) * 2010-04-13 2012-02-28 Kulas Charles J Gamepiece controller using a movable position-sensing display device including a movement currency mode of movement
US9164670B2 (en) * 2010-09-15 2015-10-20 Microsoft Technology Licensing, Llc Flexible touch-based scrolling
CN103270522B (en) * 2010-12-17 2018-01-26 皇家飞利浦电子股份有限公司 Posture control for monitoring vital signs
US20120159395A1 (en) 2010-12-20 2012-06-21 Microsoft Corporation Application-launching interface for multiple modes
US8612874B2 (en) 2010-12-23 2013-12-17 Microsoft Corporation Presenting an application change through a tile
EP2505959A1 (en) * 2011-03-28 2012-10-03 Renishaw plc Coordinate positioning machine controller
US8315822B2 (en) * 2011-04-20 2012-11-20 Bertec Corporation Force measurement system having inertial compensation
US8315823B2 (en) 2011-04-20 2012-11-20 Bertec Corporation Force and/or motion measurement system having inertial compensation and method thereof
US9158445B2 (en) 2011-05-27 2015-10-13 Microsoft Technology Licensing, Llc Managing an immersive interface in a multi-application immersive environment
US9658766B2 (en) 2011-05-27 2017-05-23 Microsoft Technology Licensing, Llc Edge gesture
US9104440B2 (en) 2011-05-27 2015-08-11 Microsoft Technology Licensing, Llc Multi-application environment
US9104307B2 (en) 2011-05-27 2015-08-11 Microsoft Technology Licensing, Llc Multi-application environment
US20130147850A1 (en) * 2011-12-08 2013-06-13 Motorola Solutions, Inc. Method and device for force sensing gesture recognition
US8970519B2 (en) * 2012-02-01 2015-03-03 Logitech Europe S.A. System and method for spurious signal detection and compensation on an input device
KR20130113285A (en) * 2012-04-05 2013-10-15 엘지전자 주식회사 Mobile terminal and control method thereof
WO2014009933A1 (en) * 2012-07-12 2014-01-16 Grant Neville Odgers Improvements in devices for use with computers
US9582122B2 (en) 2012-11-12 2017-02-28 Microsoft Technology Licensing, Llc Touch-sensitive bezel techniques
CN103941847A (en) * 2013-01-21 2014-07-23 深圳富泰宏精密工业有限公司 System and method for unlocking screen
US9477337B2 (en) 2014-03-14 2016-10-25 Microsoft Technology Licensing, Llc Conductive trace routing for display and bezel sensors
US20170115867A1 (en) * 2015-10-27 2017-04-27 Yahoo! Inc. Method and system for interacting with a touch screen

Family Cites Families (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5543591A (en) * 1992-06-08 1996-08-06 Synaptics, Incorporated Object position detector with edge motion feature and gesture recognition
US5734373A (en) * 1993-07-16 1998-03-31 Immersion Human Interface Corporation Method and apparatus for controlling force feedback interface systems utilizing a host computer
US6281879B1 (en) * 1994-06-16 2001-08-28 Microsoft Corporation Timing and velocity control for displaying graphical information
WO1996011434A1 (en) * 1994-10-07 1996-04-18 Interlink Electronics, Inc. Isometric pointing device with integrated click and method therefor
JPH0954653A (en) * 1995-08-10 1997-02-25 Csk Corp Pen type pointing device
US20020036617A1 (en) * 1998-08-21 2002-03-28 Timothy R. Pryor Novel man machine interfaces and applications
US7096454B2 (en) * 2000-03-30 2006-08-22 Tyrsted Management Aps Method for gesture based modeling
US6649905B2 (en) * 2001-03-26 2003-11-18 Aaron E. Grenlund Accelerometer and devices using the same
US8684839B2 (en) * 2004-06-18 2014-04-01 Igt Control of wager-based game using gesture recognition
JP2006221297A (en) * 2005-02-09 2006-08-24 Sharp Corp Operation device and operation system
US7656393B2 (en) * 2005-03-04 2010-02-02 Apple Inc. Electronic device having display and surrounding touch sensitive bezel for user interface and control
JP2007018274A (en) * 2005-07-07 2007-01-25 Sharp Corp Operation unit and operation system
US8781151B2 (en) * 2006-09-28 2014-07-15 Sony Computer Entertainment Inc. Object detection using video input combined with tilt angle information

Patent Citations (83)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5159321A (en) * 1989-08-18 1992-10-27 Matsushita Electric Industrial Co., Ltd. Pen-type computer input device
US5181181A (en) * 1990-09-27 1993-01-19 Triton Technologies, Inc. Computer apparatus input device for three-dimensional information
US7136710B1 (en) * 1991-12-23 2006-11-14 Hoffberg Steven M Ergonomic man-machine interface incorporating adaptive pattern recognition based control system
US5294792A (en) * 1991-12-31 1994-03-15 Texas Instruments Incorporated Writing tip position sensing and processing apparatus
US6028271A (en) * 1992-06-08 2000-02-22 Synaptics, Inc. Object position detector with edge motion feature and gesture recognition
US20080048997A1 (en) * 1992-06-08 2008-02-28 Synaptics Incorporated Object position detector with edge motion feature and gesture recognition
US6414671B1 (en) * 1992-06-08 2002-07-02 Synaptics Incorporated Object position detector with edge motion feature and gesture recognition
US20030112228A1 (en) * 1992-06-08 2003-06-19 Gillespie David W. Object position detector with edge motion feature and gesture recognition
US5554980A (en) * 1993-03-12 1996-09-10 Mitsubishi Denki Kabushiki Kaisha Remote control system
US5776585A (en) * 1995-09-26 1998-07-07 Narumi China Corporation Mouse pad
US7791590B1 (en) * 1995-10-06 2010-09-07 Avago Technologies Ecbu Ip (Singapore) Pte. Ltd. Optical mouse with uniform level detection
US5790102A (en) * 1996-03-28 1998-08-04 Nassimi; Shary Pressure sensitive computer mouse
US5990869A (en) * 1996-08-20 1999-11-23 Alliance Technologies Corp. Force feedback mouse
US6115028A (en) * 1996-08-22 2000-09-05 Silicon Graphics, Inc. Three dimensional input system using tilt
US5825308A (en) * 1996-11-26 1998-10-20 Immersion Human Interface Corporation Force feedback interface having isotonic and isometric functionality
US5995084A (en) * 1997-01-17 1999-11-30 Tritech Microelectronics, Ltd. Touchpad pen-input and mouse controller
US6243075B1 (en) * 1997-08-29 2001-06-05 Xerox Corporation Graspable device manipulation for controlling a computer display
US20060238519A1 (en) * 1998-01-26 2006-10-26 Fingerworks, Inc. User interface gestures
US20080041639A1 (en) * 1998-01-26 2008-02-21 Apple Inc. Contact tracking and identification module for touch sensing
US20020015024A1 (en) * 1998-01-26 2002-02-07 University Of Delaware Method and apparatus for integrating manual input
US20080036743A1 (en) * 1998-01-26 2008-02-14 Apple Computer, Inc. Gesturing with a multipoint sensing device
US6496179B1 (en) * 1998-05-09 2002-12-17 Multidesign Limited Moving position detector
US20050180618A1 (en) * 1999-02-10 2005-08-18 Black Gerald R. Method for identity verification
US6424338B1 (en) * 1999-09-30 2002-07-23 Gateway, Inc. Speed zone touchpad
US7030860B1 (en) * 1999-10-08 2006-04-18 Synaptics Incorporated Flexible transparent touch sensing system for electronic devices
US20030156098A1 (en) * 1999-11-04 2003-08-21 Synaptics, Inc. Capacitive mouse
US20060038783A1 (en) * 1999-11-04 2006-02-23 Shaw Scott J Capacitive mouse
US6844871B1 (en) * 1999-11-05 2005-01-18 Microsoft Corporation Method and apparatus for computer input using six degrees of freedom
US20080062143A1 (en) * 2000-01-19 2008-03-13 Immersion Corporation Haptic interface for touch screen embodiments
US6489948B1 (en) * 2000-04-20 2002-12-03 Benny Chi Wah Lau Computer mouse having multiple cursor positioning inputs and method of operation
US20030137494A1 (en) * 2000-05-01 2003-07-24 Tulbert David J. Human-machine interface
US20110148794A1 (en) * 2001-05-04 2011-06-23 Immersion Corporation Haptic Interface for Palpation Simulation
US20060038778A1 (en) * 2001-09-13 2006-02-23 E-Book Systems Pte Ltd Electromechanical information browsing device
US20040017355A1 (en) * 2002-07-24 2004-01-29 Youngtack Shim Cursor control systems and methods
US20040061678A1 (en) * 2002-09-30 2004-04-01 Goh Chun B. High resolution input detection
US20040066371A1 (en) * 2002-10-02 2004-04-08 Huang Mark Po-Shaw Mouse device and method with the wireless transmission function
US20040130532A1 (en) * 2003-01-07 2004-07-08 Gordon Gary B. Apparatus for controlling a screen pointer with a frame rate based on velocity
US20040140962A1 (en) * 2003-01-21 2004-07-22 Microsoft Corporation Inertial sensors integration
US20080048979A1 (en) * 2003-07-09 2008-02-28 Xolan Enterprises Inc. Optical Method and Device for use in Communication
US7154477B1 (en) * 2003-09-03 2006-12-26 Apple Computer, Inc. Hybrid low power computer mouse
US20050195387A1 (en) * 2004-03-08 2005-09-08 Zhang Guanghua G. Apparatus and method for determining orientation parameters of an elongate object
US20050212751A1 (en) * 2004-03-23 2005-09-29 Marvit David L Customizable gesture mappings for motion controlled handheld devices
US20050233859A1 (en) * 2004-04-05 2005-10-20 Motoyuki Takai Electronic apparatus, input device, and input method
US20080291163A1 (en) * 2004-04-30 2008-11-27 Hillcrest Laboratories, Inc. 3D Pointing Devices with Orientation Compensation and Improved Usability
US20050280636A1 (en) * 2004-06-04 2005-12-22 Polyvision Corporation Interactive communication systems
US20070132733A1 (en) * 2004-06-08 2007-06-14 Pranil Ram Computer Apparatus with added functionality
US20060007151A1 (en) * 2004-06-08 2006-01-12 Pranil Ram Computer Apparatus with added functionality
US20070259717A1 (en) * 2004-06-18 2007-11-08 Igt Gesture controlled casino gaming system
US20060007124A1 (en) * 2004-06-28 2006-01-12 Microsoft Corporation Disposing identifying codes on a user's hand to provide input to an interactive display application
US20060044260A1 (en) * 2004-08-30 2006-03-02 Jonah Harley Puck-based input device with rotation detection
US20060066582A1 (en) * 2004-09-24 2006-03-30 Apple Computer, Inc. Raw data track pad device and system
US20060178212A1 (en) * 2004-11-23 2006-08-10 Hillcrest Laboratories, Inc. Semantic gaming and application transformation
US20060250353A1 (en) * 2005-05-09 2006-11-09 Taizo Yasutake Multidimensional input device
US20060262545A1 (en) * 2005-05-23 2006-11-23 Color Kinetics Incorporated Led-based light-generating modules for socket engagement, and methods of assembling, installing and removing same
US20060267933A1 (en) * 2005-05-25 2006-11-30 Li Chong Tai Eliminating mechanical spring with magnetic forces
US20060274042A1 (en) * 2005-06-03 2006-12-07 Apple Computer, Inc. Mouse with improved input mechanisms
US20100029242A1 (en) * 2005-11-10 2010-02-04 Research In Motion Limited System and method for activating an electronic device
US20070146325A1 (en) * 2005-12-27 2007-06-28 Timothy Poston Computer input device enabling three degrees of freedom and related input and feedback methods
US20070152966A1 (en) * 2005-12-30 2007-07-05 Apple Computer, Inc. Mouse with optical sensing surface
US20070176896A1 (en) * 2006-01-31 2007-08-02 Hillcrest Laboratories, Inc. 3D Pointing devices with keyboards
US20070176906A1 (en) * 2006-02-01 2007-08-02 Synaptics Incorporated Proximity sensor and method for indicating extended interface results
US20070216648A1 (en) * 2006-03-16 2007-09-20 Benq Corporation Mouse
US20070242042A1 (en) * 2006-04-12 2007-10-18 Honeywell International Inc. Free standing joystick
US20070257887A1 (en) * 2006-05-04 2007-11-08 Sunplus Technology Co., Ltd. Apparatus and method for cursor control
US20070291009A1 (en) * 2006-06-19 2007-12-20 Cypress Semiconductor Corporation Apparatus and method for detecting a touch-sensor pad gesture
US20080154573A1 (en) * 2006-10-02 2008-06-26 Microsoft Corporation Simulating new input devices using old input devices
US20080106523A1 (en) * 2006-11-07 2008-05-08 Conrad Richard H Ergonomic lift-clicking method and apparatus for actuating home switches on computer input devices
US20080134784A1 (en) * 2006-12-12 2008-06-12 Industrial Technology Research Institute Inertial input apparatus with six-axial detection ability and the operating method thereof
US20080170046A1 (en) * 2007-01-16 2008-07-17 N-Trig Ltd. System and method for calibration of a capacitive touch digitizer system
US8125445B1 (en) * 2007-01-26 2012-02-28 Cypress Semiconductor Corporation Horizontal capacitively sensed pointing device
US20080231595A1 (en) * 2007-03-20 2008-09-25 At&T Knowledge Ventures, Lp Remote control apparatus and method of interacting with a multimedia timeline user interface
US20080254822A1 (en) * 2007-04-12 2008-10-16 Patrick Tilley Method and System for Correlating User/Device Activity with Spatial Orientation Sensors
US20120170758A1 (en) * 2007-04-13 2012-07-05 Apple Inc. Multi-channel sound panner
US20080266257A1 (en) * 2007-04-24 2008-10-30 Kuo-Ching Chiang User motion detection mouse for electronic device
US20080278447A1 (en) * 2007-05-08 2008-11-13 Ming-Yen Lin Three-dimensional mouse apparatus
US20100021022A1 (en) * 2008-02-25 2010-01-28 Arkady Pittel Electronic Handwriting
US8526767B2 (en) * 2008-05-01 2013-09-03 Atmel Corporation Gesture recognition
US20100026623A1 (en) * 2008-07-30 2010-02-04 Apple Inc. Velocity stabilization for accelerometer based input devices
US20100039394A1 (en) * 2008-08-15 2010-02-18 Apple Inc. Hybrid inertial and touch sensing input device
US20100084203A1 (en) * 2008-10-03 2010-04-08 Inventec Appliances Corp. Electric pen
US20100124949A1 (en) * 2008-11-14 2010-05-20 Sony Ericsson Mobile Communications Ab Portable communication device and remote motion input device
US20120262407A1 (en) * 2010-12-17 2012-10-18 Microsoft Corporation Touch and stylus discrimination and rejection for contact sensitive computing devices
US20130194185A1 (en) * 2012-02-01 2013-08-01 Logitech Europe S.A. Multi-sensor input device

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100287513A1 (en) * 2009-05-05 2010-11-11 Microsoft Corporation Multi-device gesture interactivity
US9329750B2 (en) 2013-09-10 2016-05-03 Google Inc. Three-dimensional tilt and pan navigation using a single gesture

Also Published As

Publication number Publication date Type
US20100013768A1 (en) 2010-01-21 application
US8159455B2 (en) 2012-04-17 grant

Similar Documents

Publication Publication Date Title
US20060033701A1 (en) Systems and methods using computer vision and capacitive sensing for cursor control
US20110050629A1 (en) Information processing apparatus, information processing method and program
US20090051660A1 (en) Proximity sensor device and method with activation confirmation
US20110102455A1 (en) Scrolling and zooming of a portable device display with device motion
US20090167723A1 (en) Input devices
US20070176906A1 (en) Proximity sensor and method for indicating extended interface results
US20070180409A1 (en) Apparatus and method for controlling speed of moving between menu list items
US20140176495A1 (en) Active stylus for touch sensing applications
US20140313130A1 (en) Display control device, display control method, and computer program
US20130154948A1 (en) Force sensing input device and method for determining force information
US20100225601A1 (en) Information processing apparatus, information processing method and information processing program
Kim et al. 3-d hand motion tracking and gesture recognition using a data glove
US20100315439A1 (en) Using motion detection to process pan and zoom functions on mobile computing devices
US20120169482A1 (en) System and Method for Selecting a Device for Remote Control Based on Determined Navigational State of a Remote Control Device
US20120249470A1 (en) Electronic device and control method
US20080284756A1 (en) Method and device for handling large input mechanisms in touch screens
US20140092031A1 (en) System and method for low power input object detection and interaction
WO2006003586A2 (en) Zooming in 3-d touch interaction
US20160132139A1 (en) System and Methods for Controlling a Cursor Based on Finger Pressure and Direction
US20090153466A1 (en) Method and System for Optimizing Scrolling and Selection Activity
US8120586B2 (en) Electronic devices with touch-sensitive navigational mechanisms, and associated methods
US20110080430A1 (en) Information Processing Apparatus, Information Processing Method, and Information Processing Program
US20110022990A1 (en) Method for operation to a multi-touch environment screen by using a touchpad
US20110205175A1 (en) Method and device for determining rotation gesture
US20080303697A1 (en) Input apparatus, control apparatus, control system, control method, and program therefor

Legal Events

Date Code Title Description
AS Assignment
Owner name: APPLE INC., CALIFORNIA
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:LEUNG, OMAR S.;REEL/FRAME:028889/0593
Effective date: 20080718