US20100066670A1 - Force sensing for fine tracking control of mouse cursor - Google Patents
- Publication number
- US20100066670A1 (application US 12/233,502)
- Authority
- US
- United States
- Prior art keywords
- input device
- lateral force
- motion
- force
- navigational object
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0354—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
- G06F3/03543—Mice or pucks
Definitions
- the invention relates to input devices for computing systems, and more particularly, to methods and apparatuses for fine control of navigational objects, such as cursors.
- An input device can be manipulated by a user to generate input data in a computer system.
- an input device is positioned on a surface and moved relative to that surface, but other forms of input devices operating in different fashions are also available.
- the operations performed on an input device generally correspond to moving a navigational object (e.g., a cursor) and/or making selections on a display screen.
- a “mouse” is a common type of input device that functions as a pointing device for a computer by detecting motion imparted by a user.
- the mouse's motion is typically translated into motion of a navigational object (e.g., cursor) on a graphical user interface (GUI) provided on a display screen.
- a mouse generally comprises a small case, held in a user's hand, with one or more input buttons. Additionally, a mouse may have other elements, such as a scroll wheel, that allow a user to perform enhanced operations.
- a user may perform a large, coarse motion with the input device to move a navigational object from one side of a graphical display to another side of the graphical display.
- a user may perform a small, fine motion with the input device to move the navigational object a relatively small number of pixels (e.g., 1 to 100 pixels).
- a user may want to move a navigational object a relatively small number of pixels when homing in on a small target area, such as a space between two adjacent characters in a text file.
- many conventional mouse-type input devices slip or jerk when the user attempts to move the input device in small increments due to static/kinetic friction transitions. This can cause unstable control of the navigational object.
- fine control of a navigational object using an electronic input device can be a difficult challenge when the input device is also used for coarse cursor control.
- a system in accordance with one embodiment includes a motion-based input device having a force detection module operable to detect a lateral force applied to the input device.
- the system further includes a processor coupled to the force detection module.
- the processor is operable to generate navigational object movement information based on the detected lateral force.
- the navigational object movement information may include moving the navigational object in relatively small increments or in relatively large increments.
- another embodiment provides a computer readable medium capable of storing computer executable instructions.
- the computer readable medium includes code for detecting a lateral force applied to the input device and detecting a motion of the input device relative to a surface.
- the computer readable medium includes code for calculating a change in position of the navigational object based on a detected lateral force contribution and a detected motion contribution.
- a method for controlling movement of a navigational object displayed on a user graphical interface includes measuring a lateral force applied to an input device and estimating a change in magnitude of the applied lateral force. The method also generates a control signal based on the estimated change in magnitude of the applied lateral force, wherein the control signal is indicative of a change in position of a navigational object on a graphical display.
- FIG. 1 illustrates an exemplary computing system using an input device according to various embodiments of this invention.
- FIG. 2 is a system diagram of a modular arrangement of an input device according to various embodiments of this invention.
- FIG. 3 is a block diagram of forces exerted on an input device according to various embodiments of this invention.
- FIGS. 4A-4D are graphs of force and motion parameters of an input device moving from a first point to a second point according to various embodiments of this invention.
- FIG. 5 is a flow diagram illustrating a process of controlling movement of a graphical object according to various embodiments of this invention.
- FIG. 6 is a flow diagram illustrating another process of controlling movement of a graphical object according to various embodiments of this invention.
- a motion-based input device can include one or more force sensors capable of detecting forces acting upon the input device and generating a signal representative of the detected forces. A system can then initiate (e.g., trigger) one or more events based on the signal.
- event can refer to any function or process performed by a computer system in response to user input.
- An event need not be a function traditionally initiated by a user using a motion-based input device, but can also include functions initiated by a user using other types of input devices, including keyboards, touch pads, switches, buttons, dials or any other electrical, mechanical and/or optical mechanism that generates input data in response to a user input.
- a few non-limiting examples of an event can include moving a cursor displayed on a graphical user interface, making a selection indication (e.g., similar to depressing a selection button on a mouse, for example), changing a mode of operation, turning volume up or down, changing channels, paging back and forth in a software application, initiating a startup or wakeup sequence of a receiving device or the input device, and increasing a data collection rate to decrease lag.
- motion-based input device can refer to an input device that detects multi-dimensional motion of the input device relative to a surface.
- Motion-based input devices can utilize a variety of sensors for detecting movement of the input device relative to a surface and generate an input signal indicating information pertaining to the detected movement.
- Non-limiting examples of motion-based input devices include electro-mechanical mice (also known as “ball mice”), and optical mice.
- FIG. 1 illustrates a typical environment or system 100 in which a motion-based input device 102 in accordance with one embodiment may be used.
- the input device 102 can be positioned upon a surface 104 such as a desk or a tabletop.
- a user can move the input device 102 relative to the surface 104 to generate output signals indicative of the movement of the input device.
- the surface 104 is depicted as being flat or substantially flat. However, this is not strictly necessary according to other embodiments. Also note that the surface 104 need not necessarily be situated beneath the input device 102 . For example, the surface 104 may be tilted, situated above the input device 102 , or vertically oriented. Also note that multiple surfaces 104 can be utilized.
- a receiving device 106 can be adapted to receive input signals generated by the input device 102 .
- the terms “receiving device” and “receiver” include without limitation video game consoles, set-top boxes, televisions, personal computers (whether desktop, laptop, or otherwise), digital video recorders, communications equipment, terminals, and display devices.
- the receiving device 106 can comprise at least one interface adapted to receive the input signals transmitted from the input device 102 .
- the input device 102 can be physically coupled to the receiving device via one or more communication links (such as via a serial bus cable), or the input device 102 can be adapted to wirelessly communicate with the receiving device 106 .
- a display device 108 in communication with the receiving device 106 can be adapted to display a navigational object upon its display screen (for example, a pointer, cursor, selector box, or other such indicator).
- display device can include any type of device adapted to display information, including without limitation cathode ray tube displays (CRTs), liquid crystal displays (LCDs), thin film transistor displays (TFTs), digital light processor displays (DLPs), plasma displays, light emitting diodes (LEDs) or diode arrays, incandescent devices, and fluorescent devices.
- Display devices may also include less dynamic devices such as printers, e-ink devices, and other similar structures.
- FIG. 2 is a system diagram of a modular arrangement of the input device 102 according to one embodiment of the present invention.
- the input device 102 includes a printed circuit board 204 comprising electrical leads that enable various modules to communicate with other coupled modules.
- a power supply 206 provides a source of power to modules electrically coupled to the printed circuit board 204 .
- power is supplied externally from one or more conductive wires, for example, through the use of a power cable or a serial bus cable.
- a battery may be used as a source of power.
- a memory 212 comprises any type of module adapted to enable digital information to be stored, retained, and retrieved. Additionally, the memory 212 may comprise any combination of volatile and non-volatile storage devices, including without limitation RAM, DRAM, SRAM, ROM, and/or flash memory. Note also that the memory 212 may be organized in any number of architectural configurations by the use of registers, memory caches, data buffers, main memory, mass storage, and/or removable media, for example.
- processors 208 can be adapted to execute sequences of instructions by loading and storing data to the memory 212 .
- Possible instructions include, without limitation, instructions for data conversions, formatting operations, communication instructions, and/or storage and retrieval operations.
- the processors 208 may comprise any type of digital processing devices including, for example, reduced instruction set computer processors, general-purpose processors, microprocessors, digital signal processors, gate arrays, programmable logic devices, reconfigurable compute fabrics, array processors, and/or application-specific integrated circuits. Note also that the processors 208 may be contained on a single unitary integrated circuit (IC) die or distributed across multiple components.
- Interface module 216 enables data to be transmitted and/or received over one or more communication networks.
- the data can be transmitted or received wirelessly or through the use of wires.
- data transmitted to a receiving device is first packetized and processed according to one or more standardized network protocols.
- the interface module 216 comprises a plurality of network layers such that each layer provides services to the layer above it and receives services from the layer below it.
- the interface module 216 may accommodate any wired or wireless protocol including, without limitation, USB, FireWire, Ethernet, Gigabit Ethernet, MoCA, radio frequency tuners, modems, WiFi, Bluetooth, WiMax, and/or Infrared Data Association.
- a motion detection module 220 comprises sensors and logic adapted to detect and/or measure motion parameters, such as acceleration, speed, velocity and/or position of the input device 102 at a specific instant in time, or alternatively, over a period of time.
- the motion detection sensors can be an optical sensor, an electro-mechanical sensor, or any other sensor used in a motion-based input device capable of detecting motion of the input device 102 .
- a force detection module 222 can include sensors and logic adapted to detect forces acting upon the input device 102 during an instant in time, or alternatively, over a period of time.
- the force detection module can include one or more force detection sensors operable to detect external forces acting upon the input device 102 .
- the force detection module 222 can detect forces acting upon the input device 102 in one dimension (e.g., an x-dimension), in other embodiments the force detection module 222 can sense forces acting upon the input device 102 in two dimensions (e.g., x and y dimensions), and in further embodiments the force detection module 222 can detect forces acting upon the input device 102 in three dimensions (e.g., x, y and z dimensions).
- the input device 102 can include one or more force sensors.
- a three-component force sensor can detect the forces exerted on the input device in three dimensions (e.g., x, y and z dimensions).
- Suitable three-component force sensors include Kistler 3-Component Force Sensors, models 9167A, 9168A, 9167AB, or 9168AB, offered by Kistler North America located in Amherst, N.Y., USA.
- separate force sensors can be used to detect the forces exerted on the input device 102 .
- FIG. 3 is a block diagram indicating a force F total applied to the input device 102 positioned on surface 104 .
- the force F total may be applied to the input device 102 by a user to move the input device in a desired direction in a plane of motion.
- a “plane of motion” can be defined as an x-y plane in a Cartesian coordinate system in which a user moves the input device 102 .
- the x-y plane has an x-axis and a y-axis perpendicular to the x-axis.
- a z-axis extends perpendicularly from the x-y plane.
- the x-y plane is parallel to the surface 104 .
- the force F total applied to the input device can comprise a lateral force component and a normal force component (the normal force can also be referred to as a vertical force).
- the lateral force component further includes a first lateral force component, Fx, in a direction along the x-axis, and a second lateral force, Fy, component in a direction along the y-axis.
- the normal force component, Fz is in a direction along the z-axis.
- the direction of the lateral force is mathematically related to the lateral force components, Fx and Fy, applied to the input device 102 in the plane of motion. This relationship can be expressed as:
- |F| = √(Fx² + Fy²)  (1)
- θ = tan⁻¹(Fy/Fx)  (2)
- where |F| is the estimated magnitude of the applied lateral force and θ is its direction in the plane of motion.
- logic residing in force detection module 222 or computer 106 can estimate a total magnitude of the lateral force and corresponding directional unit vectors of the applied lateral force.
- other standard techniques known in physics may be used to calculate a scalar quantity of force from a given set of force vectors.
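As a sketch of the standard vector arithmetic referenced here (the function name is illustrative, not from the disclosure), the magnitude and direction can be recovered from the two sensed components:

```python
import math

def lateral_force(fx, fy):
    """Estimate the applied lateral force from its sensed x and y
    components: magnitude |F| = sqrt(Fx^2 + Fy^2), per expression (1),
    and direction as the angle of the force vector in the plane of
    motion, per expression (2)."""
    magnitude = math.hypot(fx, fy)
    direction = math.atan2(fy, fx)  # radians, measured from the +x axis
    return magnitude, direction
```

Using `atan2` rather than a plain arctangent keeps the direction correct in all four quadrants, including when Fx is zero.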
- the logic may be implemented as any combination of software, firmware and/or hardware.
- motion and force information can be written to a local memory source (not shown) (such as a register or local cache) before being provided as input.
- this data can be directly written and retrieved from memory 212 .
- this data can be stored in external memory (e.g. a hard drive of the computer 106 ) and the input device 102 can transmit raw data to the computer 106 for processing.
- one or more force sensors 222 can be utilized to generate force information pertaining to forces acting upon the input device 102 .
- the input device 102 can detect lateral components of forces applied to the input device 102 in two directions in a plane of motion of the input device; the first direction being substantially perpendicular to the second direction.
- Information relating to the detected lateral force components can then be used to calculate an estimated magnitude and direction of a lateral force acting upon the input device 102 , among other things.
- a system in accordance with various embodiments can then move a navigational object based on the estimated magnitude and direction, for example.
- FIGS. 4A-4D are graphs of force, acceleration, velocity and displacement, respectively, versus time of an exemplary movement of input device 102 moving in a straight line from a first point A to a second point B. Furthermore, FIGS. 4A-4D illustrate exemplary first through fifth states 401 - 405 , respectively, of the input device 102 while it moves from point A to point B. For illustrative purposes, the following description of FIGS. 4A-4D may refer to elements mentioned above in connection with FIGS. 1-3 .
- a force is applied to input device 102 , for example by a user, but the applied force does not exceed the static (maximum) frictional force between input device 102 and surface 104 . Consequently, in the first state 401 , input device 102 is not yet moving (see FIGS. 4C and 4D ) despite a force being applied to the input device 102 .
- the applied force exceeds the static frictional force, which results in movement of the input device 102 , as illustrated in FIGS. 4A and 4D , for example.
- the coefficient for static frictional force is typically larger than the coefficient for kinetic frictional force.
- the frictional force between the input device 102 and surface 104 typically decreases once the input device 102 begins sliding on surface 104 .
- the frictional force decreases at the transition between the first state 401 and the second state 402 . This transition from static friction to kinetic friction can result in unstable control of the input device 102 due to what is commonly referred to as a “stick-slip” phenomenon.
- This phenomenon is due to a user needing to apply a force that exceeds the static friction to initiate sliding on the input device 102 , which can feel like the input device 102 is “sticking” to surface 104 . But once the static frictional force is exceeded, the smaller kinetic coefficient applies to the frictional force, which can cause a slip or jerk type motion due to the reduction in frictional force. This “stick-slip” phenomenon can make it difficult for a user to control an input device 102 when moving the input device a small distance, for example.
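To illustrate the stick-slip discontinuity numerically, here is a minimal sketch assuming Coulomb friction; the coefficient values are hypothetical, not taken from the disclosure:

```python
def net_force(applied, normal, mu_s=0.6, mu_k=0.4):
    """Net lateral force on a device resting on a surface. Below the
    static-friction limit the device stays put (net force 0); just above
    it, sliding begins and the smaller kinetic coefficient applies, so
    the net force jumps discontinuously -- the 'slip' that makes fine
    positioning difficult."""
    if applied <= mu_s * normal:
        return 0.0                      # stuck: static friction cancels the push
    return applied - mu_k * normal      # sliding: only kinetic friction opposes
```

With these example coefficients, pushing with 0.60 units of force leaves the device stuck (net 0), while 0.61 units yields a net force of about 0.21 units: an abrupt jerk rather than a gentle start.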
- the applied force is equal to the frictional force. This results in no acceleration ( FIG. 4B ) and a constant velocity ( FIG. 4C ) of the input device 102 along the plane of motion.
- the magnitude of the applied force is less than the frictional force. This results in a deceleration of the input device 102 ( FIG. 4B ).
- the input device 102 is stopped.
- the applied force no longer exceeds the static frictional force ( FIG. 4A ), and there is no motion (zero velocity) ( FIG. 4C ), no acceleration ( FIG. 4B ), and no change in displacement ( FIG. 4D ).
- the transition between the fourth state 404 and the fifth state 405 can also result in unstable movement of the input device 102 due to the transition from kinetic friction to static friction.
- a system incorporating input device 102 can determine which state 401 - 405 (e.g., moving or not moving) the input device 102 is in at a given time period by measuring one or more of the parameters described in FIGS. 4A-4D . Using these parameters, the system 100 determines which state the input device 102 is in and performs different actions (also referred to herein as “initiating events”) depending upon the particular state.
- while in states 401 and 405 , a system may initiate a fine control mode where a navigational object is moved in relatively small increments (e.g., a relatively small number of pixels); whereas, while in states 402 , 403 and 404 , the system may initiate a coarse control mode where the navigational object is moved in relatively large increments (e.g., a larger number of pixels than in the fine control mode).
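A minimal sketch of this state-dependent mode selection, assuming a simple velocity threshold (the threshold value and names are illustrative):

```python
def control_mode(velocity, eps=1e-3):
    """Select the navigation mode from the device's measured state:
    coarse control while the device slides across the surface
    (states 402-404), fine control while it is stationary
    (states 401 and 405)."""
    return "coarse" if abs(velocity) > eps else "fine"
```

A real implementation would likely debounce the measurement over several samples rather than branch on a single velocity reading.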
- the detected lateral force information can be used to provide an early indication about an impending motion.
- the impending motion can be determined by using force information generated by the force sensors to estimate a direction of the applied force using expressions (1) and (2), for example.
- a system incorporating the input device 102 can prepare the system for an impending motion of the input device 102 by waking components of the system from a sleep mode and/or increasing a data collection rate.
- FIG. 5 is a flow diagram illustrating an exemplary process 500 of controlling the movement of a navigational object in two modes: a fine control mode and a coarse control mode.
- the various tasks performed in connection with process 500 may be performed by hardware, software, firmware, or any combination thereof. It should be appreciated that process 500 may include any number of additional or alternative tasks. The tasks shown in FIG. 5 need not be performed in the illustrated order, and process 500 may be incorporated into a more comprehensive procedure or process having additional functionality not described in detail herein. For illustrative purposes, the following description of process 500 may refer to elements mentioned above in connection with FIGS. 1-4 .
- Process 500 begins by detecting lateral force components applied to the input device 102 (e.g., the Fx and Fy force components) and detecting motion of the input device 102 relative to surface 104 , in step 502 .
- the motion is detected by one or more accelerometers, vibration sensors, optical sensors, electro-mechanical sensors, some other sensor operable to detect motion of the input device 102 , or a combination thereof.
- process 500 can periodically detect the applied lateral forces and motion.
- the lateral forces and motion may be measured every 8 milliseconds, but other lengths of time may be used.
- the length of time between measurements for the lateral force and the motion need not be the same, and, moreover, the lateral force and the motion need not be measured at the same time, but instead can be measured successively.
- process 500 determines if the input device 102 is moving with respect to the surface 104 by analyzing the motion detected in step 502 .
- in one embodiment, the motion detected in step 502 includes a velocity measurement; process 500 can then determine that the input device 102 is moving relative to the surface 104 if the velocity is non-zero.
- Other methods of determining if the input device 102 is moving relative to the surface 104 can be used depending upon the motion parameters detected by the input device 102 , as would be apparent to one skilled in the art after reading this disclosure.
- if the input device 102 is moving relative to the surface 104 , a coarse control mode is initiated in step 506 .
- an associated navigational object moves in relatively large, coarse increments based on the motion of the input device 102 measured in step 502 .
- movement of the associated navigational object can be based on the motion of the input device and also in part on the detected lateral forces exerted on the input device.
- the input device 102 controls movement of the navigational object in a similar manner as a conventional mouse controls movement of a navigational object.
- the measured motion includes an acceleration measurement using one or more accelerometers.
- An estimated speed of input device 102 can then be calculated by integrating the acceleration.
- an estimated direction of motion can be calculated using the lateral force measurements (step 502 ) using expressions (1) and (2), above.
- An associated navigational object can then be moved based on the estimated speed and estimated direction of motion.
- the navigational object can be moved across a graphical user interface in the estimated direction of motion with a speed that is proportional to the estimated speed.
- the estimated speed and estimated direction of motion can be used to estimate a change in position of the navigational object.
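The coarse-mode estimate described above can be sketched as follows, assuming evenly spaced accelerometer samples and trapezoidal integration (the function name and sample layout are assumptions):

```python
import math

def estimate_velocity(accel_samples, dt, fx, fy):
    """Coarse-mode sketch: integrate accelerometer samples (trapezoidal
    rule) to estimate speed along the line of motion, and take the
    direction of motion from the lateral force components, as in
    expressions (1) and (2). Returns (vx, vy)."""
    speed = 0.0
    for a0, a1 in zip(accel_samples, accel_samples[1:]):
        speed += 0.5 * (a0 + a1) * dt   # integrate acceleration over dt
    direction = math.atan2(fy, fx)      # direction from the applied force
    return speed * math.cos(direction), speed * math.sin(direction)
```

The navigational object would then be moved with a speed proportional to the integrated value, in the force-derived direction.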
- in step 508 , magnitudes of the lateral force components detected in step 502 are calculated.
- Process 500 then determines, in step 510 , whether either of the force magnitudes is increasing.
- in one embodiment, the increasing force criteria of step 510 are considered met when the most recently measured force magnitude exceeds the immediately preceding measured force magnitude.
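The step 510 check can be as simple as comparing consecutive magnitude samples; this sketch assumes the caller keeps the previous sample, which the disclosure does not specify:

```python
def force_increasing(prev_mag, curr_mag):
    """Step 510 criteria: met when the most recently measured lateral
    force magnitude exceeds the previously measured one."""
    return curr_mag > prev_mag
```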
- if neither force magnitude is increasing, process 500 proceeds to step 512 , where process 500 indicates that no change in position of the navigational object is to occur.
- indicating that no change in position is to occur includes setting both a change in x position value, Δx, and a change in y position value, Δy, to zero.
- if either force magnitude is increasing, process 500 proceeds to a fine control mode in steps 514 and 516 .
- in step 514 , a change in position of the navigational object is calculated and a value for the positional change is set in memory, such as memory 212 .
- the change in position is calculated using expressions (3) and (4):
- Δx = ΔFx × gain  (3)
- Δy = ΔFy × gain  (4)
- where Δx and Δy are change in position values of the navigational object along an x-axis and a y-axis, respectively, in a Cartesian coordinate system; gain is a predetermined gain factor; ΔFx is a change in the measured lateral force component along the x-axis; and ΔFy is a change in the measured lateral force component along the y-axis.
- the gain factor can correspond to a desired precision or granularity of moving the navigational object in the fine control mode. In some embodiments, the gain factor has a value corresponding to a total range of 1 to 100 pixels for Δx and Δy. Of course, other values for the gain factor may be used depending upon various factors, such as a desired precision, and the size and resolution of the display upon which the navigational object is displayed and moved.
- in other embodiments, Δx and Δy are predetermined values.
- the navigational object is moved in an x-direction by a number of pixels corresponding to the predetermined value of Δx. In this manner, an increasing lateral force imparted on the input device 102 results in the navigational object moving by a predetermined number of pixels.
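Assuming the linear relationship of expressions (3) and (4), the fine-control displacement reduces to a single scale operation; the gain value here is purely illustrative:

```python
def fine_control_delta(d_fx, d_fy, gain=2.0):
    """Expressions (3) and (4): scale the change in measured lateral
    force components into a small cursor displacement, in pixels.
    The default gain of 2.0 pixels per unit of force change is a
    hypothetical value, not one from the disclosure."""
    return d_fx * gain, d_fy * gain
```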
- the navigational object is moved based on the change in position values, Δx and Δy, in step 516 .
- process 500 utilizes two navigational control modes: a coarse control mode when the input device 102 is moving relative to surface 104 and a fine control mode when the input device 102 is not moving relative to surface 104 but a lateral force applied to the input device 102 is increasing.
- a user can move the navigational object a relatively small distance for precise targeting of an area by applying a force to the input device 102 without initiating sliding of the input device relative to the surface 104 .
- a user desires to move the navigational object a relatively large distance, for example from one side of a graphical display to another side of a graphical display, then the user can simply slide the input device 102 relative to surface 104 to cause the navigational object to quickly move the larger distance.
- Process 500 can also prevent or reduce overshooting or undershooting a target area due to the unstable static/kinetic friction transition when transitioning between the first state 401 and the second state 402 and/or the fourth state 404 and the fifth state 405 (see FIGS. 4A-4D ). For example, once the input device 102 is no longer moving relative to surface 104 , then the process 500 switches to fine control mode, permitting a user to move the navigational object in a precise manner to the target area.
- the measured lateral force is used to move an associated navigational object in the fine control mode if the measured lateral force is increasing (step 510 ).
- a reason for step 510 is so that the position of the associated navigational object does not change when the force sensor returns to its “home” or “zero position”.
- a user can move a navigational object by applying a lateral force to the input device 102 .
- the force sensor may detect a decrease in the applied lateral force. If the detected decrease in applied lateral force is also used to move the navigational object, then the navigational object could be moved away from a targeted area as a result of the decreasing applied force.
- process 500 does not use a decrease in lateral force to move the associated navigational object.
- a user can move the navigational object to a target area by applying a force to the input device 102 and not be concerned about the navigational object moving away from the target area once the user reduces or ceases to apply a force to the input device 102 .
- the navigational object can be moved based on the measured lateral force regardless of whether the measured lateral force is increasing or decreasing, since doing so may be advantageous in some applications.
- the transition between the two navigational modes need not be abrupt, but instead can be smooth and seamless.
- a mathematical formula can be used such that the fine control is dominant initially, e.g., when a user first applies a force to move the input device 102 , and then the coarse control can gradually become dominant as the input device accelerates, for example.
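The disclosure does not specify the blending formula; one plausible choice (all constants hypothetical) weights the force and motion contributions by a normalized acceleration:

```python
def blended_dx(d_fx, m_x, accel, accel_ref=10.0,
               gain_force=2.0, gain_movement=1.0):
    """Smooth hand-off between modes along one axis: at zero
    acceleration the force (fine) term dominates; as |accel| approaches
    accel_ref the motion (coarse) term takes over. accel_ref and both
    gain values are assumptions for illustration."""
    w = min(abs(accel) / accel_ref, 1.0)   # 0 = all fine, 1 = all coarse
    return (1.0 - w) * d_fx * gain_force + w * m_x * gain_movement
```

Because the weight varies continuously with acceleration, the cursor response never jumps at the moment the device starts or stops sliding.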
- FIG. 6 is a flow diagram illustrating a further exemplary process 600 of controlling the movement of a navigational object based on both the measured lateral force and measured motion of the input device 102 , regardless of whether or not the input device is moving relative to the surface 104 .
- the various tasks performed in connection with process 600 may be performed by hardware, software, firmware, or any combination thereof. It should be appreciated that process 600 may include any number of additional or alternative tasks. The tasks shown in FIG. 6 need not be performed in the illustrated order, and process 600 may be incorporated into a more comprehensive procedure or process having additional functionality not described in detail herein. For illustrative purposes, the following description of process 600 may refer to elements mentioned above in connection with FIGS. 1-4 .
- Process 600 measures the lateral force components applied to the input device 102 and the motion of the input device relative to surface 104 in step 602 .
- step 602 can be performed in a similar manner as described above with respect to step 502 of process 500 .
- Process 600 then calculates change in position values of the navigational object, Δx and Δy, based on both the force and the motion measured in step 602 .
- equations (5) and (6), below, can be used to calculate the values ⁇ x and ⁇ y:
- Δx=(ΔFx)×(gain_force)+(m_x)×(gain_movement) (5)
- Δy=(ΔFy)×(gain_force)+(m_y)×(gain_movement) (6)
- Δx and Δy are changes in position of the navigational object along an x-axis and y-axis, respectively, of a Cartesian coordinate system
- ΔFx and ΔFy are changes in measured lateral force components along the x-axis and y-axis of the plane of motion, respectively
- m_x and m_y are estimated movements of the input device 102 along the x- and y-axis of the plane of motion, respectively
- gain_force is a predetermined gain factor value corresponding to a desired granularity of moving the navigational object based on lateral force applied to the input device 102
- gain_movement is a predetermined gain factor value corresponding to a desired granularity of moving the navigational object based on the movement of the input device 102 .
- the estimated movements, m_x and m_y, can be an estimated speed, acceleration, change in position, or other parameter pertaining to the motion of the input device 102 relative to the surface 104 .
- other mathematical relationships may be used in place of equations (5) and (6).
- quadratic and/or exponential mathematical relationships may be used to calculate change in position values based on measured force and motion.
- the value of ΔFx or ΔFy is non-zero when the lateral force applied along the x-axis or y-axis is increasing. Otherwise, the value for ΔFx or ΔFy is zero, thereby resulting in no lateral force contribution to the change in position of the navigational object as set forth in equations (5) and (6).
- other embodiments may use a non-zero value for the change in lateral force, ΔFx or ΔFy, regardless of whether the lateral force is increasing or decreasing.
- the value of the change in lateral force components, ΔFx or ΔFy, can be negative or positive, thereby possibly providing a negative or positive contribution to equations (5) or (6), for example.
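A single axis of equations (5) and (6), including the convention that a decreasing lateral force contributes nothing, can be sketched as follows; the gain values and the clamping flag are illustrative assumptions:

```python
def position_delta(dF, m, gain_force=0.2, gain_movement=1.0,
                   zero_when_decreasing=True):
    """Change in cursor position along one axis, per equations (5)/(6):
    delta = dF * gain_force + m * gain_movement.

    dF: change in the measured lateral force component along the axis
    m:  estimated movement of the device along the axis
    In the embodiment described, a decreasing lateral force contributes
    nothing, so dF is clamped to zero when negative; other embodiments
    can allow a negative contribution by disabling the clamp.
    """
    if zero_when_decreasing and dF < 0:
        dF = 0.0
    return dF * gain_force + m * gain_movement
```

Calling the function once per axis with the latest force and motion samples yields the (Δx, Δy) pair used to move the navigational object.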
- values for gain force and gain movement are selected to provide smooth and seamless transitions between the first state 401 and the second state 402 and/or the fourth state 404 and the fifth state 405 ( FIGS. 4A-4D ).
- the values for gain force and gain movement need not be constant.
- mathematical formulas can be used to calculate variable force gain values and variable movement gain values based on various factors, such as a state of the input device 102 (e.g. first-fifth states 401 - 405 , respectively) or other desired parameters.
- in step 606, the navigational object is moved based on the values of Δx and Δy.
- various functions described in processes 500 and 600 can be performed by the input device 102 , the receiving device 106 or by a combination of the two devices.
- the input device 102 may generate and transmit one or more signals to the receiving device 106 indicative of the detected lateral force and motion.
- the receiving device 106 can then calculate the values of Δx and Δy based on the one or more signals indicative of the detected lateral force and motion.
- the input device 102 need not have a processor, such as processor 208 depicted in FIG. 2 .
- the input device 102 may calculate the values of Δx and Δy using processor 208 ( FIG. 2 ), generate one or more signals indicative of the values of Δx and Δy, and transmit the one or more signals to the receiving device 106 .
- Such software elements may be rendered in any programming language or environment including, for example, C/C++, Fortran, COBOL, PASCAL, Perl, Prolog, assembly language, scripting languages, markup languages (e.g., HTML, SGML, XML, VOXML), functional languages (e.g., APL, Erlang, Haskell, Lisp, ML, F# and Scheme), as well as object-oriented environments such as the Common Object Request Broker Architecture (CORBA) and Java™ (including J2ME, Java Beans, etc.).
- The term "module" does not imply that the components or functionality described or claimed as part of the module are all configured in a common package. Indeed, any or all of the various components of a module, whether control logic or other components, can be combined in a single package or separately maintained and can further be distributed across multiple locations.
Description
- The invention relates to input devices for computing systems, and more particularly, to methods and apparatuses for fine control of navigational objects, such as a cursor.
- An input device can be manipulated by a user to generate input data in a computer system. Typically, an input device is positioned on a surface and moved relative to that surface, but other forms of input devices operating in different fashions are also available. The operations performed on an input device generally correspond to moving a navigational object (e.g., a cursor) and/or making selections on a display screen. There are many kinds of electronic input devices, such as buttons or keys, pens, digitizing pads, game controllers, trackballs, touch screens, touch pads, mice, and the like. A “mouse” is a common type of input device that functions as a pointing device for a computer by detecting motion imparted by a user. The mouse's motion is typically translated into motion of a navigational object (e.g., cursor) on a graphical user interface (GUI) provided on a display screen. A mouse generally comprises a small case, held in a user's hand, with one or more input buttons. Additionally, a mouse may have other elements, such as a scroll wheel, that allow a user to perform enhanced operations.
- When tracking the motion of an electronic input device, there can be a wide range of motions from large coarse motions to small fine motions. For example, a user may perform a large, coarse motion with the input device to move a navigational object from one side of a graphical display to another side of the graphical display. In contrast, a user may perform a small, fine motion with the input device to move the navigational object a relatively small number of pixels (e.g., 1 to 100 pixels). A user may want to move a navigational object a relatively small number of pixels when homing in on a small target area, such as a space between two adjacent characters in a text file. However, many conventional mouse-type input devices slip or jerk when the user attempts to move the input device in small increments due to static/kinetic friction transitions. This can cause unstable control of the navigational object.
- Accordingly, fine control of a navigational object using an electronic input device can be a difficult challenge when the input device is also used for coarse cursor control.
- Various aspects of the present invention relate to systems and methods for controlling a navigational object using an input device. A system in accordance with one embodiment includes a motion-based input device having a force detection module operable to detect a lateral force applied to the input device. The system further includes a processor coupled to the force detection module. The processor is operable to generate navigational object movement information based on the detected lateral force. The navigational object movement information may direct the navigational object to move in relatively small increments or in relatively large increments.
- In accordance with a further embodiment, a computer readable medium capable of storing computer executable instructions is provided. The computer readable medium includes code that, when executed by a computer, detects a lateral force applied to an input device and detects a motion of the input device relative to a surface. In addition, the computer readable medium includes code for calculating a change in position of the navigational object based on a detected lateral force contribution and a detected motion contribution.
- In one embodiment, a method for controlling movement of a navigational object displayed on a graphical user interface is provided. The method includes measuring a lateral force applied to an input device and estimating a change in magnitude of the applied lateral force. The method also generates a control signal based on the estimated change in magnitude of the applied lateral force, wherein the control signal is indicative of a change in position of a navigational object on a graphical display.
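As a rough sketch of this method, the magnitude of the applied lateral force can be estimated from its two lateral components, and a control signal generated only while that magnitude is increasing. The function names, the gain value, and the choice of using the current force direction for the movement direction are illustrative assumptions:

```python
import math

def lateral_force_magnitude(fx, fy):
    """Total lateral force magnitude from the two lateral components,
    matching expression (1) later in the description: |F| = sqrt(Fx^2 + Fy^2)."""
    return math.hypot(fx, fy)

def control_signal(fx_prev, fy_prev, fx_curr, fy_curr, gain=2.0):
    """Generate a change-in-position control signal from the estimated
    change in magnitude of the applied lateral force. The movement
    direction is taken from the current force components per
    expression (2) (unit vector Fx/|F|, Fy/|F|).

    Returns (dx, dy); (0, 0) when the magnitude is not increasing."""
    mag_prev = lateral_force_magnitude(fx_prev, fy_prev)
    mag_curr = lateral_force_magnitude(fx_curr, fy_curr)
    d_mag = mag_curr - mag_prev
    if d_mag <= 0.0 or mag_curr == 0.0:
        return (0.0, 0.0)          # force not increasing: no movement
    step = gain * d_mag            # fine-control step size
    return (step * fx_curr / mag_curr, step * fy_curr / mag_curr)
```

A force ramping up from rest produces a movement step along the force direction; a steady or relaxing force produces no movement.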
- Certain embodiments of the invention have other aspects in addition to or in place of those mentioned or obvious from the above. The aspects will become apparent to those skilled in the art from a reading of the following detailed description when taken with reference to the accompanying drawings.
- The following drawings are provided for purposes of illustration only and merely depict exemplary embodiments of the disclosure. These drawings are provided to facilitate the reader's understanding of the disclosure and should not be considered limiting of the breadth, scope, or applicability of the disclosure. It should be noted that for clarity and ease of illustration these drawings are not necessarily made to scale.
-
FIG. 1 illustrates an exemplary computing system using an input device according to various embodiments of this invention. -
FIG. 2 is a system diagram of a modular arrangement of an input device according to various embodiments of this invention. -
FIG. 3 is a block diagram of forces exerted on an input device according to various embodiments of this invention. -
FIGS. 4A-4D are graphs of force and motion parameters of an input device moving from a first point to a second point according to various embodiments of this invention. -
FIG. 5 is a flow diagram illustrating a process of controlling movement of a graphical object according to various embodiments of this invention. -
FIG. 6 is a flow diagram illustrating another process of controlling movement of a graphical object according to various embodiments of this invention. - In the following description of exemplary embodiments, reference is made to the accompanying drawings which form a part hereof, and in which it is shown by way of illustration specific embodiments in which the invention can be practiced. It is to be understood that other embodiments can be used and structural changes can be made without departing from the scope of the invention.
- In accordance with various embodiments, a motion-based input device can include one or more force sensors capable of detecting forces acting upon the input device and generating a signal representative of the detected forces. A system can then initiate (e.g., trigger) one or more events based on the signal.
- As used herein, the term “event” can refer to any function or process performed by a computer system in response to user input. An event need not be a function traditionally initiated by a user using a motion-based input device, but can also include functions initiated by a user using other types of input devices, including keyboards, touch pads, switches, buttons, dials or any other electrical, mechanical and/or optical mechanism that generates input data in response to a user input. A few non-limiting examples of an event can include moving a cursor displayed on a graphical user interface, making a selection indication (e.g., similar to depressing a selection button on a mouse, for example), changing a mode of operation, turning volume up or down, changing channels, paging back and forth in a software application, initiating a startup or wakeup sequence of a receiving device or the input device, and increasing a data collection rate to decrease lag.
- As used herein, “motion-based input device” can refer to an input device that detects multi-dimensional motion of the input device relative to a surface. Motion-based input devices can utilize a variety of sensors for detecting movement of the input device relative to a surface and generate an input signal indicating information pertaining to the detected movement. Non-limiting examples of motion-based input devices include electro-mechanical mice (also known as “ball mice”), and optical mice.
-
FIG. 1 illustrates a typical environment or system 100 in which a motion-based input device 102 in accordance with one embodiment may be used. The input device 102 can be positioned upon a surface 104 such as a desk or a tabletop. A user can move the input device 102 relative to the surface 104 to generate output signals indicative of the movement of the input device. - Note that in
FIG. 1, the surface 104 is depicted as being flat or substantially flat. However, this is not strictly necessary according to other embodiments. Also note that the surface 104 need not necessarily be situated beneath the input device 102. For example, the surface 104 may be tilted, situated above the input device 102, or vertically oriented. Also note that multiple surfaces 104 can be utilized. - A
receiving device 106 can be adapted to receive input signals generated by the input device 102. As used herein, the terms "receiving device" and "receiver" include without limitation video game consoles, set-top boxes, televisions, personal computers (whether desktop, laptop, or otherwise), digital video recorders, communications equipment, terminals, and display devices. In accordance with various embodiments, the receiving device 106 can comprise at least one interface adapted to receive the input signals transmitted from the input device 102. The input device 102 can be physically coupled to the receiving device via one or more communication links (such as via a serial bus cable), or the input device 102 can be adapted to wirelessly communicate with the receiving device 106. - A
display device 108 in communication with the receiving device 106 can be adapted to display a navigational object upon its display screen (for example, a pointer, cursor, selector box, or other such indicator). During operation, when the user manipulates the input device 102 relative to the surface 104, the input signals generated by the input device are received at the receiving device 106 and the navigational object responds according to the user's input. As used herein, the term "display device" can include any type of device adapted to display information, including without limitation cathode ray tube displays (CRTs), liquid crystal displays (LCDs), thin film transistor displays (TFTs), digital light processor displays (DLPs), plasma displays, light emitting diodes (LEDs) or diode arrays, incandescent devices, and fluorescent devices. Display devices may also include less dynamic devices such as printers, e-ink devices, and other similar structures. -
FIG. 2 is a system diagram of a modular arrangement of the input device 102 according to one embodiment of the present invention. The input device 102 includes a printed circuit board 204 comprising electrical leads that enable various modules to communicate with other coupled modules. - A
power supply 206 provides a source of power to modules electrically coupled to the printed circuit board 204. In some embodiments, power is supplied externally from one or more conductive wires, for example, through the use of a power cable or a serial bus cable. In other embodiments, a battery may be used as a source of power. - A
memory 212 comprises any type of module adapted to enable digital information to be stored, retained, and retrieved. Additionally, the memory 212 may comprise any combination of volatile and non-volatile storage devices, including without limitation RAM, DRAM, SRAM, ROM, and/or flash memory. Note also that the memory 212 may be organized in any number of architectural configurations by the use of registers, memory caches, data buffers, main memory, mass storage, and/or removable media, for example. - One or
more processors 208 can be adapted to execute sequences of instructions by loading and storing data to the memory 212. Possible instructions include, without limitation, instructions for data conversions, formatting operations, communication instructions, and/or storage and retrieval operations. Additionally, the processors 208 may comprise any type of digital processing devices including, for example, reduced instruction set computer processors, general-purpose processors, microprocessors, digital signal processors, gate arrays, programmable logic devices, reconfigurable compute fabrics, array processors, and/or application-specific integrated circuits. Note also that the processors 208 may be contained on a single unitary integrated circuit (IC) die or distributed across multiple components. -
Interface module 216 enables data to be transmitted and/or received over one or more communication networks. The data can be transmitted or received wirelessly or through the use of wires. In one embodiment, data transmitted to a receiving device is first packetized and processed according to one or more standardized network protocols. In one embodiment, the interface module 216 comprises a plurality of network layers such that each layer provides services to the layer above it and receives services from the layer below it. The interface module 216 may accommodate any wired or wireless protocol including, without limitation, USB, FireWire, Ethernet, Gigabit Ethernet, MoCA, radio frequency tuners, modems, WiFi, Bluetooth, WiMax, and/or Infrared Data Association. - A
motion detection module 220 comprises sensors and logic adapted to detect and/or measure motion parameters, such as acceleration, speed, velocity and/or position of the input device 102 at a specific instant in time, or alternatively, over a period of time. In accordance with various embodiments, the motion detection sensors can be an optical sensor, an electro-mechanical sensor, or any other sensor used in a motion-based input device capable of detecting motion of the input device 102. - A
force detection module 222 can include sensors and logic adapted to detect forces acting upon the input device 102 during an instant in time, or alternatively, over a period of time. In accordance with some embodiments, the force detection module can include one or more force detection sensors operable to detect external forces acting upon the input device 102. In some embodiments, the force detection module 222 can detect forces acting upon the input device 102 in one dimension (e.g., an x-dimension), in other embodiments the force detection module 222 can sense forces acting upon the input device 102 in two dimensions (e.g., x and y dimensions), and in further embodiments the force detection module 222 can detect forces acting upon the input device 102 in three dimensions (e.g., x, y and z dimensions). - As mentioned above, the
input device 102 can include one or more force sensors. In some embodiments, a three-component force sensor can detect the forces exerted on the input device in three dimensions (e.g., x, y and z dimensions). Suitable three-component force sensors include Kistler 3-Component Force Sensors, models 9167A, 9168A, 916AB, or 9168AB, offered by Kistler North America located in Amherst, N.Y., USA. In other embodiments, separate force sensors can be used to detect the forces exerted on the input device 102. - In accordance with one embodiment, directions and magnitudes of forces acting upon the
input device 102 can be derived from information generated by the force sensors. FIG. 3 is a block diagram indicating a force Ftotal applied to the input device 102 positioned on surface 104. As an example, the force Ftotal may be applied to the input device 102 by a user to move the input device in a desired direction in a plane of motion. As used herein, a "plane of motion" can be defined as an x-y plane in a Cartesian coordinate system in which a user moves the input device 102. The x-y plane has an x-axis and a y-axis perpendicular to the x-axis. A z-axis extends perpendicularly from the x-y plane. In one embodiment, the x-y plane is parallel to the surface 104. - As depicted in
FIG. 3, the force Ftotal applied to the input device can comprise a lateral force component and a normal force component (the normal force can also be referred to as a vertical force). The lateral force component further includes a first lateral force component, Fx, in a direction along the x-axis, and a second lateral force component, Fy, in a direction along the y-axis. The normal force component, Fz, is in a direction along the z-axis. - The direction of the lateral force is mathematically related to the lateral force components, Fx and Fy, applied to the
input device 102 in the plane of motion. This relationship can be expressed as: -
|F| = √(Fx² + Fy²)   (1) - Where |F| is a total magnitude of the lateral force applied to the
input device 102. Corresponding directional vectors can then be derived using the following expression: -
X direction=Fx/|F|, Y direction=Fy/|F| (2) - Thus, using the lateral force components, Fx and Fy, applied to the
input device 102, logic residing in force detection module 222 or computer 106, for example, can estimate a total magnitude of the lateral force and corresponding directional unit vectors of the applied lateral force. Of course, other standard techniques known in physics may be used to calculate a scalar quantity of force from a given set of force vectors. In accordance with various embodiments, the logic may be implemented as any combination of software, firmware and/or hardware. - Note that in one embodiment, motion and force information can be written to a local memory source (not shown) (such as a register or local cache) before being provided as input. In other embodiments, this data can be directly written and retrieved from
memory 212. In still other embodiments, this data can be stored in external memory (e.g. a hard drive of the computer 106) and the input device 102 can transmit raw data to the computer 106 for processing. - As mentioned above, one or
more force sensors 222 can be utilized to generate force information pertaining to forces acting upon the input device 102. In accordance with one embodiment, the input device 102 can detect lateral components of forces applied to the input device 102 in two directions in a plane of motion of the input device; the first direction being substantially perpendicular to the second direction. Information relating to the detected lateral force components can then be used to calculate an estimated magnitude and direction of a lateral force acting upon the input device 102, among other things. A system in accordance with various embodiments can then move a navigational object based on the estimated magnitude and direction, for example. -
FIGS. 4A-4D are graphs of force, acceleration, velocity and displacement, respectively, versus time of an exemplary movement of input device 102 moving in a straight line from a first point A to a second point B. Furthermore, FIGS. 4A-4D illustrate exemplary first through fifth states 401-405 of the input device 102 while it moves from point A to point B. For illustrative purposes, the following description of FIGS. 4A-4D may refer to elements mentioned above in connection with FIGS. 1-3. - With particular reference to
FIG. 4A, during the first state 401, a force is applied to input device 102, for example by a user, but the applied force does not exceed the static (maximum) frictional force between input device 102 and surface 104. Consequently, in the first state 401, input device 102 is not yet moving (see FIGS. 4C and 4D) despite a force being applied to the input device 102. - During the
second state 402, the applied force exceeds the static frictional force, which results in movement of the input device 102, as illustrated in FIGS. 4A and 4D, for example. Note that in accordance with known principles of physics, the coefficient of static friction is typically larger than the coefficient of kinetic friction. As a consequence, the frictional force between the input device 102 and surface 104 typically decreases once the input device 102 begins sliding on surface 104. Hence, as depicted in FIG. 4A, the frictional force decreases at the transition between the first state 401 and the second state 402. This transition from static friction to kinetic friction can result in unstable control of the input device 102 due to what is commonly referred to as a "stick-slip" phenomenon. This phenomenon arises because a user needs to apply a force that exceeds the static friction to initiate sliding of the input device 102, which can feel like the input device 102 is "sticking" to surface 104. But once the static frictional force is exceeded, the smaller kinetic coefficient applies to the frictional force, which can cause a slip or jerk type motion due to the reduction in frictional force. This "stick-slip" phenomenon can make it difficult for a user to control an input device 102 when moving the input device a small distance, for example. - During the
third state 403, the applied force is equal to the frictional force. This results in no acceleration (FIG. 4B) and a constant velocity (FIG. 4C) of the input device 102 along the plane of motion. - During the fourth state 404, the magnitude of the applied force is less than the frictional force. This results in a deceleration of the input device 102 (
FIG. 4B ). - Finally, during the fifth state 405, the
input device 102 is stopped. In this state, the applied force no longer exceeds the static frictional force (FIG. 4A), and there is no motion (zero velocity) (FIG. 4C), no acceleration (FIG. 4B), and no change in displacement (FIG. 4D). Note that the transition between the fourth state 404 and the fifth state 405 can also result in unstable movement of the input device 102 due to the transition from kinetic friction to static friction. - In accordance with various embodiments, a system incorporating
input device 102, such as the system 100 depicted in FIG. 1, for example, can determine which state 401-405 (e.g., moving or not moving) the input device 102 is in at a given time period by measuring one or more of the parameters described in FIGS. 4A-4D. Using these parameters, the system 100 determines which state the input device 102 is in and performs different actions (also referred to herein as "initiating events") depending upon the particular state. For example, while in the first and fifth states 401 and 405, a system may initiate a fine control mode where a navigational object is moved in relatively small increments (e.g., a relatively small number of pixels), whereas, while in the second through fourth states 402-404, the system may initiate a coarse control mode where the navigational object is moved in relatively large increments. In one embodiment, if input device 102 is in the first state 401, the detected lateral force information can be used to provide an early indication about an impending motion. The impending motion can be determined by using force information generated by the force sensors to estimate a direction of the applied force using expressions (1) and (2), for example. In another embodiment, if input device 102 is in the first state 401, a system incorporating the input device 102 can prepare the system for an impending motion of the input device 102 by waking components of the system from a sleep mode and/or increasing a data collection rate. -
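The state determination described above can be sketched as follows; the eps threshold, the decision order, and the use of a force-increasing flag to separate state 401 (impending motion) from state 405 (stopped) are illustrative assumptions, not details from this disclosure:

```python
def classify_state(velocity, acceleration, force_increasing, eps=1e-6):
    """Classify which of states 401-405 the input device is in, using
    the parameters graphed in FIGS. 4A-4D.

    velocity, acceleration: measured motion parameters of the device
    force_increasing: whether the measured lateral force is ramping up
    """
    if abs(velocity) < eps:
        # No motion: state 401 if force is still ramping up toward the
        # static friction limit (impending motion), else state 405.
        return 401 if force_increasing else 405
    if acceleration > eps:
        return 402   # applied force exceeds friction: accelerating
    if acceleration < -eps:
        return 404   # applied force below friction: decelerating
    return 403       # applied force balances friction: constant velocity
```

A system could switch to fine control when this returns 401 or 405, and to coarse control when it returns 402-404.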
FIG. 5 is a flow diagram illustrating an exemplary process 500 of controlling the movement of a navigational object in two modes: a fine control mode and a coarse control mode. The various tasks performed in connection with process 500 may be performed by hardware, software, firmware, or any combination thereof. It should be appreciated that process 500 may include any number of additional or alternative tasks. The tasks shown in FIG. 5 need not be performed in the illustrated order, and process 500 may be incorporated into a more comprehensive procedure or process having additional functionality not described in detail herein. For illustrative purposes, the following description of process 500 may refer to elements mentioned above in connection with FIGS. 1-4. -
Process 500 begins by detecting lateral force components applied to the input device 102 (e.g., the Fx and Fy force components) and detecting motion of the input device 102 relative to surface 104, in step 502. In some embodiments, the motion is detected by one or more accelerometers, vibration sensors, optical sensors, electro-mechanical sensors, some other sensor operable to detect motion of the input device 102, or a combination thereof. - Further to step 502,
process 500 can periodically detect the applied lateral forces and motion. In some embodiments, the lateral forces and motion may be measured every 8 milliseconds, but other lengths of time may be used. In addition, in some embodiments, the length of time between measurements for the lateral force and the motion need not be the same, and, moreover, the lateral force and the motion need not be measured at the same time, but instead can be measured successively. - In
step 504, process 500 determines if the input device 102 is moving with respect to the surface 104 by analyzing the motion detected in step 502. In one example, the detected motion can include a velocity of the input device 102, and process 500 can then determine that the input device 102 is moving relative to the surface 104 if the velocity is non-zero. Of course other methods of determining if the input device 102 is moving relative to the surface 104 can be used depending upon the motion parameters detected by the input device 102, as would be apparent to one skilled in the art after reading this disclosure. - If the
input device 102 is moving relative to the surface 104, then a coarse control mode is initiated in step 506. In coarse control mode, an associated navigational object moves in relatively large, coarse increments based on the motion of the input device 102 measured in step 502. Alternatively, while in coarse control mode, movement of the associated navigational object can be based on the motion of the input device and also in part on the detected lateral forces exerted on the input device. In one embodiment, while in coarse control mode, the input device 102 controls movement of the navigational object in a similar manner as a conventional mouse controls movement of a navigational object. - Further to step 506, in some embodiments, the measured motion includes an acceleration measurement using one or more accelerometers. An estimated speed of
input device 102 can then be calculated by integrating the acceleration. Furthermore, an estimated direction of motion can be calculated using the lateral force measurements (step 502) using expressions (1) and (2), above. An associated navigational object can then be moved based on the estimated speed and estimated direction of motion. For example, the navigational object can be moved across a graphical user interface in the estimated direction of motion with a speed that is proportional to the estimated speed. In an alternative embodiment, the estimated speed and estimated direction of motion can be used to estimate a change in position of the navigational object. These are merely a few illustrative examples and it is contemplated that other methods of moving a navigational object based on the measured motion of the input device 102 can be used in other embodiments. - If the input device is not moving relative to the
surface 104 in step 504, then, in step 508, magnitudes of the lateral force components, |Fx| and |Fy|, are calculated and compared to force components derived from a prior measurement of the lateral force components. In other words, the magnitudes of successively measured lateral force components are compared to one another in step 508. Process 500 then determines whether either force magnitude |Fx| or |Fy| is increasing as compared to the corresponding previously measured force magnitude in decision step 510. In accordance with one embodiment, the increasing force criteria of step 510 are considered met when the most recently measured force magnitude (e.g., |Fxt2| or |Fyt2| measured at time t2) is greater than the previously measured force magnitude (e.g., |Fxt1| or |Fyt1| measured at time t1). - If neither of the lateral force magnitudes calculated in
step 508 is increasing, then process 500 proceeds to step 512, where process 500 indicates that no change in position of the navigational object is to occur. In one embodiment, indicating that no change in position is to occur includes setting both a change in x position value, Δx, and a change in y position value, Δy, to zero. - On the other hand, if either of the lateral force component magnitudes |Fx| or |Fy| is increasing, then process 500 proceeds to a fine control mode in
steps 514 and 516. In step 514, a change in position of the navigational object is calculated and a value for the positional change is set in memory, such as memory 212. In one embodiment, the change in position is calculated using expressions (3) and (4): -
Δx=(gain)×(ΔFx) (3) -
Δy=(gain)×(ΔFy) (4) - Where Δx and Δy are change in position values of the navigational object along an x-axis and a y-axis, respectively, in a Cartesian coordinate system; gain is a predetermined gain factor; ΔFx is a change in the measured lateral force component along the x-axis; and ΔFy is a change in the measured lateral force component along the y-axis. The gain factor can correspond to a desired precision or granularity of moving the navigational object in the fine control mode. In some embodiments, the gain factor has a value corresponding to a total range of 1 to 100 pixels for Δx and Δy. Of course, other values for the gain factor may be used depending upon various factors, such as a desired precision, and the size and resolution of the display upon which the navigational object is displayed and moved.
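As a sketch of expressions (3) and (4), the fine-control displacement can be computed as follows. The gain value of 20.0 is an illustrative assumption, not a value taken from this document:

```python
def fine_delta(dfx, dfy, gain=20.0):
    """Expressions (3) and (4): the change in cursor position is the
    change in the measured lateral force times a fixed gain factor.
    gain=20.0 is an illustrative value; the text only notes that the
    resulting displacement typically spans 1 to 100 pixels."""
    return gain * dfx, gain * dfy
```

A larger gain trades precision for reach: with gain=20.0, a force change of 0.5 units moves the cursor 10 pixels.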
- In alternative embodiments, Δx and Δy are predetermined values. As an example, if the lateral force component measured along the x-axis is increasing in
step 510, then the navigational object is moved in an x-direction by a number of pixels corresponding to the predetermined value of Δx. In this manner, an increasing lateral force imparted on the input device 102 results in the navigational object moving by a predetermined number of pixels. - Referring again to
FIG. 5, once the change in position values, Δx and Δy, are calculated and set in step 514, the navigational object is moved based on the change in position values, Δx and Δy, in step 516. - In one embodiment,
process 500 utilizes two navigational control modes: a coarse control mode when the input device 102 is moving relative to surface 104, and a fine control mode when the input device 102 is not moving relative to surface 104 but a lateral force applied to the input device 102 is increasing. In this manner, a user can move the navigational object a relatively small distance for precise targeting of an area by applying a force to the input device 102 without initiating sliding of the input device relative to the surface 104. On the other hand, when a user desires to move the navigational object a relatively large distance, for example from one side of a graphical display to the other, the user can simply slide the input device 102 relative to surface 104 to cause the navigational object to quickly move the larger distance. -
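The two-mode behavior described above can be sketched as a single update function. The function and parameter names are illustrative, not taken from this document:

```python
def cursor_delta(moving, fx, fy, prev_fx, prev_fy, mdx, mdy, gain=20.0):
    """Two-mode control sketch of process 500. Coarse mode passes the
    measured device motion (mdx, mdy) through like a conventional
    mouse; fine mode turns an increasing lateral force into a small
    displacement; otherwise the cursor holds its position."""
    if moving:
        # Coarse control: follow the device's motion directly.
        return mdx, mdy
    # Fine control responds only while a force magnitude is growing,
    # so the cursor does not drift back as the sensor relaxes to zero.
    dx = gain * (fx - prev_fx) if abs(fx) > abs(prev_fx) else 0.0
    dy = gain * (fy - prev_fy) if abs(fy) > abs(prev_fy) else 0.0
    return dx, dy
```

Coarse mode simply passes the measured device motion through, while fine mode only responds to a growing force magnitude, which is what keeps the cursor parked at its target when the user releases pressure (see step 510).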
Process 500 can also prevent or reduce overshooting or undershooting a target area due to the unstable static/kinetic friction transition when transitioning between the first state 401 and the second state 402 and/or the fourth state 404 and the fifth state 405 (see FIGS. 4A-4D). For example, once the input device 102 is no longer moving relative to surface 104, the process 500 switches to fine control mode, permitting a user to move the navigational object in a precise manner to the target area. - Furthermore, in
process 500, the measured lateral force is used to move an associated navigational object in the fine control mode only if the measured lateral force is increasing (step 510). The purpose of step 510 is to ensure that the position of the associated navigational object does not change when the force sensor returns to its "home" or "zero" position. In other words, in operation, a user can move a navigational object by applying a lateral force to the input device 102. But when the user reduces or stops applying a lateral force to the input device 102, the force sensor may detect a decrease in the applied lateral force. If the detected decrease in applied lateral force were also used to move the navigational object, then the navigational object could be moved away from a targeted area as a result of the decreasing applied force. This may be undesirable in certain applications. Accordingly, in one embodiment, process 500 does not use a decrease in lateral force to move the associated navigational object. Thus, using process 500, a user can move the navigational object to a target area by applying a force to the input device 102 without concern that the navigational object will move away from the target area once the user reduces or ceases to apply a force to the input device 102. However, in alternative embodiments, the navigational object can be moved based on the measured lateral force regardless of whether the measured lateral force is increasing or decreasing, since doing so may be advantageous in some applications. - In some embodiments, the transition between the two navigational modes need not be abrupt, but instead can be smooth and seamless. A mathematical formula can be used such that the fine control is dominant initially, e.g., when a user first applies a force to move the
input device 102, and then the coarse control can gradually become dominant as the input device accelerates, for example. -
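One way to realize such a gradual handoff is to blend the two gain factors as a function of device speed. The linear ramp and the speed_scale constant below are assumptions for illustration only, not formulas from this document:

```python
def blended_gains(speed, base_force_gain=20.0, base_move_gain=1.0,
                  speed_scale=50.0):
    """Illustrative smooth handoff between the two control modes:
    force-based (fine) control dominates at rest, and motion-based
    (coarse) control gradually takes over as the device speeds up."""
    # Weight rises from 0 toward 1 as the device accelerates.
    w = min(speed / speed_scale, 1.0)
    gain_force = (1.0 - w) * base_force_gain
    gain_movement = w * base_move_gain
    return gain_force, gain_movement
```

At rest the force gain dominates (fine control); as the device accelerates past speed_scale, the movement gain takes over entirely (coarse control).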
FIG. 6 is a flow diagram illustrating a further exemplary process 600 of controlling the movement of a navigational object based on both the measured lateral force and the measured motion of the input device 102, regardless of whether or not the input device is moving relative to the surface 104. The various tasks performed in connection with process 600 may be performed by hardware, software, firmware, or any combination thereof. It should be appreciated that process 600 may include any number of additional or alternative tasks. The tasks shown in FIG. 6 need not be performed in the illustrated order, and process 600 may be incorporated into a more comprehensive procedure or process having additional functionality not described in detail herein. For illustrative purposes, the following description of process 600 may refer to elements mentioned above in connection with FIGS. 1-4. - Process 600 measures the lateral force components applied to the
input device 102 and the motion of the input device relative to surface 104 in step 602. In one embodiment, step 602 can be performed in a similar manner as described above with respect to step 502 of process 500. -
Process 600 then calculates change in position values of the navigational object, Δx and Δy, based on both the force and the motion measured in step 602. As an example, equations (5) and (6), below, can be used to calculate the values Δx and Δy: -
Δx=ΔFx(gainforce)+mx(gainmovement) (5) -
Δy=ΔFy(gainforce)+my(gainmovement) (6) - Where Δx and Δy are a change in position of the navigational object along an x-axis and y-axis, respectively, of a Cartesian coordinate system; ΔFx and ΔFy are changes in measured lateral force components along the x-axis and y-axis of the plane of motion, respectively; mx and my are estimated movements of the
input device 102 along the x- and y-axes of the plane of motion, respectively; gainforce is a predetermined gain factor value corresponding to a desired granularity of moving the navigational object based on lateral force applied to the input device 102; and gainmovement is a predetermined gain factor value corresponding to a desired granularity of moving the navigational object based on the movement of the input device 102. The estimated movements, mx and my, can be an estimated speed, acceleration, change in position, or other parameter pertaining to the motion of the input device 102 relative to the surface 104. - It is understood that other equations or mathematical relationships may be used in place of equations (5) and (6). For example, quadratic and/or exponential mathematical relationships may be used to calculate change in position values based on measured force and motion.
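A minimal sketch of equations (5) and (6); the gain values are illustrative assumptions:

```python
def blended_delta(dfx, dfy, mx, my, gain_force=20.0, gain_movement=1.0):
    """Equations (5) and (6): both the change in lateral force
    (dfx, dfy) and the estimated device movement (mx, my) contribute
    to the cursor displacement, so the handoff between fine and
    coarse control is gradual rather than abrupt."""
    dx = dfx * gain_force + mx * gain_movement
    dy = dfy * gain_force + my * gain_movement
    return dx, dy
```

With no force change, the result reduces to pure motion tracking; with no motion, it reduces to the fine-control expressions (3) and (4).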
- Similar to process 500, in
process 600, the value of ΔFx or ΔFy is non-zero when the lateral force applied along the x-axis or y-axis is increasing. Otherwise, the value for ΔFx or ΔFy is zero, thereby resulting in no lateral force contribution to the change in position of the navigational object as set forth in equations (5) and (6). However, other embodiments may use a non-zero value for the change in lateral force, ΔFx or ΔFy, regardless of whether the lateral force is increasing or decreasing. Furthermore, in some embodiments, the value of the change in lateral force components, ΔFx or ΔFy, can be negative or positive, thereby possibly providing a negative or positive contribution to equations (5) or (6), for example. - In accordance with one embodiment, values for gainforce and gainmovement are selected to provide smooth and seamless transitions between the
first state 401 and the second state 402 and/or the fourth state 404 and the fifth state 405 (FIGS. 4A-4D). Furthermore, the values for gainforce and gainmovement need not be constant. For example, mathematical formulas can be used to calculate variable force gain values and variable movement gain values based on various factors, such as a state of the input device 102 (e.g., the first through fifth states 401-405, respectively) or other desired parameters. - In
step 606, the navigational object is moved based on the values of Δx and Δy. - It can be noted that various functions described in
processes 500 and 600 can be performed by the input device 102, the receiving device 106, or by a combination of the two devices. For example, in accordance with one embodiment, after sensors of the input device 102 detect the lateral force and motion (e.g., step 502 or step 602), the input device 102 may generate and transmit one or more signals to the receiving device 106 indicative of the detected lateral force and motion. The receiving device 106 can then calculate the values of Δx and Δy based on the one or more signals indicative of the detected lateral force and motion. In such an embodiment, the input device 102 need not have a processor, such as processor 208 depicted in FIG. 2. In other embodiments, however, the input device 102 may calculate the values of Δx and Δy using processor 208 (FIG. 2), generate one or more signals indicative of the values of Δx and Δy, and transmit the one or more signals to the receiving device 106. - While this invention has been described in terms of several exemplary embodiments, there are many possible alterations, permutations, and equivalents of these exemplary embodiments. For example, the term "computer" does not necessarily mean any particular kind of device, combination of hardware and/or software, nor should it be considered restricted to either a multi-purpose or single-purpose device.
- It should also be noted that there are many alternative ways of implementing the methods and apparatuses of the present invention. It is therefore intended that the following appended claims be interpreted as including all such alterations, permutations, and equivalents. In addition, as used herein, the terms “computer program” and “software” can refer to any sequence of human or machine cognizable steps that are adapted to be processed by a computer. Such may be rendered in any programming language or environment including, for example, C/C++, Fortran, COBOL, PASCAL, Perl, Prolog, assembly language, scripting languages, markup languages (e.g., HTML, SGML, XML, VOXML), functional languages (e.g., APL, Erlang, Haskell, Lisp, ML, F# and Scheme), as well as object-oriented environments such as the Common Object Request Broker Architecture (CORBA), Java™ (including J2ME, Java Beans, etc.).
- Moreover, terms and phrases used in this document, and variations thereof, unless otherwise expressly stated, should be construed as open ended as opposed to limiting. As examples of the foregoing: the term "including" should be read to mean "including, without limitation" or the like; the term "example" is used to provide exemplary instances of the item in discussion, not an exhaustive or limiting list thereof; and adjectives such as "conventional," "traditional," "normal," "standard," "known" and terms of similar meaning should not be construed as limiting the item described to a given time period or to an item available as of a given time, but instead should be read to encompass conventional, traditional, normal, or standard technologies that may be available or known now or at any time in the future. Likewise, a group of items linked with the conjunction "and" should not be read as requiring that each and every one of those items be present in the grouping, but rather should be read as "and/or" unless expressly stated otherwise. Similarly, a group of items linked with the conjunction "or" should not be read as requiring mutual exclusivity among that group, but rather should also be read as "and/or" unless expressly stated otherwise. Furthermore, although items, elements or components of the invention may be described or claimed in the singular, the plural is contemplated to be within the scope thereof unless limitation to the singular is explicitly stated. The presence of broadening words and phrases such as "one or more," "at least," "but not limited to" or other like phrases in some instances shall not be read to mean that the narrower case is intended or required in instances where such broadening phrases may be absent. The use of the term "module" does not imply that the components or functionality described or claimed as part of the module are all configured in a common package.
Indeed, any or all of the various components of a module, whether control logic or other components, can be combined in a single package or separately maintained and can further be distributed across multiple locations.
Claims (30)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US12/233,502 US9851813B2 (en) | 2008-09-18 | 2008-09-18 | Force sensing for fine tracking control of mouse cursor |
Publications (2)
Publication Number | Publication Date |
---|---|
US20100066670A1 true US20100066670A1 (en) | 2010-03-18 |
US9851813B2 US9851813B2 (en) | 2017-12-26 |
Family
ID=42006780
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/233,502 Expired - Fee Related US9851813B2 (en) | 2008-09-18 | 2008-09-18 | Force sensing for fine tracking control of mouse cursor |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20120146935A1 (en) * | 2010-12-14 | 2012-06-14 | Synaptics Incorporated | System and method for determining object information using an estimated rigid motion response |
US9639187B2 (en) | 2008-09-22 | 2017-05-02 | Apple Inc. | Using vibration to determine the motion of an input device |
US9665214B2 (en) | 2012-03-29 | 2017-05-30 | Synaptics Incorporated | System and methods for determining object information using selectively floated electrodes |
US10261619B2 (en) | 2015-08-31 | 2019-04-16 | Synaptics Incorporated | Estimating force applied by an input object to a touch sensor |
US11137837B2 (en) | 2017-10-23 | 2021-10-05 | Hewlett-Packard Development Company, L.P. | Input device with precision control |
Citations (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5191641A (en) * | 1988-09-26 | 1993-03-02 | Sharp Kabushiki Kaisha | Cursor shift speed control system |
US5195179A (en) * | 1986-01-29 | 1993-03-16 | Hitachi, Ltd. | Coordinate input apparatus |
US5508719A (en) * | 1992-05-01 | 1996-04-16 | Ast Research, Inc. | Pressure-actuated pointing device |
US5999169A (en) * | 1996-08-30 | 1999-12-07 | International Business Machines Corporation | Computer graphical user interface method and system for supporting multiple two-dimensional movement inputs |
US6489948B1 (en) * | 2000-04-20 | 2002-12-03 | Benny Chi Wah Lau | Computer mouse having multiple cursor positioning inputs and method of operation |
US20040017355A1 (en) * | 2002-07-24 | 2004-01-29 | Youngtack Shim | Cursor control systems and methods |
US6894678B2 (en) * | 1997-08-23 | 2005-05-17 | Immersion Corporation | Cursor control using a tactile feedback device |
US20050110745A1 (en) * | 2003-11-25 | 2005-05-26 | International Business Machines Corporation | Controller, system and method for controlling a cursor |
US20050134556A1 (en) * | 2003-12-18 | 2005-06-23 | Vanwiggeren Gregory D. | Optical navigation based on laser feedback or laser interferometry |
US6975302B1 (en) * | 2000-06-23 | 2005-12-13 | Synaptics, Inc. | Isometric joystick usability |
US7154477B1 (en) * | 2003-09-03 | 2006-12-26 | Apple Computer, Inc. | Hybrid low power computer mouse |
US20070080940A1 (en) * | 2005-10-07 | 2007-04-12 | Sharp Kabushiki Kaisha | Remote control system, and display device and electronic device using the remote control system |
US20070290998A1 (en) * | 2006-06-08 | 2007-12-20 | Samsung Electronics Co., Ltd. | Input device comprising geomagnetic sensor and acceleration sensor, display device for displaying cursor corresponding to motion of input device, and cursor display method thereof |
US20100013768A1 (en) * | 2008-07-18 | 2010-01-21 | Apple Inc. | Methods and apparatus for processing combinations of kinematical inputs |
US20100066669A1 (en) * | 2008-09-18 | 2010-03-18 | Apple Inc. | Using measurement of lateral force for a tracking input device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: APPLE INC., CALIFORNIA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:AMM, DAVID T.;LEUNG, OMAR S.;SIGNING DATES FROM 20080912 TO 20080916;REEL/FRAME:021560/0149 |
|
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |
|
FEPP | Fee payment procedure |
Free format text: MAINTENANCE FEE REMINDER MAILED (ORIGINAL EVENT CODE: REM.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
|
LAPS | Lapse for failure to pay maintenance fees |
Free format text: PATENT EXPIRED FOR FAILURE TO PAY MAINTENANCE FEES (ORIGINAL EVENT CODE: EXP.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
|
STCH | Information on status: patent discontinuation |
Free format text: PATENT EXPIRED DUE TO NONPAYMENT OF MAINTENANCE FEES UNDER 37 CFR 1.362 |
|
FP | Lapsed due to failure to pay maintenance fee |
Effective date: 20211226 |