EP3218792A1 - System and methods for controlling a cursor based on finger pressure and direction - Google Patents

System and methods for controlling a cursor based on finger pressure and direction

Info

Publication number
EP3218792A1
Authority
EP
European Patent Office
Prior art keywords
virtual mouse
processor
touchscreen
touch
computing device
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP15801566.9A
Other languages
German (de)
French (fr)
Inventor
Junchen Du
Bo Zhou
Ning Bi
Joon Mo Koh
Jun Hyung Kwon
Homayoun Dowlat
Suhail Jalil
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Qualcomm Inc
Original Assignee
Qualcomm Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Qualcomm Inc filed Critical Qualcomm Inc
Publication of EP3218792A1

Classifications

    • G PHYSICS; G06 COMPUTING, CALCULATING OR COUNTING; G06F ELECTRIC DIGITAL DATA PROCESSING; G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements; G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03547 Touch pads, in which fingers can move on a surface (under G06F3/033 pointing devices displaced or positioned by the user and G06F3/0354 with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface)
    • G06F3/0414 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means, using force sensing means to determine a position
    • G06F3/04166 Control or interface arrangements specially adapted for digitisers: details of scanning methods, e.g. sampling time, grouping of sub areas or time sharing with display driving
    • G06F3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04842 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations: selection of displayed objects or displayed text elements
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. tap gestures based on pressure sensed by a digitiser, using a touch-screen or digitiser, e.g. input of commands through traced gestures

Definitions

  • GUI Graphical User Interface
  • Systems, methods, and devices of various embodiments may enable a computing device configured with a touchscreen to implement a virtual mouse on the touchscreen by activating the virtual mouse during single-handed use of the computing device by a user, determining a position of the virtual mouse on the touchscreen, and projecting a cursor icon onto the touchscreen using the calculated vector.
  • the projected cursor icon may be positioned to extend beyond a reach of a user's thumb or finger during single-handed use.
  • determining a position of the virtual mouse on the touchscreen may include identifying a touch area associated with a user touch event, collecting touch data from the identified touch area, determining pressure and direction parameters associated with the user touch event, and calculating a vector representing the position of the virtual mouse based on the pressure and direction parameters associated with the user touch event.
  • activating the virtual mouse may include detecting a touch event in a predetermined virtual mouse activation area of a touchscreen display of the computing device. Some embodiments may further include determining, while the virtual mouse is activated, whether a touch event is detected in the predetermined virtual mouse activation area, and deactivating the virtual mouse in response to determining that a touch event has been detected in the predetermined virtual mouse activation area while the virtual mouse is activated.
  • activating the virtual mouse may include automatically initiating activation upon detecting that the computing device is held in a manner consistent with single-handed use by the user.
  • determining the direction associated with the user touch event may be based at least in part on an orientation of a major axis of an ellipse fitted to the touch area.
  • determining the pressure parameter associated with the user touch event may be based on at least one of an area of the ellipse fitted to the touch area, and a touch pressure
  • calculating the position of the virtual mouse may include calculating a vector representing the position of the virtual mouse in which a magnitude of the calculated vector may be based at least in part on the determined pressure parameter.
  • Some embodiments may further include determining whether the user touch event has ended while the projected cursor icon is positioned over a Graphical User Interface (GUI) element displayed on the touchscreen, and executing an operation associated with the GUI element in response to determining that the user touch event has ended while the projected cursor icon is positioned over the displayed GUI element.
  • Some embodiments may further include automatically deactivating the virtual mouse after the execution of the operation associated with the GUI element.
  • Some embodiments may further include detecting whether the projected cursor icon is positioned within a threshold distance from an operable Graphical User Interface (GUI) element displayed on the touchscreen, and drawing the projected cursor icon to the operable GUI element in response to detecting that the projected cursor icon is positioned within the threshold distance. Some embodiments may further include detecting whether the projected cursor icon has moved more than a predetermined non-zero distance away from a currently-selected operable Graphical User Interface (GUI) element, and deselecting the operable GUI element in response to detecting that the cursor has moved more than the predetermined non-zero distance from the currently-selected operable GUI element.
  • Various embodiments include a computing device configured with a touchscreen and including a processor configured with processor-executable instructions to perform operations of the methods described above.
  • Various embodiments also include a non-transitory processor-readable medium on which is stored processor-executable instructions configured to cause a processor of a computing device to perform operations of the methods described above.
  • Various embodiments include a computing device having means for performing functions of the methods described above.
  • FIG. 1B is a block diagram illustrating an example system for implementing a virtual mouse system on a device according to various embodiments.
  • FIGs. 4A-4C are illustrations of an example smartphone device touchscreen display showing use of an example virtual mouse interface according to various embodiments.
  • FIGs. 6A and 6B are process flow diagrams illustrating an example method for implementing a virtual mouse according to various embodiments.
  • a virtual mouse interface (also referred to as a "virtual mouse") may mitigate the inconvenience of single-handed use of a smartphone due to a mismatch between the size of the display and the user's hand size.
  • the virtual mouse provides a cursor that may be controlled by a single finger (e.g., thumb or other finger).
  • the virtual mouse may interact with GUI elements displayed in various locations on the touchscreen display. This may include GUI elements that are not easily reachable by a finger or thumb during single-hand use.
  • the smartphone device 100 may include a power source 122 coupled to the processor 102, such as a disposable or rechargeable battery.
  • the rechargeable battery may also be coupled to the peripheral device connection port to receive a charging current from a source external to the smartphone device 100.
  • the display device 160 may generally be configured to display a graphical user interface (GUI) that enables interaction between a user of the computer system and the operating system or application running thereon.
  • GUI graphical user interface
  • the GUI may represent programs, files and operational options with graphical images.
  • the graphical images may include windows, fields, dialog boxes, menus, icons, buttons, cursors, scroll bars, etc. Such images may be arranged in predefined layouts, or may be created dynamically to serve the specific actions being taken by a user.
  • the user may select and activate various graphical images in order to initiate functions and tasks associated therewith.
  • a user may select a button that opens, closes, minimizes, or maximizes a window, or an icon that launches a particular program.
  • the touchscreen system in the various embodiments may be coupled to a touchscreen input/output (I/O) controller 162 that enables input of information from the sensor(s) 158 (e.g., touch events) and output of information to the display device 160 (e.g., GUI presentation).
  • the touchscreen I/O controller may receive information from the touch sensor(s) 158 based on the user's touch, and may send the information to specific modules configured to be executed by the general purpose processor(s) 152 in order to interpret touch events.
  • single point touches and multipoint touches may be interpreted.
  • single point touch refers to a touch event defined by interaction with a single portion of a single finger (or instrument), although the interaction could occur over time.
  • Examples of single point touch input include a simple touch (e.g., a single tap), touch-and-drag, and double-touch (e.g., a double-tap, two taps in quick succession).
  • multi-point touch may refer to a touch event defined by simultaneous interaction with multiple fingers or multiple portions of one or more fingers (or instruments).
  • the general purpose processor 152 may implement one or more program modules stored in memory 154 to identify/interpret the touch event and control various components of the smartphone.
  • a touch identification module 164 may identify events that correspond to commands for performing actions in applications 166 stored in the memory 154, modifying GUI elements shown on the display device 160, modifying data stored in memory 154, etc.
  • the touch identifier module may identify an input as a single point touch event on the touchscreen system 156.
  • a single point touch may initiate cursor tracking and/or selection.
  • cursor movement may be controlled by the user moving a single finger on a touch sensitive surface of the touchscreen system 156.
  • tracking may involve interpreting touch events by the touch identifier module 164, and generating signals for producing corresponding movement of a cursor icon on the display device 160.
  • the virtual mouse manager 168 may interpret touch events and generate signals for producing scaled movement of the cursor icon on the display device 160.
  • interpreting touch events while the virtual mouse is activated may involve extracting features from the touch data (e.g., number of touches, position and shape of touches, etc.), as well as computing parameters (e.g., touch pressure and/or best fit ellipse to touch area, etc.).
  • the touch data and computed parameters may be produced by the touchscreen I/O interface 162.
  • a cursor calculation module 170 may use the measured/sensed touch data and computed parameters obtained from the touchscreen I/O interface 162 to determine a cursor location.
  • Other functions, including filtering signals and conversion into different formats, as well as interpreting touch events when the virtual mouse is not activated, may be performed using any of a variety of additional programs/modules stored in memory 154.
  • the general purpose processor(s) 152, memory 154, and touchscreen I/O controller 162 may be included in a system-on-chip device 172.
  • the one or more subscriber identity modules (SIMs) and corresponding interface(s) may be external to the system-on-chip device 172, as well as various peripheral devices (e.g., additional input and/or output devices) that may be coupled to components of the system-on-chip device 172, such as interfaces or controllers.
  • SIMs subscriber identity modules
  • peripheral devices e.g., additional input and/or output devices
  • FIG. 2 is an illustration of conventional single-handed use of a smartphone device 200.
  • the smartphone device 200 may be similar to the smartphones 100, 150 described with reference to FIGs. 1A-1B.
  • the smartphone device 200 may be configured with a touchscreen display 220 (e.g., display device 160). Holding the smartphone device 200 in one hand 230 and interacting with the GUI displayed on the touchscreen display 220 of the smartphone device with only the thumb 240 (or other finger) of hand 230 may be a preferable mode of using the smartphone device under many circumstances.
  • the larger the touchscreen display 220, the more difficult it is to reach every corner with a single finger.
  • the upper region of the touchscreen display 220 may be especially difficult to reach with the thumb 240 (or other finger) of the hand 230 holding the smartphone device.
  • Fig. 2 illustrates a first region 250 of the touchscreen display 220 that is easily reachable by the thumb 240, and a second region 260 of the touchscreen display 220 that is difficult to reach by the thumb 240.
  • the various embodiments utilize additional inputs made available by processing touch event data generated by the touchscreen to implement a virtual mouse in order to overcome the inconveniences to single-hand use of the smartphone device caused by the mismatch between the size of the touchscreen display and the hand size.
  • the virtual mouse includes a cursor/icon that may interact with different elements of the GUI.
  • the cursor may be movable in the whole region of the touchscreen display by a thumb's corresponding rotation and movement and/or change in pressure on the touchscreen display.
  • using the cursor/icon of the virtual mouse, the user may interact with elements of the GUI on the touchscreen display that are not easily reachable in the single-handed use scenario, while keeping the thumb within the region of the touchscreen display that is easily reachable.
  • pressure may be calculated indirectly based on the area of the finger in contact with the touchscreen surface. That is, the relative size of the touch area may serve as a proxy for the touch pressure, where a larger area translates to more pressure. In this manner, instead of actual pressure measurements, the smartphone may calculate an estimated pressure based on the touch area, thereby avoiding a need for additional hardware or sensing circuitry on the device.
  • in the standard ellipse equation x²/a² + y²/b² = 1, a represents the semi-major axis and b represents the semi-minor axis of the ellipse, with the semi-major and semi-minor axes aligned with the x and y Cartesian axes and the ellipse center at the origin point (0,0).
  • FIG. 3A is a diagram showing an example ellipse function 300 corresponding to a touch area of a user's finger in various embodiments.
  • Conventional touchscreen technologies provide only the positioning (i.e., x, y coordinates) of the touch events.
  • an orientation of the touch area and a pressure associated with the touch event may be provided in addition to the position of the touch area.
  • the ellipse function 300 is fitted to an approximate touch area 310, and characterized based on a semi-major axis 320 and semi-minor axis 330.
  • a pressure associated with the touch event may also be provided.
  • the size of the touch area 310 may be used to estimate pressure because the touch area expands as the touch pressure increases when the touch event is created by an extendable object, such as a finger.
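The ellipse fit and area-based pressure estimate described above can be sketched as follows. This is a minimal illustration, assuming the touch controller exposes the activated sensor cells as (x, y) points; the moment-based fit, the 2-sigma axis scaling, and the reference contact area are assumptions rather than the patent's prescribed method.

```python
import math
import numpy as np

def fit_touch_ellipse(points):
    """Fit an ellipse to a cloud of activated touch-sensor cells.

    Returns (a, b, theta, center): semi-major axis, semi-minor axis,
    orientation of the major axis, and the touch center, derived from
    the covariance of the contact points (a moment-based fit).
    """
    pts = np.asarray(points, dtype=float)
    center = pts.mean(axis=0)
    cov = np.cov((pts - center).T)
    eigvals, eigvecs = np.linalg.eigh(cov)    # eigenvalues in ascending order
    b, a = 2.0 * np.sqrt(eigvals)             # 2-sigma semi-axes (heuristic)
    major = eigvecs[:, 1]                     # eigenvector of the largest eigenvalue
    theta = math.atan2(major[1], major[0])    # orientation of the major axis
    return a, b, theta, center

def estimate_pressure(a, b, reference_area=60.0):
    """Use the ellipse area (pi * a * b) as a proxy for touch pressure,
    normalized by a nominal light-touch contact area (assumed value)."""
    return (math.pi * a * b) / reference_area
```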
  • the virtual mouse may be displayed on the touchscreen at a location calculated based on the various touch parameters.
  • the location of the virtual mouse may be calculated as a vector calculated based on various touch properties.
  • a cursor icon (or other icon) may be displayed to represent the location of the virtual mouse.
  • touch properties used to calculate the virtual mouse location may be represented as vectors.
  • the orientation of the major axis of the best fitting ellipse may be represented by a vector f based on a direction pointing toward the top edge of the touchscreen and/or away from the virtual mouse activation area.
  • the touch position of the user's finger may be represented by a vector c from a starting or reference point to the center point of the touch area.
  • the position of the closest corner to the actual touch position may be represented by a vector r from the starting reference point to the closest corner.
  • the starting or initial reference point of vectors c and r may be the same as the projection point from which the calculated virtual mouse vector is projected out onto the touchscreen— that is, the point at the virtual mouse activation area.
  • the location of the virtual mouse may be calculated using Equation 3:
  • Virtual mouse location = c + kpf (Eq. 3)
  • where c represents a vector to the center point of the actual touch position (i.e., a position in Cartesian space), f represents a vector corresponding to the orientation of the major axis of an ellipse best fitting the boundary of the touch area, p is a pressure measurement, and k is a scaling factor chosen so that the virtual mouse can cover the entire touchscreen.
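A direct transcription of Equation 3, with c, f, p, and k as defined above. Normalizing f to a unit vector so that reach is governed entirely by the k·p term is an assumption; as noted below, the magnitude of f may alternatively be the major-axis length. Coordinates follow the usual screen convention (origin at the top-left corner, y increasing downward).

```python
import numpy as np

def virtual_mouse_location(c, f, p, k):
    """Equation 3: location = c + k * p * f.

    c : vector from the reference point to the center of the touch area
    f : orientation of the ellipse's major axis (normalized here so the
        projection distance is carried by k and p alone -- an assumption)
    p : measured or estimated touch pressure
    k : scaling factor letting the cursor reach the whole screen
    """
    c = np.asarray(c, dtype=float)
    f = np.asarray(f, dtype=float)
    f = f / np.linalg.norm(f)
    return c + k * p * f

# Example: touch centered at (540, 1500) px on a 1080x1920 screen, finger
# pointing up and to the left, moderate pressure; k is a placeholder value.
print(virtual_mouse_location((540.0, 1500.0), (-0.4, -0.9), p=1.3, k=700.0))
```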
  • FIG. 3B illustrates a representative determination of the virtual mouse location on a smartphone device 350 using Equation 3.
  • the smartphone device 350 may be similar to the smartphones 100, 150, 200 described with reference to FIGs. 1A-2.
  • the smartphone device 350 may be configured with a touchscreen display 352 (e.g., 160, 220), and a user may interact with the GUI displayed on the touchscreen display 352 with only one finger 354.
  • vector 356 provides direction and distance from an initial reference point to the center of the touch area 310 of the finger 354, corresponding to c in Equation 3. While the top left corner of the touchscreen display 352 is used as the initial reference point for the embodiment shown in FIG. 3B, the location of the initial reference point is arbitrary, as any of the corners or other points on the touchscreen display 352 may provide the initial reference point.
  • Vector 358 provides a direction representing the orientation of the major axis 340 of an ellipse (e.g., 300) best fitting the boundary of the touch area 310, corresponding to f in Equation 3.
  • the magnitude of vector 358 may be the actual length of the major axis 340. In other embodiments, the magnitude of vector 358 may be a fixed representative value similar to the scaling factor k.
  • Vector 360 on the touchscreen display 352 is a resultant vector from scaling vector 358 by a scalar (the pressure p times the scaling factor k), corresponding to kpf in Equation 3.
  • Adding vector 360 to vector 356, a resultant vector 362 provides direction and distance from the initial reference point to the virtual mouse location 363 on the touchscreen display 352. That is, vector 362 corresponds to the calculation in Equation 3 of c + kpf.
  • the location of the virtual mouse may be calculated using Equation 4:
  • Virtual mouse location = c + kp(c - r) (Eq. 4), where r represents a vector to the corner of the touchscreen closest to the actual touch location (i.e., a position in Cartesian space).
  • FIG. 3C illustrates a representative computation of the vector c - r for use in determining the virtual mouse location on the smartphone device 350 using Equation 4.
  • vector 356 provides direction and distance from an initial reference point at the top left corner of the touchscreen display 352 to the center of the touch area. Similar to Equation 3, vector 356 corresponds to c in Equation 4.
  • vector 364 provides direction and distance from an initial reference point to the corner closest to the actual touch location, corresponding to r in Equation 4. Subtracting vector 364 from vector 356 provides a resultant vector 366, which corresponds to c - r in Equation 4.
  • Vector 368 on the touchscreen display 352 is a vector resulting from scaling vector 366 by the scalar kp, corresponding to kp(c - r) in Equation 4.
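Equation 4 replaces the ellipse-orientation vector with the direction from the closest screen corner to the touch, so pressing harder pushes the cursor farther from that corner into the hard-to-reach region. A sketch follows; the corner search is an assumption about how r would be chosen (the text only states that r points to the closest corner), and k here is tuned separately from the k used with Equation 3.

```python
import numpy as np

def virtual_mouse_location_eq4(c, p, k, screen_w, screen_h):
    """Equation 4: location = c + k * p * (c - r).

    c is the vector from the reference point (top-left corner) to the
    touch center; r is the vector to the screen corner nearest the
    touch, so (c - r) points from that corner toward the touch and,
    scaled up, onward into the far region of the display.
    """
    c = np.asarray(c, dtype=float)
    corners = np.array([[0.0, 0.0], [screen_w, 0.0],
                        [0.0, screen_h], [screen_w, screen_h]])
    r = min(corners, key=lambda corner: np.linalg.norm(c - corner))
    return c + k * p * (c - r)

# Example: thumb near the bottom-right corner of a 1080x1920 screen; the
# cursor is projected up and to the left, toward the top of the display.
print(virtual_mouse_location_eq4((900.0, 1700.0), p=1.3, k=3.0,
                                 screen_w=1080, screen_h=1920))
```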
  • FIGs. 4A and 4B illustrate a smartphone device 400 in which an embodiment of the disclosure is implemented.
  • Smartphone device 400 includes a touchscreen display 410, on which a GUI is displayed.
  • a predetermined area 420 on the touchscreen display 410 may be designated as the virtual mouse activation area.
  • a user may activate the virtual mouse by touching the activation area 420 with, e.g., a thumb and maintaining the touch (e.g., by not removing the thumb).
  • the virtual mouse activation area 420 is in the bottom right corner of the touchscreen display 410.
  • the actual placement of the virtual mouse activation area may be user-customizable. For example, a user intending to operate the smartphone device 400 with the right hand may designate the bottom right corner as the virtual mouse activation area, and a user intending to operate the smartphone device 400 with the left hand may designate the bottom left corner as the virtual mouse activation area.
  • a user may additionally or alternatively activate the virtual mouse by applying a sufficient amount of force at any area on the touchscreen display 410.
  • the virtual mouse may be activated in response to detecting a touch input with an amount of pressure that is above a threshold value.
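Both activation triggers described here, a touch in the designated activation area and a press harder than a threshold anywhere on screen, reduce to a simple predicate. A sketch; the rectangle encoding and the threshold value are illustrative assumptions.

```python
def should_activate_virtual_mouse(touch_x, touch_y, pressure,
                                  activation_rect, pressure_threshold=2.0):
    """Return True if the touch should activate the virtual mouse:
    either it lands inside the predetermined activation area (e.g., the
    bottom-right corner) or it presses harder than the threshold."""
    x0, y0, x1, y1 = activation_rect
    in_area = x0 <= touch_x <= x1 and y0 <= touch_y <= y1
    return in_area or pressure >= pressure_threshold
```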
  • once the virtual mouse is activated, a cursor icon 430 may be displayed on the touchscreen display 410 to signify that the virtual mouse is active.
  • the GUI element(s) selected by the virtual mouse are indicated by the location of the cursor icon 430, which, as will be described below, may be controlled by the rotation and movement and/or pressure change of the maintained touch by, e.g., a thumb.
  • the virtual mouse may be automatically activated when a processor determines that the smartphone device 400 is being held in a hand in a manner that is consistent with single-hand use.
  • FIG. 4C illustrates a smartphone device 400 in which a virtual mouse is activated.
  • a user may activate the virtual mouse for example by touching the virtual mouse activation area with a finger 440 (e.g., a thumb) and maintaining the contact between the finger 440 and touchscreen display 410.
  • the user may wish to activate the virtual mouse when the user intends to operate GUI elements on a region of the touchscreen display 410 that is not easily reachable by the finger 440.
  • the user may control the location of the cursor icon 430 by rotating the finger 440 and changing at least one of the position of the finger 440 on the touchscreen display 410 and/or the touch pressure.
  • the location of the cursor icon 430 may be determined by evaluating the expression c + kpf (Equation 3) or c + kp(c - r) (Equation 4).
  • c is a vector representing the position of the touch area (e.g., a vector from the virtual mouse activation area or initial reference point to a center of the current touch area).
  • r is a vector representing the position of the closest corner of the touchscreen (e.g., a vector from the virtual mouse activation area or initial reference point to the corner closest to c).
  • in Equation 3, f is a vector representing the orientation of the touch area (e.g., a unit vector indicating the orientation of the touch area).
  • p is the touch pressure
  • k is a scaling factor chosen so that the user may move the cursor icon 430 to the farthest corner of the touchscreen display 410 with movements of the thumb 440 that are within the easily reachable region of the touchscreen display 410.
  • the position of the current touch area, the orientation of the current touch area, and the current touch pressure are all taken into consideration in the determination of the location of the cursor icon 430.
  • only the position and the orientation of the current touch area are taken into consideration in the determination of the location of the cursor icon 430 (i.e., p in c + kpf or c + kp(c - r) is made constant).
  • only the orientation of the current touch area and the current touch pressure are taken into consideration in the determination of the location of the cursor icon 430 (i.e., c in c + kpf is made constant).
  • the user may move the cursor icon 430 to the farthest corner of the touchscreen display 410 while keeping the thumb within the region of the touchscreen display 410 that is easily reachable.
  • the scaling factor k that may be utilized in the above virtual mouse location calculations may be calibrated to adjust the amount of change in cursor location per movement of the user's finger.
  • the user receives constant visual feedback from the touchscreen display in the form of the change in location of the displayed cursor icon. Therefore, the user may adjust the relative force and/or motion being employed by the user to achieve desired results.
  • upon first powering on, the smartphone may be configured to perform some training with a user in order to detect properties of the user's finger size and pressing activity. In this manner, the scaling factor may be adjusted to suit the individual user.
  • the smartphone may store each user-customized scaling factor for future use for the user (e.g., within a user profile), and may evolve the user's scaling factor over time as details regarding particular touch patterns are collected.
  • the manufacturer may specify preset maximum and minimum scaling factors (i.e., a scaling factor range) based on the size of the particular display and the relative size and strength of an average human touch input. While these ranges may be used initially, some embodiments provide for eventual customization of a scaling factor over time based on users, effectively replacing a generalized scaling factor with specifically developed values. Such customizations may also be made available for the sensitivity and/or speed of the virtual mouse movement, which may be changed by applying an exponential function in place of the pressure value (i.e., replacing p with p^x, where x may be configurable based on user training and/or customization over time).
  • the user may manually adjust parameters, such as the scaling factor k, the exponential function applied to the pressure p, and/or the threshold values for selecting and/or deselecting GUI elements, etc., such as via various user input mechanisms.
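The sensitivity shaping and range limits described in the two preceding items might look like the following sketch; all numeric defaults are placeholders that training or user settings would override.

```python
def effective_pressure(p, x=1.5):
    """Replace the raw pressure p with p**x, as the text suggests for
    tuning sensitivity: x > 1 amplifies firm presses so they reach
    farther, x < 1 compresses the range. The default exponent is a
    placeholder to be learned per user."""
    return p ** x

def clamp_scaling_factor(k, k_min=300.0, k_max=1200.0):
    """Hold a learned, per-user scaling factor within the manufacturer's
    preset range (placeholder bounds for a given display size)."""
    return max(k_min, min(k, k_max))
```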
  • an operation may be performed with respect to the GUI element at the location of the cursor.
  • the processor may determine that the cursor icon 430 is at the desired location on the GUI based on a decrease in velocity of the virtual mouse or pressure of the user's touch that exceeds a threshold value.
  • the operation performed when the cursor icon 430 is at the desired location may be the selection of an icon that causes an application (e.g., a game application) to be launched.
  • the operation may cause a selection of an item (e.g., selection of text, a menu item selection, etc.).
  • the operation may in some embodiments be performed in response to an additional user input with respect to the cursor icon 430.
  • Such an additional user input may include, for example, a recognized gesture by the finger (e.g., click, double click, swipe, etc.) that is received within a threshold time after the cursor icon 430 is at the desired location on the GUI.
  • the additional user input may be a gesture (e.g., click, double click, swipe, etc.) received from another of the user's fingers.
  • the additional user input that triggers performing an operation may be an increase in touch force (i.e., increase in pressure) applied by the user's finger.
  • touch force may be used to prompt performance of an operation (e.g., launching an application, etc.) provided a differentiator is used to distinguish the virtual mouse movement and the operation. For example, a brief pause in touch pressure may be used as a differentiator.
  • maintaining the cursor icon 430 in one location for a threshold amount of time may differentiate touch pressure for performing an operation from pressure used to calculate the cursor icon 430 location.
  • a user may configure one or more additional gestures that trigger the operation through settings on the smartphone device 400.
  • the operation may be performed in response to detecting termination of the movement of the cursor icon 430 (e.g., indicated by the user removing the thumb from the touchscreen display 410).
  • the processor may distinguish between the sudden decrease in touch pressure caused by the ending of the touch, which indicates that the user intends to execute a GUI operation, and the gradual change in touch pressure caused by the user intentionally changing the touch pressure in order to move the cursor icon 430, where appropriate.
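One way to realize this distinction is to threshold the rate of pressure change between consecutive touch samples; the rate value below is an assumed, tunable constant, not one specified by the text.

```python
def classify_pressure_drop(prev_p, curr_p, dt, sudden_rate=8.0):
    """Return 'release' when pressure falls faster than sudden_rate
    (pressure units per second), indicating the touch is ending and a
    GUI operation should execute; otherwise return 'adjust' for a
    gradual, intentional pressure change that steers the cursor."""
    rate = (prev_p - curr_p) / dt
    return "release" if rate > sudden_rate else "adjust"
```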
  • the processor of the smartphone may be configured such that when the cursor icon 430 is moved near an operable GUI element (i.e., within a threshold distance), such as an icon for launching an application or other item (e.g., text, menu item), the cursor icon 430 may be automatically "drawn" to the operable GUI element.
  • the operable GUI element may be enlarged and/or highlighted by the processor once the cursor icon 430 is over it to signify selection.
  • an already-selected operable GUI element (i.e., an operable GUI element over which the cursor icon 430 is located) may be deselected in response to the cursor icon 430 moving more than a predetermined distance away from the element.
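The snap-to and deselection behavior can be sketched with two distance thresholds: a small one for drawing the cursor onto a nearby operable element, and a larger one for releasing the current selection. The element objects with a `center` attribute and the pixel values are assumptions for illustration.

```python
import math
from dataclasses import dataclass

@dataclass
class GuiElement:
    center: tuple  # (x, y) position in pixels

def snap_cursor(cursor, elements, selected=None,
                snap_dist=48.0, release_dist=96.0):
    """Draw the cursor to the nearest operable GUI element within
    snap_dist, and drop the current selection once the cursor moves
    farther than release_dist from it."""
    if selected is not None and math.dist(cursor, selected.center) > release_dist:
        selected = None                          # deselect: cursor moved too far away
    if selected is None and elements:
        nearest = min(elements, key=lambda e: math.dist(cursor, e.center))
        if math.dist(cursor, nearest.center) <= snap_dist:
            selected = nearest
            cursor = nearest.center              # snap the cursor onto the element
    return cursor, selected
```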
  • the virtual mouse may be deactivated based on receiving additional user input via the GUI. For example, in an embodiment the user may deactivate the virtual mouse by moving the finger to an area (e.g., the activation area 420) on the GUI, and removing the finger from the touchscreen display 410. In another embodiment, the virtual mouse may be deactivated in response to the user removing the finger from the touchscreen display 410 while the cursor icon 430 is in an area on the GUI that is not within a threshold distance from any operable GUI element.
  • the virtual mouse may be automatically deactivated after performing an operation (e.g., selection of an application or item).
  • the user may deactivate the virtual mouse by performing a particular recognized gesture on the touchscreen display 410.
  • the processor may be configured to deactivate the virtual mouse in response to a double click, a swipe left, a swipe right, a combination thereof, etc. on the touchscreen display 410.
  • a user may preset one or more particular gestures to trigger deactivation of the virtual mouse.
  • FIG. 5 illustrates a method 500 for implementing a virtual mouse on a smartphone according to some embodiments.
  • the operations of method 500 may be implemented by one or more processors of the smartphone device (e.g., 100, 150), such as a general purpose processor (e.g., 152).
  • the operations of method 500 may be implemented by a separate controller (not shown) that may be coupled to memory (e.g., 154), the touchscreen (e.g., 115), and to the one or more processors (e.g., 110).
  • a virtual mouse may be activated by a processor of the smartphone device.
  • the virtual mouse may be activated by the processor upon detection of a touch event in the virtual mouse activation area on the touchscreen display, coupled with a continued touch contact. In other embodiments, the virtual mouse may be automatically activated by the processor upon detecting that the smartphone device is being held in a hand in a manner consistent with single-hand use. A cursor or icon may be displayed by the processor to signify the activation of the virtual mouse.
  • a location of the cursor or icon associated with the virtual mouse may be calculated or otherwise determined by the processor.
  • the location of the cursor/icon may be determined by the processor by evaluating the expression c + kpf (Equation 3) or the expression c + kp(c - r) (Equation 4), both of which yield a vector to the location of the cursor/icon (e.g., a vector from an initial reference point to the current location of the cursor icon).
  • in Equations 3 and 4, c is the position of the touch area (e.g., a vector from an initial reference point to the current touch area), r is the position of the closest corner of the touchscreen (e.g., a vector from the initial reference point to the corner closest to c), f is the orientation vector of the touch area (e.g., a unit vector indicating the orientation of the touch area), p is the touch pressure, and k is a scaling factor chosen so that the user may move the cursor icon 430 to the farthest corner of the touchscreen display 410 with movements of the thumb 440 that are within the easily reachable region of the touchscreen display 410.
  • the location of the cursor icon may be calculated or otherwise determined by the processor based at least in part on an orientation of the touch area and at least one of 1) a position of the touch area and 2) a touch pressure.
  • the calculated location of the cursor or icon is used to display a cursor or icon on the display.
  • the location of the cursor or icon on the display may be calculated continuously until the virtual mouse is deactivated by the processor in block 530.
  • the virtual mouse may be automatically deactivated by the processor after a GUI operation, such as an application launch, has been executed by the user ending the touch while the cursor icon is over an operable GUI element.
  • the virtual mouse may also be deactivated by the processor upon detecting that the user has requested a deactivation of the virtual mouse. For example, the processor may detect that the user has performed an operation indicating a deactivation of the virtual mouse (e.g., the user has moved his finger back to the virtual mouse activation area on the touchscreen display and/or ended the touch).
  • FIGs. 6A and 6B illustrate a method 600 for providing a virtual mouse according to various embodiments.
  • the operations of method 600 may be implemented by one or more processors (e.g., 110) of a smartphone (e.g., 100, 150), such as a general purpose processor (e.g., 110, 152).
  • the operations of the method 600 may be implemented by a separate controller (not shown) that may be coupled to memory (e.g., 154), the touchscreen (e.g., 115), and to the one or more processors 152.
  • a processor of the smartphone may monitor touch sensor input on the smartphone (e.g., input to the touch sensor(s) 158, received via the touchscreen I/O controller 162).
  • the processor may determine whether a trigger activating the virtual mouse is detected.
  • the processor may identify a touch area associated with the user's finger in block 606, which may be the position of the input detected on the touch-sensitive surface through touch sensor(s) (e.g., 158).
  • the processor may collect touch data in the identified touch area. For example, data may be sensed/measured by the touchscreen system 156 that includes a size and shape of the touch area, pressure being applied by the user's finger (if using a pressure-sensitive device), etc.
  • the processor may determine touch pressure and direction parameters based on information received from the touchscreen. As discussed above, in some embodiments the touch pressure may be determined as actual pressure if the smartphone is configured with a pressure-sensitive touchscreen. In other embodiments, the touch pressure may be an estimated pressure value based on calculating the area of an ellipse function fitted to the boundary of the touch area.
  • the direction parameter may be based on an orientation of a major axis of such ellipse function, or may be based on the position of the center of the touch area with reference to a closest corner of the touchscreen.
  • the processor may calculate a location of the virtual mouse based on the pressure and direction parameters.
  • the processor may display a cursor icon on the touchscreen using the calculated location.
  • the processor may determine whether the virtual mouse has been deactivated, such as by any of a number of deactivation triggers that may be configured.
  • the processor may return to block 602 and monitor sensor input on the touchscreen system in block 602. In response to determining that the virtual mouse is deactivated, the processor may also terminate displaying the icon displayed in block 614.
  • the processor may draw the projected cursor icon to the GUI element in block 619.
  • the processor may determine whether an operation input (e.g., a click, a touch release, a predefined gesture, etc.) is detected, which may be used to initiate an operation relating to that GUI element.
  • the processor may perform an action corresponding to the GUI selection in block 622, for example, opening an application on the smartphone, entering another mode, etc.
  • the processor may deselect the GUI element in block 626, and return to determination block 618 to determine whether the cursor icon is within a threshold distance of a GUI element.
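Putting these blocks together, the main loop of a method-600 style implementation might look like the sketch below, which reuses the helper sketches above; every method on the `touchscreen` and `display` objects is an assumed stand-in for the touchscreen I/O controller and GUI layer, not a real API.

```python
import math

def run_virtual_mouse(touchscreen, display, activation_rect):
    """End-to-end sketch of the virtual mouse loop described above."""
    while True:
        x, y, p = touchscreen.wait_for_touch()            # monitor sensor input (block 602)
        if not should_activate_virtual_mouse(x, y, p, activation_rect):
            continue                                      # no activation trigger detected
        while touchscreen.touch_active():                 # virtual mouse is live
            pts = touchscreen.contact_points()            # identify touch area, collect touch data
            a, b, theta, center = fit_touch_ellipse(pts)
            p = estimate_pressure(a, b)                   # pressure and direction parameters
            f = (math.cos(theta), math.sin(theta))
            cursor = virtual_mouse_location(center, f, p, k=700.0)  # calculate mouse location
            display.draw_cursor(cursor)                   # display the cursor icon (block 614)
            if touchscreen.deactivation_gesture():        # deactivation trigger detected?
                break
        display.hide_cursor()                             # stop displaying the icon
```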
  • a virtual mouse has been previously described in detail. It should be appreciated that the virtual mouse application or system, as previously described, may be implemented as software, firmware, hardware, combinations thereof, etc. In one embodiment, the previously described functions may be implemented by one or more processors (e.g., processor(s) 110) of a smartphone device 100 to achieve the desired functions (e.g., the method operations of FIGs. 5 and 6).
  • teachings herein may be incorporated into (e.g., implemented within or performed by) a variety of apparatuses (e.g., devices).
  • one or more embodiments taught herein may be incorporated into a general device, a desktop computer, a mobile computer, a mobile device, a phone (e.g., a cellular phone), a personal data assistant, a tablet, a laptop computer, an entertainment device (e.g., a music or video device), a headset (e.g., headphones, an earpiece, etc.), a medical device (e.g., a biometric sensor, a heart rate monitor, a pedometer, an electrocardiography "EKG" device, etc.), a user I/O device, a computer, a server, a point-of-sale device, a set-top box, a wearable device (e.g., watch, head mounted display, virtual reality glasses, etc.), or an electronic device within an automobile.
  • a smartphone device may include an access device (e.g., a Wi-Fi access point) for a communication system.
  • an access device may provide, for example, connectivity to another network (e.g., a wide area network such as the Internet or a cellular network) through a transceiver via a wired or wireless communication link.
  • the access device may enable another device (e.g., a Wi-Fi station) to access the other network or some other functionality.
  • the devices may be portable or, in some cases, relatively non-portable.
  • when embodiments are mobile or smartphone devices, such devices may communicate via one or more wireless communication links through a wireless network that is based on or otherwise supports any suitable wireless communication technology.
  • the smartphone device and other devices may associate with a network including a wireless network.
  • the network may include a body area network or a personal area network (e.g., an ultra-wideband network).
  • the network may include a local area network or a wide area network.
  • a smartphone device may support or otherwise use one or more of a variety of wireless communication technologies, protocols, or standards such as, for example, 3G, Long Term Evolution (LTE), LTE Advanced, 4G, Code-Division Multiple Access (CDMA), Time Division Multiple Access (TDMA), Orthogonal Frequency Division Multiplexing (OFDM), and similar technologies.
  • a smartphone device may support or otherwise use one or more of a variety of corresponding modulation or multiplexing schemes.
  • a smartphone device may thus include appropriate components (e.g., air interfaces) to establish and communicate via one or more wireless communication links using the above or other wireless communication technologies.
  • a device may include a wireless transceiver with associated transmitter and receiver components (e.g., a transmitter and a receiver) that may include various components (e.g., signal generators and signal processors) that facilitate communication over a wireless medium.
  • a smartphone device may therefore wirelessly communicate with other mobile devices, cell phones, other wired and wireless computers, Internet web-sites, etc.
  • Information and signals may be represented using any of a variety of different technologies and techniques.
  • data, instructions, commands, information, signals, bits, symbols, and chips that may be referenced throughout the above description may be represented by voltages, currents, electromagnetic waves, magnetic fields or particles, optical fields or particles, or any combination thereof.
  • DSP digital signal processor
  • ASIC application specific integrated circuit
  • FPGA field programmable gate array
  • a general-purpose processor may be a microprocessor, but in the alternative, the processor may be any conventional processor, controller, microcontroller, or state machine.
  • a processor may also be implemented as a combination of computing devices, e.g., a combination of a DSP and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a DSP core, or any other such configuration.
  • a software module may reside in Random Access Memory (RAM), flash memory, Read Only Memory (ROM), Erasable Programmable Read Only Memory (EPROM), Electrically Erasable Programmable Read Only Memory (EEPROM), registers, hard disk, a removable disk, a Compact Disc Read Only Memory (CD-ROM), or any other form of storage medium known in the art.
  • An exemplary storage medium is coupled to the processor such that the processor can read information from, and write information to, the storage medium.
  • the storage medium may be integral to the processor.
  • the processor and the storage medium may reside in an ASIC.
  • the ASIC may reside in a user terminal.
  • the processor and the storage medium may reside as discrete components in a user terminal.
  • the functions described may be implemented in hardware, software, firmware, or any combination thereof. If implemented in software as a computer program product, the functions or modules may be stored on or transmitted over as one or more instructions or code on a non-transitory computer-readable medium.
  • Computer-readable media can include both computer storage media and communication media including any medium that facilitates transfer of a computer program from one place to another.
  • a storage media may be any available media that can be accessed by a computer.
  • non-transitory computer-readable media can include RAM, ROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer. Also, any connection is properly termed a computer-readable medium.
  • Disk and disc include compact disc (CD), laser disc, optical disc, digital versatile disc (DVD), floppy disk, and Blu-ray disc, where disks usually reproduce data magnetically, while discs reproduce data optically with lasers. Combinations of the above should also be included within the scope of non-transitory computer-readable media.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Position Input By Displaying (AREA)

Abstract

Disclosed is a method and apparatus for implementing a virtual mouse. In one embodiment, the functions implemented include activating the virtual mouse, determining a location of a cursor icon associated with the virtual mouse, and deactivating the virtual mouse. In various embodiments, the position of the virtual mouse is determined by a processor based upon an orientation or position of a finger touching a touchscreen and a measured or calculated pressure applied by the finger to the touchscreen.

Description

TITLE
System and Methods for Controlling a Cursor Based on Finger Pressure and Direction

RELATED APPLICATIONS
[0001] This application claims the benefit of priority to U.S. Provisional Application No. 62/078,356 entitled "Virtual Mouse Based on Improve Touch Shape Feature" filed November 11, 2014, the entire contents of which are hereby incorporated by reference.
FIELD
[0002] The present disclosure relates generally to electronic devices. Various embodiments are related to methods for operating a Graphical User Interface (GUI) on an electronic device.
BACKGROUND
[0003] Holding a smartphone device in one hand and interacting with the Graphical User Interface (GUI) displayed on the touchscreen display of the smartphone device with only the thumb of the hand holding the smartphone device may be a preferable mode of using the smartphone device under many circumstances. However, as the size of the touchscreen display of the smartphone device increases, such single-hand use may become cumbersome or even impossible because, given the limited hand size, reaching every corner of the touchscreen display, especially the top region, with the thumb of the hand holding the device may become a challenge.
SUMMARY
[0004] Systems, methods, and devices of various embodiments may enable a computing device configured with a touchscreen to implement a virtual mouse on the touchscreen by activating the virtual mouse during single-handed use of the computing device by a user, determining a position of the virtual mouse on the touchscreen, and projecting a cursor icon onto the touchscreen using the calculated vector. In some embodiments, the projected cursor icon may be positioned to extend beyond a reach of a user's thumb or finger during single-handed use. In some embodiments, determining a position of the virtual mouse on the touchscreen may include identifying a touch area associated with a user touch event, collecting touch data from the identified touch area, determining pressure and direction parameters associated with the user touch event, and calculating a vector representing the position of the virtual mouse based on the pressure and direction parameters associated with the user touch event.
[0005] In some embodiments, activating the virtual mouse may include detecting a touch event in a predetermined virtual mouse activation area of a touchscreen display of the computing device. Some embodiments may further include determining, while the virtual mouse is activated, whether a touch event is detected in the predetermined virtual mouse activation area, and deactivating the virtual mouse in response to determining that a touch event has been detected in the predetermined virtual mouse activation area while the virtual mouse is activated.
[0006] In some embodiments, activating the virtual mouse may include automatically initiating activation upon detecting that the computing device is held in a manner consistent with single-handed use by the user. In some embodiments, determining the direction associated with the user touch event may be based at least in part on an orientation of a major axis of an ellipse fitted to the touch area. In some
embodiments, determining the pressure parameter associated with the user touch event may be based on at least one of an area of the ellipse fitted to the touch area, and a touch pressure, and calculating the position of the virtual mouse may include calculating a vector representing the position of the virtual mouse in which a magnitude of the calculated vector may be based at least in part on the determined pressure parameter.
[0007] Some embodiments may further include determining whether the user touch event has ended while the projected cursor icon is positioned over a Graphical User Interface (GUI) element displayed on the touchscreen, and executing an operation associated with the GUI element in response to determining that the user touch event has ended while the projected cursor icon is positioned over the displayed GUI element. Some embodiments may further include automatically deactivating the virtual mouse after the execution of the operation associated with the GUI element.
[0008] Some embodiments may further include detecting whether the projected cursor icon is positioned within a threshold distance from an operable Graphical User Interface (GUI) element displayed on the touchscreen, and drawing the projected cursor icon to the operable GUI element in response to detecting that the projected cursor icon is positioned within the threshold distance. Some embodiments may further include detecting whether the projected cursor icon has moved more than a predetermined non-zero distance away from a currently-selected operable Graphical User Interface (GUI) element, and deselecting the operable GUI element in response to detecting that the cursor has moved more than the predetermined non-zero distance from the currently-selected operable GUI element.
[0009] Various embodiments include a computing device configured with a
touchscreen, and including a processor configured with processor-executable instructions to perform operations of the methods described above. Various embodiments also include a non-transitory processor-readable medium on which is stored processor-executable instructions configured to cause a processor of a computing device to perform operations of the methods described above. Various embodiments include a computing device having means for performing functions of the methods described above.
BRIEF DESCRIPTION OF THE DRAWINGS
[0010] The accompanying drawings, which are incorporated herein and constitute part of this specification, illustrate exemplary embodiments, and together with the general description given above and the detailed description given below, serve to explain the features of the claims.
[0011] FIG. 1A is a block diagram illustrating a smartphone device suitable for use with various embodiments.
[0012] FIG. 1B is a block diagram illustrating an example system for implementing a virtual mouse system on a device according to various embodiments.
[0013] FIG. 2 is an illustration of conventional single-handed use of a smartphone device according to various embodiments.
[0014] FIG. 3A is a schematic diagram illustrating example touch parameters used to calculate cursor movement according to various embodiments.
[0015] FIGs. 3B and 3C are illustrations of an example smartphone device showing calculations used to determine a virtual mouse location according to various embodiments.
[0016] FIGs. 4A-4C are illustrations of an example smartphone device touchscreen display showing use of an example virtual mouse interface according to various embodiments.
[0017] FIG. 5 is a process flow diagram illustrating an example method for
implementing a virtual mouse according to various embodiments.
[0018] FIGs. 6A and 6B are process flow diagrams illustrating an example method for implementing a virtual mouse according to various embodiments.
DETAILED DESCRIPTION
[0019] The various embodiments will be described in detail with reference to the accompanying drawings. Wherever possible, the same reference numbers will be used throughout the drawings to refer to the same or like parts. References made to specific examples and implementations are for illustrative purposes, and are not intended to limit the scope of the claims.
[0020] The systems, methods, and devices of the various embodiments improve mobile device user experience by providing a virtual mouse pointer for touchscreen-enabled devices. Specifically, in various embodiments, a virtual mouse interface (also referred to as "virtual mouse") may mitigate the inconvenience of single-handed use of a smartphone due to a mismatch between the size of the display and the user's hand size. The virtual mouse provides a cursor that may be controlled by a single finger (e.g., thumb or other finger). The virtual mouse may interact with GUI elements displayed in various locations on the touchscreen display. This may include GUI elements that are not easily reachable by a finger or thumb during single-hand use.
[0021] In operation, a user may activate the virtual mouse, for example, by tapping a portion of a touchscreen corresponding to a GUI element representing the virtual mouse (e.g., a virtual mouse icon) displayed on the touchscreen. When the virtual mouse is activated, a cursor icon may be displayed by the touchscreen. The displayed cursor icon may indicate the position of the virtual mouse with reference to GUI elements. Properties of a user's finger or thumb on the touchscreen may be calculated by a processor of the smartphone. A processor using signals received from the touchscreen may calculate the touch pressure and orientation of the user's finger (where orientation refers to the angular placement of the user's finger). The position of the virtual mouse may be determined based at least in part on the calculated touch pressure and orientation of the user's finger. In some embodiments, the position of the virtual mouse may be calculated as a vector extending from a center point of the portion of the touchscreen touched by the finger to a distal position on the touchscreen. The vector may have a length or magnitude calculated based on the calculated touch pressure. The vector may have an angular orientation based on the calculated orientation of the finger. The cursor icon may be positioned on the touchscreen display at the distal end of the calculated vector. When the virtual mouse is near a GUI element that is selectable, the cursor icon may be drawn to the GUI element (e.g., an icon), which may be simultaneously enlarged and/or highlighted within the GUI displayed on the touchscreen. The GUI element may be selected by physically lifting the finger off the touchscreen (i.e., away from the smartphone). Lifting the finger from the touchscreen when the cursor is on the object may prompt the processor of the smartphone to launch an associated application or other action. The user may also deactivate the virtual mouse by moving the finger back to the virtual mouse icon (i.e., returning to the portion of a touchscreen corresponding to the GUI element representing the virtual mouse).
[0022] As used herein, the terms "smartphone device," "smartphone," and "mobile computing device" refer to any of a variety of mobile computing devices of a size in which single handed operation is possible, such as cellular telephones, tablet computers, personal data assistants (PDAs), wearable devices (e.g., watches, head mounted displays, virtual reality glasses, etc.), palm-top computers, notebook computers, laptop computers, wireless electronic mail receivers and cellular telephone receivers, multimedia Internet enabled cellular telephones, multimedia enabled smartphones (e.g., Android® and Apple iPhone®), and similar electronic devices that include a programmable processor, memory, and a touchscreen display/user interface. FIG. 1A is a component diagram of a mobile computing device that may be adapted for a virtual mouse. Smartphones are particularly suitable for implementing the various embodiments, and therefore are used as examples in the figures and the descriptions of various embodiments. However, the claims are not intended to be limited to smartphones unless explicitly recited and encompass any mobile computing device of a size suitable for single handed use.
[0023] Smartphone device 100 is shown comprising hardware elements that can be electrically coupled via a bus 105 (or may otherwise be in communication, as appropriate). The hardware elements may include one or more processor(s) 110, including without limitation one or more general-purpose processors and/or one or more special-purpose processors (such as digital signal processing chips, graphics acceleration processors, and/or the like); one or more input devices, which include a touchscreen 115, and further include without limitation a mouse, a keyboard, keypad, camera, microphone and/or the like; and one or more output devices 120, which include without limitation an interface 120 (e.g., a universal serial bus (USB)) for coupling to external output devices, a display device, a speaker 116, a printer, and/or the like.
[0024] The smartphone device 100 may further include (and/or be in communication with) one or more non-transitory storage devices 125, which can include, without limitation, local and/or network accessible storage, and/or can include, without limitation, a disk drive, a drive array, an optical storage device, solid-state storage device such as a random access memory ("RAM") and/or a read-only memory ("ROM"), which can be programmable, flash-updateable, and/or the like. Such storage devices may be configured to implement any appropriate data stores, including without limitation, various file systems, database structures, and/or the like.
[0025] The smartphone device 100 may also include a communications subsystem 130, which can include without limitation a modem, a network card (wireless or wired), an infrared communication device, a wireless communication device and/or chipset (such as a Bluetooth device, an 802.11 device, a Wi-Fi device, a WiMAX device, cellular communication facilities, etc.), and/or the like. The communications subsystem 130 may permit data to be exchanged with a network, other devices, and/or any other devices described herein. In one embodiment, the device 100 may further include a memory 135, which may include a RAM or ROM device, as described above. The smartphone device 100 may be a mobile device or a non-mobile device, and may have wireless and/or wired connections.
[0026] The smartphone device 100 may include a power source 122 coupled to the processor 102, such as a disposable or rechargeable battery. The rechargeable battery may also be coupled to the peripheral device connection port to receive a charging current from a source external to the smartphone device 100.
[0027] The smartphone device 100 may also include software elements, shown as being currently located within the working memory 135, including an operating system 140, device drivers, executable libraries, and/or other code, such as one or more application programs 145, which may include or may be designed to implement methods, and/or configure systems, provided by embodiments, as will be described herein. Merely by way of example, one or more procedures described with respect to the method(s) discussed below may be implemented as code and/or instructions executable by the smartphone device 100 (and/or processor(s) 110 within the smartphone device 100). In an embodiment, such code and/or instructions can be used to configure and/or adapt a general purpose computer (or other device) to perform one or more operations in accordance with the described methods.
[0028] A set of these instructions and/or code may be stored on a non-transitory computer-readable storage medium, such as the storage device(s) 125 described above. In some cases, the storage medium may be incorporated within a device, such as the smartphone device 100. In other embodiments, the storage medium might be separate from a device (e.g., a removable medium, such as a compact disc), and/or provided in an installation package, such that the storage medium can be used to program, configure, and/or adapt a general purpose computer with the
instructions/code stored thereon. These instructions may take the form of executable code, which is executable by the smartphone device 100 and/or may take the form of source and/or installable code, which, upon compilation and/or installation on the smartphone device 100 (e.g., using any of a variety of generally available compilers, installation programs, compression/decompression utilities, etc.), then takes the form of executable code. Application programs 145 may include one or more applications adapted for a virtual mouse. It should be appreciated that the functionality of the applications may be alternatively implemented in hardware or different levels of software, such as an operating system (OS) 140, a firmware, a computer vision module, etc.
[0029] FIG. 1B is a functional block diagram of a smartphone 150 showing elements that may be used for implementing a virtual mouse interface according to various embodiments. According to various embodiments, the smartphone 150 may be similar to the smartphone device 100 described with reference to FIG. 1A. As shown, the smartphone 150 includes at least one controller, such as general purpose processor(s) 152 (e.g., 110), which may be coupled to at least one memory 154 (e.g., 135). The memory 154 may be a non-transitory tangible computer readable storage medium that stores processor-executable instructions. The memory 154 may store the operating system (OS) 140, as well as user application software and executable instructions.
[0030] The smartphone 150 may also include a touchscreen 115 (also referred to as a "touchscreen system" and/or "touchscreen display") that includes one or more touch sensor(s) 158 and a display device 160. The touch sensor(s) 158 may be configured to sense the touch contact caused by the user with a touch-sensitive surface. For example, the touch-sensitive surface may be based on capacitive sensing, optical sensing, resistive sensing, electric field sensing, surface acoustic wave sensing, pressure sensing and/or other technologies. In some embodiments, the touchscreen system 156 may be configured to recognize touches, as well as the position and magnitude of touches on the touch sensitive surface.
[0031] The display device 160 may be a light emitting diode (LED) display, a liquid crystal display (LCD) (e.g., active matrix, passive matrix) and the like. Alternatively, the display device 160 may be a monitor such as a monochrome display, color graphics adapter (CGA) display, enhanced graphics adapter (EGA) display, variable-graphics-array (VGA) display, super VGA display, cathode ray tube (CRT), and the like. The display device may also correspond to a plasma display or a display implemented with electronic inks.

[0032] In various embodiments, the display device 160 may generally be configured to display a graphical user interface (GUI) that enables interaction between a user of the computer system and the operating system or application running thereon. The GUI may represent programs, files and operational options with graphical images. The graphical images may include windows, fields, dialog boxes, menus, icons, buttons, cursors, scroll bars, etc. Such images may be arranged in predefined layouts, or may be created dynamically to serve the specific actions being taken by a user. During operation, the user may select and activate various graphical images in order to initiate functions and tasks associated therewith. By way of example, a user may select a button that opens, closes, minimizes, or maximizes a window, or an icon that launches a particular program.
[0033] The touchscreen system in the various embodiments may be coupled to a touchscreen input/output (I/O) controller 162 that enables input of information from the sensor(s) 158 (e.g., touch events) and output of information to the display device 160 (e.g., GUI presentation). In various embodiments, the touchscreen I/O controller may receive information from the touch sensor(s) 158 based on the user's touch, and may send the information to specific modules configured to be executed by the general purpose processor(s) 152 in order to interpret touch events. In various embodiments, single point touches and multipoint touches may be interpreted. The term "single point touch" as used herein refers to a touch event defined by interaction with a single portion of a single finger (or instrument), although the interaction could occur over time. Examples of single point touch input include a simple touch (e.g., a single tap), touch-and-drag, and double-touch (e.g., a double-tap— two taps in quick succession). A "multi-point touch" may refer to a touch event defined by
combinations of different fingers or finger parts.
[0034] In various embodiments, the smartphone may include other input/output (I/O) devices that, in combination with or independent of the touchscreen system 156, may be configured to transfer data into the smartphone. For example, the touchscreen I/O controller 162 may be used to perform tracking and to make selections with respect to the GUI on the display device, as well as to issue commands. Such commands may be associated with zooming, panning, scrolling, paging, rotating, sizing, etc. Further, the commands may also be associated with launching a particular program, opening a file or document, viewing a menu, making a selection, executing instructions, logging onto the computer system, loading a user profile associated with a user's preferred arrangement, etc. In some embodiments such commands may involve triggering activation of a virtual mouse manager, discussed in further detail below.
[0035] When touch input is received through the touchscreen I/O controller 162, the general purpose processor 152 may implement one or more program modules stored in memory 154 to identify/interpret the touch event and control various components of the smartphone. For example, a touch identification module 164 may identify events that correspond to commands for performing actions in applications 166 stored in the memory 154, modifying GUI elements shown on the display device 160, modifying data stored in memory 154, etc. In some embodiments, the touch identifier module may identify an input as a single point touch event on the touchscreen system 156.
[0036] In some embodiments, the touch input may be identified as triggering activation of a virtual mouse, for example, based on the position of a cursor in proximity to a GUI element (e.g., an icon) representing the virtual mouse. Once activated, control of the cursor in the smartphone may be passed to a virtual mouse manager 168. In various embodiments, the virtual mouse manager 168 may be a program module stored in memory 154, which may be executed by one or more controller (e.g., general purpose processor(s) 152).
[0037] In various embodiments, a single point touch may initiate cursor tracking and/or selection. During tracking, cursor movement may be controlled by the user moving a single finger on a touch sensitive surface of the touchscreen system 156. When the virtual mouse is not active, such tracking may involve interpreting touch events by the touch identifier module 164, and generating signals for producing corresponding movement of a cursor icon on the display device 160.
[0038] While the virtual mouse is active, the virtual mouse manager 168 may interpret touch events and generate signals for producing scaled movement of the cursor icon on the display device 160. In various embodiments, interpreting touch events while the virtual mouse is activated may involve extracting features from the touch data (e.g., number of touches, position and shape of touches, etc.), as well as computing parameters (e.g., touch pressure and/or best fit ellipse to touch area, etc.). In various embodiments, such touch data and computing parameters may be computed by the touchscreen I/O interface 162. Further, a cursor calculation module 170 may use the measured/sensed touch data and computing parameters obtained from the touchscreen I/O interface 162 to determine a cursor location. Other functions, including filtering signals and conversion into different formats, as well as interpreting touch events when the virtual mouse is not activated, may be performed using any of a variety of additional programs/modules stored in memory 154.
[0039] In some embodiments, the general purpose processor(s) 152, memory 154, and touchscreen I/O controller 162 may be included in a system-on-chip device 172. The one or more subscriber identity modules (SIMs) and corresponding interface(s) may be external to the system-on-chip device 172, as well as various peripheral devices (e.g., additional input and/or output devices) that may be coupled to components of the system-on-chip device 172, such as interfaces or controllers.
[0040] Holding a smartphone device in one hand and interacting with the GUI displayed on the touchscreen display of the smartphone device with only the thumb of the hand holding the smartphone device may be a preferable mode of using the smartphone device under many circumstances. However, as the sizes of the touchscreen displays of smartphone devices increase, such single-hand use may become cumbersome or even impossible. Reaching all portions of the touchscreen display, especially the top region of the touchscreen display, with the thumb or other finger of the hand holding the device may become a challenge, especially for users with small hands.
[0041] Fig. 2 is an illustration of conventional single-handed use of a smartphone device 200. According to various embodiments, the smartphone device 200 may be similar to the smartphones 100, 150 described with reference to FIGs. 1A-1B. The smartphone device 200 may be configured with a touchscreen display 220 (e.g., display device 160). Holding the smartphone device 200 in one hand 230 and interacting with the GUI displayed on the touchscreen display 220 of the smartphone device with only the thumb 240 (or other finger) of hand 230 may be a preferable mode of using the smartphone device under many circumstances. However, the larger the touchscreen display 220, the more difficult it is to reach every corner with a single finger. The upper region of the touchscreen display 220 may be especially difficult to reach with the thumb 240 (or other finger) of the hand 230 holding the smartphone device. For example, Fig. 2 illustrates a first region 250 of the touchscreen display 220 that is easily reachable by the thumb 240, and a second region 260 of the touchscreen display 220 that is difficult to reach by the thumb 240.
[0042] The various embodiments utilize additional inputs made available by processing touch event data generated by the touchscreen to implement a virtual mouse in order to overcome the inconveniences to single-hand use of the smartphone device caused by the mismatch between the size of the touchscreen display and the hand size. The virtual mouse includes a cursor/icon that may interact with different elements of the GUI. The cursor may be movable in the whole region of the touchscreen display by a thumb's corresponding rotation and movement and/or change in pressure on the touchscreen display. With a smartphone device that implements embodiments of the disclosure, the user may interact with elements of the GUI on the touchscreen display that are not easily reachable in the single-handed use scenario using the cursor/icon of the virtual mouse while keeping the thumb within the region of the touchscreen display that is easily reachable.

[0043] The virtual mouse may be controlled by any of a number of properties associated with a user's single-point touch. In various embodiments, such properties may be determined using a plurality of mechanisms, depending on the particular configurations, settings, and capabilities of the smartphone. The virtual mouse may be implemented by projecting a cursor icon onto the touchscreen at a location calculated based on data from the touchscreen. The location may, for example, be calculated based on an orientation and pressure of the touch determined from the data. For example, in some embodiments, the smartphone may be configured with a pressure-sensitive touchscreen capable of measuring actual touch pressure. Such a pressure-sensitive touchscreen may utilize a combination of capacitive touch and infrared light sensing to determine the touch force. In other embodiments, pressure may be calculated indirectly based on the area of the finger in contact with the touchscreen surface. That is, the relative size of the touch area may serve as a proxy for the touch pressure, where a larger area translates to more pressure. In this manner, instead of actual pressure measurements, the smartphone may calculate an estimated pressure based on the touch area, thereby avoiding a need for additional hardware or sensing circuitry on the device.
[0044] The direction of a user's touch may be determined based on the orientation of the major axis of an ellipse that is approximated by the touch area. Alternatively, the direction may be determined based on a line or vector originating from the closest corner of the screen and extending through the touch position.
[0045] In some embodiments, the touch direction may be determined based on calculations from the shape of an ellipse approximated by the touch area boundary. Alternatively, the direction may be determined based on the center of the touch area with respect to the closest corner of the touchscreen.
[0046] While calculation of the location of the cursor may occur during
implementation, various equations referred to in the various embodiments may not be calculated during implementation of the invention, but rather provide models that describe relationships between components of the invention implementation. As discussed above, when the virtual mouse is activated, the properties of input to the touchscreen may be determined by sensing/measuring data of a touch area associated with the user's finger (e.g., thumb) on the touchscreen (i.e., "touch data"). In various embodiments, such touch data may include the location of points forming the boundary of the touch area, and a center of the touch area. In some embodiments, the properties derived from the touch data may include an ellipse function that best fits the boundary of the touch area, and which may be identified using a nonlinear regression analysis. For example, a best fitting ellipse may be defined using Equation 1:

x²/a² + y²/b² = 1 Eq. 1

where a represents the semi-major axis and b represents the semi-minor axis of the ellipse, with the semi-major and semi-minor axes aligning on x and y Cartesian axes in which the ellipse center is at the origin point (0,0).
[0047] In various embodiments, the major axis of the best fitting ellipse function may be determined by solving for a, where the major axis is equal to 2a. Further, an estimated pressure based on the size of the touch area may be determined by calculating the area of the best fitting ellipse using Equation 2:
Area = π * ab Eq. 2 where a represents the semi-major axis and b represents the semi-minor axis of the ellipse.
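For illustration only, the following Python sketch shows one way the ellipse parameters in Equations 1 and 2 might be approximated from sampled boundary points of the touch area. It uses second-order moments rather than the nonlinear regression mentioned above, and names such as fit_touch_ellipse are hypothetical:

    import math

    def fit_touch_ellipse(boundary_points):
        """Approximate a best-fit ellipse for a touch-area boundary.

        boundary_points: non-empty list of (x, y) screen coordinates
        sampled on the boundary of the touch area. Returns (a, b, theta):
        semi-major axis, semi-minor axis, and major-axis orientation
        (radians from the positive x-axis).
        """
        n = len(boundary_points)
        cx = sum(x for x, _ in boundary_points) / n
        cy = sum(y for _, y in boundary_points) / n

        # Second-order central moments of the boundary samples.
        sxx = sum((x - cx) ** 2 for x, _ in boundary_points) / n
        syy = sum((y - cy) ** 2 for _, y in boundary_points) / n
        sxy = sum((x - cx) * (y - cy) for x, y in boundary_points) / n

        # Eigenvalues of the 2x2 covariance matrix; for points sampled
        # uniformly on an ellipse, the variance along each principal
        # axis equals (semi-axis length)^2 / 2.
        trace, det = sxx + syy, sxx * syy - sxy ** 2
        disc = math.sqrt(max(trace ** 2 / 4 - det, 0.0))
        a = math.sqrt(2 * (trace / 2 + disc))
        b = math.sqrt(2 * max(trace / 2 - disc, 0.0))
        theta = 0.5 * math.atan2(2 * sxy, sxx - syy)
        return a, b, theta

    def estimated_pressure(a, b):
        """Equation 2: the ellipse area serves as a proxy for pressure."""
        return math.pi * a * b

The area returned by estimated_pressure may be normalized (e.g., divided by a maximum expected contact area) before being used as the pressure value p in the location equations below.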
[0048] Fig. 3A is a diagram showing an example ellipse function 300 corresponding to a touch area of a user's finger in various embodiments. Conventional touchscreen technologies provide only the positioning (i.e., x, y coordinates) of the touch events. In various embodiments, for each touch event, an orientation of the touch area and a pressure associated with the touch event may be provided in addition to the position of the touch area. The ellipse function 300 is fitted to an approximate touch area 310, and characterized based on a semi-major axis 320 and semi-minor axis 330. In addition to the position of the touch area 310, an orientation of the touch area 310 may be determined as an angle 312 between the positive x-axis and a line segment corresponding to the major axis 340 of the touch area 310. Utilizing the orientation of the major axis to establish touch direction and assuming that the user holds the smartphone device from the edge located closest to the bottom of the touchscreen, the cursor icon may be positioned along a line that is projected out toward the point on the major axis of the ellipse that is closest to the top of the touchscreen. Therefore, as shown with respect to the touch area 310, using the left hand may provide an angle 312 that is between 0 degrees (i.e., finger completely horizontal) and 90 degrees (i.e., finger completely vertical). In embodiments using the right hand (not shown), the angle 312 may be between 90 degrees (i.e., finger completely vertical) and 180 degrees (i.e., finger completely horizontal).
[0049] Furthermore, a pressure associated with the touch event may also be provided. In some embodiments, the size of the touch area 310 may be used to estimate pressure because the touch area expands as the touch pressure increases when the touch event is created by an extendable object, such as a finger.
[0050] The virtual mouse may be displayed on the touchscreen at a location calculated based on the various touch parameters. In some embodiments, the location of the virtual mouse may be calculated as a vector based on various touch properties. A cursor icon (or other icon) may be displayed to represent the location of the virtual mouse.
[0051] In various embodiments, touch properties used to calculate the virtual mouse location may be represented as vectors. For example, the orientation of the major axis of the best fitting ellipse may be represented by a vector f based on a direction pointing toward the top edge of the touchscreen and/or away from the virtual mouse activation area. In another example, the touch position of the user's finger may be represented by a vector c from a starting or reference point to the center point of the touch area. Similarly, the position of the closest corner to the actual touch position may be represented by a vector r from the starting reference point to the closest corner. In various embodiments, the starting or initial reference point of vectors c and r may be the same as the projection point from which the calculated virtual mouse vector is projected out onto the touchscreen, that is, the point at the virtual mouse activation area.
[0052] In some embodiments the location of the virtual mouse may be calculated using Equation 3 :
Virtual mouse location = c + kpf Eq. 3 where c represents a vector to the center point of the actual touch position (i.e., a position in Cartesian space), f represents a vector corresponding to the orientation of the major axis of an ellipse best fitting the boundary of the touch area, p is a pressure measurement, and k is a scaling factor so that the virtual mouse covers the entire touchscreen.
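As an illustrative sketch only (not the claimed method), Equation 3 might be evaluated as follows, assuming the orientation angle from FIG. 3A has already been measured and that screen coordinates place the origin at the top left with y increasing downward; the function name is hypothetical:

    import math

    def virtual_mouse_location(c, theta, p, k):
        """Equation 3: virtual mouse location = c + k * p * f.

        c:     (x, y) center of the touch area, in screen pixels
        theta: major-axis orientation, radians from the positive x-axis
        p:     measured or estimated (normalized) touch pressure
        k:     scaling factor so the cursor can reach the whole screen
        """
        # Unit vector f along the major axis; the y component is negated
        # so the vector points toward the top edge of the display.
        f = (math.cos(theta), -math.sin(theta))
        return (c[0] + k * p * f[0], c[1] + k * p * f[1])

    # Example: thumb centered at (300, 900) px, oriented 60 degrees from
    # horizontal, pressure 0.8, scaling factor 400.
    cursor = virtual_mouse_location((300, 900), math.radians(60), 0.8, 400)

Higher pressure lengthens the vector, moving the cursor farther from the thumb, while rotating the finger sweeps the cursor across the screen.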
[0053] FIG. 3B illustrates a representative determination of the virtual mouse location on a smartphone device 350 using Equation 3. According to various embodiments, the smartphone device 350 may be similar to the smartphones 100, 150, 200 described with reference to FIGs. 1A-2. The smartphone device 350 may be configured with a touchscreen display 352 (e.g., 160, 220), and a user may interact with the GUI displayed on the touchscreen display 352 with only one finger 354. On the
touchscreen display 352, vector 356 provides direction and distance from an initial reference point to the center of the touch area 310 of the finger 354, corresponding to c in Equation 3. While the top left corner of the touchscreen display 352 is used as the initial reference point for the embodiment shown in FIG. 3B, the location of the initial reference point is arbitrary, as any of the corners or other points on the touchscreen display 352 may provide the initial reference point. Vector 358 provides a direction representing the orientation of the major axis 340 of an ellipse (e.g., 300) best fitting the boundary of the touch area 310, corresponding to f in Equation 3. In some embodiments, the magnitude of vector 358 may be the actual length of the major axis 340. In other embodiments, the magnitude of vector 358 may be a fixed representative value similar to the scaling factor k.
[0054] Vector 360 on the touchscreen display 352 is a resultant vector from
multiplying vector 358 by a scalar, corresponding to kpf in Equation 3. Adding vector 360 to vector 356, a resultant vector 362 provides direction and distance from the initial reference point to the virtual mouse location 363 on the touchscreen display 352. That is, vector 362 corresponds to the calculation in Equation 3 of c + kpf.
[0055] In other embodiments, the location of the virtual mouse may be calculated using Equation 4:
Virtual mouse location = c + kp(c - r) Eq. 4 where r represents a vector to the corner of the touchscreen closest to the actual touch location (i.e., a position in Cartesian space).
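A corresponding sketch for Equation 4, under the same assumptions as the sketch above (illustrative only; the function name is hypothetical):

    def virtual_mouse_location_corner(c, r, p, k):
        """Equation 4: virtual mouse location = c + k * p * (c - r).

        c: (x, y) center of the touch area
        r: (x, y) corner of the touchscreen closest to the touch
        p: touch pressure; k: scaling factor
        """
        dx, dy = c[0] - r[0], c[1] - r[1]  # direction away from the corner
        return (c[0] + k * p * dx, c[1] + k * p * dy)

Here the direction is derived from the touch position relative to the nearest corner rather than from the ellipse orientation, so increasing pressure pushes the cursor farther away from the corner the user is holding.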
[0056] FIG. 3C illustrates a representative computation of a vector c - r for use in determining the virtual mouse location on the smartphone device 350 using Equation 4. As described with respect to FIG. 3B, vector 356 provides direction and distance from an initial reference point at the top left corner of the touchscreen display 352 to the center of the touch area. Similar to Equation 3, vector 356 corresponds to c in Equation 4. On the touchscreen display 352 in FIG. 3C, vector 364 provides direction and distance from an initial reference point to the corner closest to the actual touch location, corresponding to r in Equation 4. Subtracting vector 364 from vector 356 provides a resultant vector 366, which corresponds to c - r in Equation 4.
[0057] Vector 368 on the touchscreen display 352 is a vector resulting from
multiplying vector 366 by a scalar and translating its position, corresponding to kp(c - r) in Equation 4. Adding vector 368 to vector 356 results in vector 370, which provides direction and distance from the initial reference point to the virtual mouse location 372 on the touchscreen display 352. That is, vector 370 corresponds to the calculation in Equation 4 of c + kp(c - r).

[0058] FIGs. 4A and 4B illustrate a smartphone device 400 in which an embodiment of the disclosure is implemented. Smartphone device 400 includes a touchscreen display 410, on which a GUI is displayed. In various embodiments, a predetermined area 420 on the touchscreen display 410 may be designated as the virtual mouse activation area. As will be described in detail below, a user may activate the virtual mouse by touching the activation area 420 with, e.g., a thumb and maintaining the touch (e.g., by not removing the thumb). In FIGs. 4A and 4B, the virtual mouse activation area 420 is in the bottom right corner of the touchscreen display 410. In some embodiments, the actual placement of the virtual mouse activation area may be user-customizable. For example, a user intending to operate the smartphone device 400 with the right hand may designate the bottom right corner as the virtual mouse activation area, and a user intending to operate the smartphone device 400 with the left hand may designate the bottom left corner as the virtual mouse activation area. In some embodiments, a user may additionally or alternatively activate the virtual mouse by applying a sufficient amount of force at any area on the touchscreen display 410. For example, the virtual mouse may be activated in response to detecting a touch input with an amount of pressure that is above a threshold value.
[0059] Once the virtual mouse is activated, a cursor icon 430 may be displayed on the touchscreen display 410 to signify the same. The GUI element(s) selected by the virtual mouse are indicated by the location of the cursor icon 430, which, as will be described below, may be controlled by the rotation and movement and/or pressure change of the maintained touch by, e.g., a thumb. In some embodiments, the virtual mouse may be automatically activated when a processor determines that the
smartphone device 400 is being held in a hand in a manner that is consistent with single-hand use.
[0060] Fig. 4C illustrates a smartphone device 400 in which a virtual mouse is activated. As described above, a user may activate the virtual mouse for example by touching the virtual mouse activation area with a finger 440 (e.g., a thumb) and maintaining the contact between the finger 440 and touchscreen display 410. The user may wish to activate the virtual mouse when the user intends to operate GUI elements on a region of the touchscreen display 410 that is not easily reachable by the finger 440. Once the virtual mouse is activated and a cursor icon 430 is displayed, the user may control the location of the cursor icon 430 by rotating the finger 440 and changing at least one of the position of the finger 440 on the touchscreen display 410 and/or the touch pressure. In some embodiments, the location of the cursor icon 430 (e.g., an end point of a vector from the virtual mouse activation area to the current location of the cursor icon 430) may be determined by evaluating the expression c + kpf (Equation 3) or c + kp(c - r) (Equation 4). As previously noted, in Equations 3 and 4, c is a vector representing the position of the touch area (e.g., a vector from the virtual mouse activation area or initial reference point to a center of the current touch area). As previously noted, in Equation 4 r is a vector representing the position of the closest corner of the touchscreen (e.g., a vector from the virtual mouse activation area or initial reference point to the corner closest to c). As previously noted, in Equation
3, f is a vector representing the orientation of the touch area (e.g., a unit vector indicating the orientation of the touch area). As previously noted, in Equations 3 and
4, p is the touch pressure, and k is a scaling factor chosen so that the user may move the cursor icon 430 to the farthest corner of the touchscreen display 410 with movements of the thumb 440 that are within the easily reachable region of the touchscreen display 410.
[0061] Therefore, in an example embodiment, the position of the current touch area, the orientation of the current touch area, and the current touch pressure are all taken into consideration in the determination of the location of the cursor icon 430. In another embodiment, only the position and the orientation of the current touch area are taken into consideration in the determination of the location of the cursor icon 430 (i.e., p in c + kpf or c + kp(c - r) is made constant). In yet another embodiment, only the orientation of the current touch area and the current touch pressure are taken into consideration in the determination of the location of the cursor icon 430 (i.e., c in c + kpf is made constant). In all embodiments, the user may move the cursor icon 430 to the farthest corner of the touchscreen display 410 while keeping the thumb within the region of the touchscreen display 410 that is easily reachable.
[0062] In some embodiments, the scaling factor k that may be utilized in the above virtual mouse location calculations may be calibrated to adjust the amount of change in cursor location per movement of the user's finger. In some embodiments, the user receives constant visual feedback from the touchscreen display in the form of the change in location of the displayed cursor icon. Therefore, the user may adjust the relative force and/or motion being employed by the user to achieve desired results. In some embodiments, upon first powering on, the smartphone may be configured to perform some training with a user in order to detect properties of the user's finger size and pressing activity. In this manner, the scaling factor may be adjusted to
accommodate the relative input characteristics of each user.
[0063] The smartphone may store each user-customized scaling factor for future use for the user (e.g., within a user profile), and may evolve the user's scaling factor over time as details regarding particular touch patterns are collected. In some
embodiments, the manufacturer may specify preset maximum and minimum scaling factors (i.e., a scaling factor range) based on the size of the particular display and the relative size and strength of an average human touch input. While these ranges may be used initially, some embodiments provide for eventual customization of a scaling factor over time based on users, effectively replacing a generalized scaling factor with specifically developed values. Such customizations may also be made available for the sensitivity and/or speed of the virtual mouse movement, which may be changed by applying an exponential function in place of the pressure value (i.e., replacing p with pˣ, where x may be configurable based on user training and/or customization over time). In some embodiments, the user may manually adjust parameters, such as the scaling factor k, the exponential function applied to the pressure p, and/or the threshold values for selecting and/or deselecting GUI elements, etc., such as via various user input mechanisms.
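A minimal sketch of such a sensitivity curve, assuming p has been normalized to the range [0, 1] (the default exponent is illustrative only):

    def shaped_pressure(p, x=1.5):
        """Replace p with p**x to tune virtual mouse sensitivity.

        With p normalized to [0, 1], an exponent x > 1 damps small
        pressures for finer control near the finger while preserving
        full-screen reach at maximum pressure; x might be learned from
        user training or adjusted manually.
        """
        return p ** x

The shaped value would then be substituted for p in Equation 3 or Equation 4.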
[0064] In some embodiments, once the cursor icon 430 is at the desired location on the GUI, an operation may be performed with respect to the GUI element at the location of the cursor. In some embodiments, the processor may determine that the cursor icon 430 is at the desired location on the GUI based on a decrease in velocity of the virtual mouse or pressure of the user's touch that exceeds a threshold value.
[0065] In some embodiments, the operation performed when the cursor icon 430 is at the desired location may be the selection of an icon that causes an application (e.g., a game application) to be launched. In another example, the operation may cause a selection of an item (e.g., selection of text, a menu item selection, etc.). The operation may in some embodiments be performed in response to an additional user input with respect to the cursor icon 430. Such an additional user input may include, for example, a recognized gesture by the finger (e.g., click, double click, swipe, etc.) that is received within a threshold time after the cursor icon 430 is at the desired location on the GUI. In another example, the additional user input may be a gesture (e.g., click, double click, swipe, etc.) received from another of the user's fingers.
[0066] In another example, the additional user input that triggers performing an operation may be an increase in touch force (i.e., increase in pressure) applied by the user's finger. For example, different levels of force on the touchscreen display 410 may be recognized for different purposes, including performing an operation through the GUI in response to detecting an input force that is beyond a threshold value. In embodiments in which pressure is used to indicate distance for moving the virtual mouse, touch force may be used to prompt performance of an operation (e.g., launching an application, etc.) provided a differentiator is used to distinguish the virtual mouse movement and the operation. For example, a brief pause in touch pressure may be used as a differentiator. In another example, maintaining the cursor icon 430 in one location for a threshold amount of time may differentiate touch pressure for performing an operation from pressure used to calculate the cursor icon 430 location.
[0067] In some embodiments, a user may configure one or more additional gestures that trigger the operation through settings on the smartphone device 400. In another example, the operation may be performed in response to detecting termination of the movement of the cursor icon 430 (e.g., indicated by the user removing the thumb from the touchscreen display 410).
[0068] In various embodiments, the processor may distinguish between the sudden decrease in touch pressure caused by the ending of the touch, which indicates that the user intends to execute a GUI operation, and the gradual change in touch pressure caused by the user intentionally changing the touch pressure in order to move the cursor icon 430, where appropriate.
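One way such a distinction might be realized (a sketch under assumed, illustrative threshold values, not the claimed method) is to compare the rate of pressure change between successive touch samples:

    def classify_pressure_change(prev_p, cur_p, dt, release_rate=5.0):
        """Distinguish a lift-off from deliberate pressure modulation.

        prev_p, cur_p: pressure at the previous and current samples
        dt:            time between samples, in seconds
        A steep drop (rate more negative than -release_rate, in pressure
        units per second) is treated as the user ending the touch to
        execute a GUI operation; slower changes are treated as cursor
        control. The threshold is illustrative only.
        """
        rate = (cur_p - prev_p) / dt
        return "execute" if rate < -release_rate else "move_cursor"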
[0069] In some embodiments, the processor of the smartphone may be configured such that when the cursor icon 430 is moved near an operable GUI element (i.e., within a threshold distance), such as an icon for launching an application or other item (e.g., text, menu item), the cursor icon 430 may be automatically "drawn" to the operable GUI element. The operable GUI element may be enlarged and/or highlighted by the processor once the cursor icon 430 is over it to signify selection. In some further embodiments, an already-selected operable GUI element (i.e., an operable GUI element over which the cursor icon 430 is located) may be deselected only after the cursor icon 430 has been moved away from the GUI element by a predetermined nonzero distance, in order to compensate for jittering in the touch.
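The snap-and-hysteresis behavior might be sketched as follows (illustrative only; element positions, both distances, and the function name are assumptions):

    def update_selection(cursor, elements, selected,
                         snap_dist=40.0, release_dist=60.0):
        """Snap the cursor to a nearby GUI element, with hysteresis.

        cursor:   (x, y) projected cursor location
        elements: list of (x, y) centers of operable GUI elements
        selected: index of the currently selected element, or None
        Using a release_dist larger than snap_dist compensates for touch
        jitter, as described above. Returns the new selection index (or
        None) and the position at which the cursor icon should be drawn.
        """
        def dist(a, b):
            return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5

        if selected is not None:
            # Deselect only after moving a nonzero distance away.
            if dist(cursor, elements[selected]) > release_dist:
                selected = None
            else:
                return selected, elements[selected]  # stay drawn to it
        if elements:
            d, i = min((dist(cursor, e), i) for i, e in enumerate(elements))
            if d <= snap_dist:
                return i, elements[i]  # snap to the nearest element
        return None, cursor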
[0070] In some embodiments, the virtual mouse may be deactivated based on receiving additional user input via the GUI. For example, in an embodiment the user may deactivate the virtual mouse by moving the finger to an area (e.g., the activation area 420) on the GUI, and removing the finger from the touchscreen display 410. In another embodiment, the virtual mouse may be deactivated in response to the user removing the finger from the touchscreen display 410 while the cursor icon 430 is in an area on the GUI that is not within a threshold distance from any operable GUI element.
[0071] In some embodiments, the virtual mouse may be automatically deactivated after performing an operation (e.g., selection of an application or item). In other embodiments, the user may deactivate the virtual mouse by performing a particular recognized gesture on the touchscreen display 410. For example, the processor may be configured to deactivate the virtual mouse in response to a double click, a swipe left, a swipe right, a combination thereof, etc. on the touchscreen display 410. In some embodiments, a user may preset one or more particular gestures to trigger deactivation of the virtual mouse.
[0072] FIG. 5 illustrates a method 500 for implementing a virtual mouse on a smartphone according to some embodiments. The operations of method 500 may be implemented by one or more processors of the smartphone device (e.g., 100, 150), such as a general purpose processor (e.g., 152). In various embodiments, the operations of method 500 may be implemented by a separate controller (not shown) that may be coupled to memory (e.g., 154), the touchscreen (e.g., 115), and to the one or more processors (e.g., 110).
[0073] In block 510, a virtual mouse may be activated by a processor of the
smartphone. In some embodiments, the virtual mouse may be activated by the processor upon detection of a touch event in the virtual mouse activation area on the touchscreen display, coupled with a continued touch contact. In other embodiments, the virtual mouse may be automatically activated by the processor upon detecting that the smartphone device is being held in a hand in a manner consistent with single-hand use. A cursor or icon may be displayed by the processor to signify the activation of the virtual mouse.
[0074] In block 520, a location of the cursor or icon associated with the virtual mouse may be calculated or otherwise determined by the processor. In some embodiments, the location of the cursor/icon may be determined by the processor by evaluating the expression c + kpf (Equation 3) or the expression c + kp(c - r) (Equation 4), both of which yield a vector to the location of the cursor/icon (e.g., a vector from an initial reference point to the current location of the cursor icon).
[0075] As previously noted, in Equations 3 and 4, c is the position of the touch area (e.g., a vector from an initial reference point to the current touch area), r is the position of the closest corner of the touchscreen (e.g., a vector from the initial reference point to the closest corner to c), f is the orientation vector of the touch area (e.g., a unit vector indicating the orientation of the touch area), p is the touch pressure, and k is a scaling factor chosen so that the user may move the cursor icon 430 to the farthest corner of the touchscreen display 410 with movements of the thumb 440 that are within the easily reachable region of the touchscreen display 410.
[0076] Therefore, the location of the cursor icon may be calculated or otherwise determined by the processor based at least in part on an orientation of the touch area and at least one of 1) a position of the touch area and 2) a touch pressure. In some embodiments, the calculated location of the cursor or icon is used to display a cursor or icon on the display. The location of the cursor or icon on the display may be calculated continuously until the virtual mouse is deactivated by the processor in block 530. The virtual mouse may be automatically deactivated by the processor after a GUI operation, such as an application launch, has been executed by the user ending the touch while the cursor icon is over an operable GUI element. The virtual mouse may also be deactivated by the processor upon detecting that the user has requested a deactivation of the virtual mouse. For example, the processor may detect that the user has performed an operation indicating a deactivation of the virtual mouse (e.g., the user has moved the finger back to the virtual mouse activation area on the touchscreen display and/or ended the touch).
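Tying blocks 510-530 together, the overall control flow of method 500 might be summarized as the following loop (a sketch only; the touchscreen-facing helper methods are hypothetical placeholders for the modules of FIG. 1B, and virtual_mouse_location is the Equation 3 sketch above):

    def run_virtual_mouse(touchscreen, k):
        """Skeleton of method 500: activate, track, deactivate."""
        while not touchscreen.activation_touch_detected():
            pass                               # block 510: wait for trigger
        touchscreen.show_cursor()              # signify activation
        while touchscreen.touch_is_held():     # block 520: track the touch
            c, theta, p = touchscreen.read_touch_parameters()
            touchscreen.move_cursor(virtual_mouse_location(c, theta, p, k))
        touchscreen.hide_cursor()              # block 530: deactivate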
[0077] FIGs. 6A and 6B illustrate a method 600 for providing a virtual mouse according to various embodiments. With reference to FIGs. 1A-6B, in various embodiments, the operations of method 600 may be implemented by one or more processors (e.g., 110) of a smartphone (e.g., 100, 150), such as a general purpose processor (e.g., 110, 152). In various embodiments, the operations of the method 600 may be implemented by a separate controller (not shown) that may be coupled to memory (e.g., 154), the touchscreen (e.g., 115), and to the one or more processors 152.
[0078] In block 602, a processor of the smartphone may monitor touch sensor input on the smartphone (e.g., input to the touch sensor(s) 158, received via the touchscreen I/O controller 162). In determination block 604, the processor may determine whether a trigger activating the virtual mouse is detected. Such trigger may be, for example, input of a single-point touch selecting a virtual mouse icon in the GUI of the display. So long as no trigger of the virtual mouse activation is detected (i.e., determination block 604 = "No"), the processor may continue to monitor the touch sensor input on the smartphone in block 602.
[0079] In response to determining that a trigger to activate the virtual mouse is detected (i.e., determination block 604 = "Yes"), the processor may identify a touch area associated with the user's finger in block 606, which may be the position of the input detected on the touch-sensitive surface through touch sensor(s) (e.g., 158). In block 608, the processor may collect touch data in the identified touch area. For example, data may be sensed/measured by the touchscreen system 156 that includes a size and shape of the touch area, pressure being applied by the user's finger (if using a pressure-sensitive device), etc.
[0080] In block 610, the processor may determine touch pressure and direction parameters based on information received from the touchscreen. As discussed above, in some embodiments the touch pressure may be determined as actual pressure if the smartphone is configured with a pressure-sensitive touchscreen. In other
embodiments, the touch pressure may be an estimated pressure value based on calculating the area of an ellipse function fitted to the boundary of the touch area. Further, as discussed above, the direction parameter may be based on an orientation of a major axis of such ellipse function, or may be based on the position of the center of the touch area with reference to a closest corner of the touchscreen. In block 612, the processor may calculate a location of the virtual mouse based on the pressure and direction parameters.
[0081] In block 614, the processor may display a cursor icon on the touchscreen using the calculated location. In determination block 616, the processor may determine whether the virtual mouse has been deactivated, such as by any of a number of deactivation triggers that may be configured.
[0082] In response to determining that the virtual mouse is deactivated (i.e., determination block 616 = "Yes"), the processor may return to block 602 and monitor sensor input on the touchscreen system in block 602. In response to determining that the virtual mouse is deactivated, the processor may also terminate displaying the icon displayed in block 614.
[0083] In response to determining that the virtual mouse has not been deactivated (i.e., determination block 616 = "No"), the processor may determine whether the cursor icon location on the touchscreen is within a threshold distance of a GUI element (i.e., close enough for possible selection) in determination block 618 (FIG. 6B). In response to determining that the cursor icon is not within a threshold distance of a GUI element (i.e., determination block 618 = "No"), the processor may repeat the operations in blocks 608-614 (FIG. 6A) to determine the location of the cursor and display the cursor icon.
[0084] In response to determining that the cursor icon is within the threshold distance of a GUI element (i.e., determination block 618 = "Yes"), the processor may draw the projected cursor icon to the GUI element in block 619. In determination block 620, the processor may determine whether an operation input (e.g., a click, a touch release, a predefined gesture, etc.) is detected, which may be used to initiate an operation relating to that GUI element. In response to determining that an operation input is detected (i.e., determination block 620 = "Yes"), the processor may perform an action corresponding to the GUI selection in block 622, for example, opening an application on the smartphone, entering another mode, etc.
[0085] In response to determining that an operation input is not detected (i.e., determination block 620 = "No"), the processor may determine whether the cursor icon has moved more than a predetermined distance from a selected GUI element in determination block 624. So long as the cursor icon has not moved more than a predetermined distance from a selected GUI element (i.e., determination block 624 = "No"), the processor may continue determining whether an operation input is detected in determination block 620.
[0086] In response to determining that the cursor icon has moved more than a predetermined distance from a selected GUI element (i.e., determination block 624 = "Yes"), the processor may deselect the GUI element in block 626, and return to determination block 618 to determine whether the cursor icon is within a threshold distance of a GUI element.
[0087] Utilization of embodiments of the disclosure described herein enables a user to interact with elements of a GUI displayed on a region of a touchscreen display that is difficult to directly reach by effecting touches and movements of a user finger within a region of the touchscreen display that is easily reachable while the user is operating the smartphone device with a single hand. Various embodiments have been described in relation to a smartphone device, but the references to a smartphone are merely to facilitate the descriptions of various embodiments and are not intended to limit the scope of the disclosure or the claims.
[0088] Various implementations of a virtual mouse have been previously described in detail. It should be appreciated that the virtual mouse application or system, as previously described, may be implemented as software, firmware, hardware, combinations thereof, etc. In one embodiment, the previously described functions may be implemented by one or more processors (e.g., processor(s) 110) of a smartphone device 100 to achieve the desired functions (e.g., the method operations of FIGs. 5 and 6).
[0089] The teachings herein may be incorporated into (e.g., implemented within or performed by) a variety of apparatuses (e.g., devices). For example, one or more embodiments taught herein may be incorporated into a general device, a desktop computer, a mobile computer, a mobile device, a phone (e.g., a cellular phone), a personal data assistant, a tablet, a laptop computer, an entertainment device (e.g., a music or video device), a headset (e.g., headphones, an earpiece, etc.), a medical device (e.g., a biometric sensor, a heart rate monitor, a pedometer, an electrocardiography "EKG" device, etc.), a user I/O device, a computer, a server, a point-of-sale device, an entertainment device, a set-top box, a wearable device (e.g., watch, head mounted display, virtual reality glasses, etc.), an electronic device within an automobile, or any other suitable device.
[0090] In some embodiments, a smartphone device may include an access device (e.g., a Wi-Fi access point) for a communication system. Such an access device may provide, for example, connectivity to another network (e.g., a wide area network such as the Internet or a cellular network) through a transceiver via a wired or wireless communication link. Accordingly, the access device may enable another device (e.g., a Wi-Fi station) to access the other network or some other functionality. In addition, it should be appreciated that one or both of the devices may be portable or, in some cases, relatively non-portable.
[0091] It should be appreciated that when devices implementing the various embodiments are mobile or smartphone devices, such devices may communicate via one or more wireless communication links through a wireless network that are based on or otherwise support any suitable wireless communication technology. For example, in some embodiments the smartphone device and other devices may associate with a network including a wireless network. In some embodiments the network may include a body area network or a personal area network (e.g., an ultra-wideband network). In some embodiments the network may include a local area network or a wide area network. A smartphone device may support or otherwise use one or more of a variety of wireless communication technologies, protocols, or standards such as, for example, 3G, Long Term Evolution (LTE), LTE Advanced, 4G, Code-Division Multiple Access (CDMA), Time Division Multiple Access (TDMA), Orthogonal Frequency Division Multiplexing (OFDM), Orthogonal Frequency
Division Multiple Access (OFDMA), WiMAX, and Wi-Fi. Similarly, a smartphone device may support or otherwise use one or more of a variety of corresponding modulation or multiplexing schemes. A smartphone device may thus include appropriate components (e.g., air interfaces) to establish and communicate via one or more wireless communication links using the above or other wireless communication technologies. For example, a device may include a wireless transceiver with associated transmitter and receiver components (e.g., a transmitter and a receiver) that may include various components (e.g., signal generators and signal processors) that facilitate communication over a wireless medium. As is well known, a smartphone device may therefore wirelessly communicate with other mobile devices, cell phones, other wired and wireless computers, Internet web-sites, etc.
[0092] Information and signals may be represented using any of a variety of different technologies and techniques. For example, data, instructions, commands, information, signals, bits, symbols, and chips that may be referenced throughout the above description may be represented by voltages, currents, electromagnetic waves, magnetic fields or particles, optical fields or particles, or any combination thereof.
[0093] The various illustrative logical blocks, modules, engines, circuits, and algorithm operations described in connection with the embodiments disclosed herein may be implemented as electronic hardware, computer software, or combinations of both. To clearly illustrate this interchangeability of hardware and software, various illustrative components, blocks, modules, engines, circuits, and operations have been described above generally in terms of their functionality. Whether such functionality is implemented as hardware or software depends upon the specific application and design constraints imposed on the overall system. Skilled artisans may implement the described functionality in varying ways for each application, but such implementation decisions should not be interpreted as causing a departure from the scope of the claims.
[0094] The various illustrative logical blocks, modules, and circuits described in connection with the embodiments disclosed herein may be implemented or performed with a general-purpose processor, a digital signal processor (DSP), an application specific integrated circuit (ASIC), a field programmable gate array (FPGA) or other programmable logic device, discrete gate or transistor logic, discrete hardware components, or any combination thereof designed to perform the functions described herein. A general-purpose processor may be a microprocessor, but in the alternative, the processor may be any conventional processor, controller, microcontroller, or state machine. A processor may also be implemented as a combination of computing devices, e.g., a combination of a DSP and a microprocessor, a plurality of
microprocessors, one or more microprocessors in conjunction with a DSP core, or any other such configuration.
[0095] The steps of a method or algorithm described in connection with the
embodiments disclosed herein may be embodied directly in hardware, in a software module executed by a processor, or in a combination of the two. A software module may reside in Random Access Memory (RAM), flash memory, Read Only Memory (ROM), Erasable Programmable Read Only Memory (EPROM), Electrically Erasable Programmable Read Only Memory (EEPROM), registers, hard disk, a removable disk, a Compact Disc Read Only Memory (CD-ROM), or any other form of storage medium known in the art. An exemplary storage medium is coupled to the processor such that the processor can read information from, and write information to, the storage medium. In the alternative, the storage medium may be integral to the processor. The processor and the storage medium may reside in an ASIC. The ASIC may reside in a user terminal. In the alternative, the processor and the storage medium may reside as discrete components in a user terminal.
[0096] In one or more exemplary embodiments, the functions described may be implemented in hardware, software, firmware, or any combination thereof. If implemented in software as a computer program product, the functions or modules may be stored on, or transmitted over, a non-transitory computer-readable medium as one or more instructions or code. Computer-readable media can include both computer storage media and communication media, including any medium that facilitates transfer of a computer program from one place to another. A storage medium may be any available medium that can be accessed by a computer. By way of example, and not limitation, such non-transitory computer-readable media can include RAM, ROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer. Also, any connection is properly termed a computer-readable medium. For example, if the software is transmitted from a web site, server, or other remote source using a coaxial cable, fiber optic cable, twisted pair, digital subscriber line (DSL), or wireless technologies such as infrared, radio, and microwave, then the coaxial cable, fiber optic cable, twisted pair, DSL, or wireless technologies such as infrared, radio, and microwave are included in the definition of medium. Disk and disc, as used herein, include compact disc (CD), laser disc, optical disc, digital versatile disc (DVD), floppy disk, and Blu-ray disc, where disks usually reproduce data magnetically, while discs reproduce data optically with lasers. Combinations of the above should also be included within the scope of non-transitory computer-readable media.
[0097] The previous description of the disclosed embodiments is provided to enable any person skilled in the art to make or use the claims. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the scope of the claims. Thus, the present disclosure is not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.

Claims

What is claimed is:
1. A method implemented in a processor for implementing a virtual mouse on a touchscreen of a computing device, comprising:
activating the virtual mouse during single-handed use of the computing device by a user;
determining a location of the virtual mouse on the touchscreen by:
identifying a touch area associated with a user touch event;
collecting touch data from the identified touch area;
determining pressure and direction parameters associated with the user touch event; and
calculating a position on the touchscreen based on the pressure and direction parameters associated with the user touch event; and
displaying a cursor icon on the touchscreen at the determined location of the virtual mouse.
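To make the claimed location step concrete, the following minimal Python sketch fits the touch area by weighted second moments, derives pressure and direction parameters, and projects a cursor position. All names are hypothetical illustrations; the moment-based fit and the total-signal pressure proxy are assumptions, since the claim only requires that some pressure and direction parameters be determined.

```python
import numpy as np

def locate_cursor(xs, ys, strengths, k=2.0):
    """Illustrative sketch of the claimed location step (assumed details).

    xs, ys: coordinates of the sensed cells in the identified touch area.
    strengths: per-cell signal strength (the collected touch data).
    k: scaling factor controlling how far the cursor is projected.
    """
    xs, ys = np.asarray(xs, float), np.asarray(ys, float)
    w = np.asarray(strengths, float)
    wn = w / w.sum()
    # Direction parameter: major axis of an ellipse fitted by weighted moments.
    cx, cy = (wn * xs).sum(), (wn * ys).sum()
    dx, dy = xs - cx, ys - cy
    cov = np.array([[(wn * dx * dx).sum(), (wn * dx * dy).sum()],
                    [(wn * dx * dy).sum(), (wn * dy * dy).sum()]])
    f = np.linalg.eigh(cov)[1][:, 1]   # eigenvector of the larger eigenvalue
    # Pressure parameter: total sensed signal as a crude proxy.
    p = float(w.sum())
    # Cursor position: project from the touch center along the finger axis.
    return (cx + k * p * f[0], cy + k * p * f[1])
```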
2. The method of claim 1, wherein the displayed cursor icon is configured to extend beyond a reach of a user's finger during single-handed use.
3. The method of claim 1, wherein activating the virtual mouse comprises detecting a touch event in a predetermined virtual mouse activation area of a touchscreen display of the computing device.
4. The method of claim 1, wherein activating the virtual mouse comprises
automatically initiating activation upon detecting that the computing device is held in a manner consistent with single-handed use by the user.
5. The method of claim 3, further comprising:
determining, while the virtual mouse is activated, whether a deactivation event is detected on the computing device; and
deactivating the virtual mouse in response to determining that the deactivation event is detected.
6. The method of claim 5, wherein determining, while the virtual mouse is activated, whether a deactivation event is detected on the computing device comprises determining whether a touch event is detected in the predetermined virtual mouse activation area.
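As a sketch of the activation behavior in claims 3-6, a touch inside a predetermined activation area can simply toggle the virtual mouse on and off. The rectangular area and class name below are illustrative assumptions, not the patent's implementation:

```python
class VirtualMouseToggle:
    """Toggle sketch for claims 3-6 (assumed geometry and names)."""

    def __init__(self, activation_area):
        # activation_area: (left, top, right, bottom) in screen pixels,
        # e.g. a small region near the bottom edge of the touchscreen.
        self.activation_area = activation_area
        self.active = False

    def on_touch_down(self, x, y):
        """Activate on a touch in the area (claim 3); while active, a touch
        in the same area is treated as the deactivation event (claim 6)."""
        left, top, right, bottom = self.activation_area
        if left <= x <= right and top <= y <= bottom:
            self.active = not self.active
        return self.active
```

For example, `VirtualMouseToggle((390, 1800, 690, 1920)).on_touch_down(500, 1850)` returns True, activating the mouse; a second touch in the same area deactivates it.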
7. The method of claim 1, wherein determining the direction parameter associated with the user touch event is based at least in part on an orientation of a major axis of an ellipse fitted to the touch area.
8. The method of claim 7, wherein:
determining the pressure parameter associated with the user touch event is based on at least one of an area of the ellipse fitted to the touch area, and a touch pressure; and
calculating a location of the virtual mouse comprises calculating a vector representing the location of the virtual mouse, wherein a magnitude of the calculated vector is based at least in part on the determined pressure parameter.
9. The method of claim 8, wherein calculating the vector representing the location of the virtual mouse comprises calculating a resultant vector of an equation:
c + kpf, wherein:
c represents a vector from an initial reference point to a center point of the ellipse fitted to the touch area;
k represents a scaling factor;
p represents the determined pressure parameter; and
f represents a vector corresponding to the orientation of the major axis of the ellipse fitted to the touch area.
10. The method of claim 8, wherein calculating the vector representing the location of the virtual mouse comprises calculating a resultant vector of an equation:
c + kp(c − r), wherein:
c represents a vector from an initial reference point to a center point of the ellipse fitted to the touch area;
r represents a vector from the initial reference point to a corner of the touchscreen display that is closest to the center point of the ellipse;
k represents a scaling factor; and
p represents the determined pressure parameter.
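Claims 9 and 10 thus give two closed-form placements for the virtual mouse: project from the ellipse center c along the finger axis f, or away from the nearest screen corner r, in both cases scaled by kp. Below is a sketch of both resultant vectors using plain coordinate tuples; the corner enumeration is an assumption about how the closest corner is found.

```python
def cursor_claim9(c, f, p, k=1.0):
    """Resultant vector of c + k*p*f (claim 9)."""
    return (c[0] + k * p * f[0], c[1] + k * p * f[1])

def cursor_claim10(c, p, screen_w, screen_h, k=1.0):
    """Resultant vector of c + k*p*(c - r) (claim 10), where r is the
    touchscreen corner closest to the ellipse center c."""
    corners = [(0, 0), (screen_w, 0), (0, screen_h), (screen_w, screen_h)]
    r = min(corners, key=lambda q: (q[0] - c[0]) ** 2 + (q[1] - c[1]) ** 2)
    return (c[0] + k * p * (c[0] - r[0]), c[1] + k * p * (c[1] - r[1]))
```

With a thumb near the lower-left of a 1080 × 1920 screen, say c = (200, 1700), p = 0.5, and k = 1.0, the closest corner is r = (0, 1920) and cursor_claim10 returns (300, 1590): pressing harder drives the cursor farther up and toward the interior of the screen, away from the hand.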
11. The method of claim 1, further comprising:
determining whether a selection input is received while the projected cursor icon is located within a threshold distance of a Graphical User Interface (GUI) element displayed on the touchscreen; and
executing an operation associated with the GUI element in response to determining that the selection input is received while the projected cursor icon is located within the threshold distance of the GUI element.
12. The method of claim 11, further comprising automatically deactivating the virtual mouse after execution of the operation associated with the GUI element.
13. The method of claim 1, further comprising: detecting whether the projected cursor icon is positioned within a threshold distance from an operable Graphical User Interface (GUI) element displayed on the touchscreen; and
drawing the projected cursor icon to the operable GUI element in response to detecting that the cursor icon is positioned within the threshold distance.
14. The method of claim 1, further comprising:
detecting whether the projected cursor icon has moved more than a
predetermined non-zero distance away from a currently-selected operable Graphical User Interface (GUI) element; and
deselecting the operable GUI element in response to detecting that the projected cursor icon has moved more than the predetermined non-zero distance from the currently-selected operable GUI element.
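Claims 13 and 14 pair naturally as snap/release hysteresis: the cursor is drawn to an operable GUI element once it comes within a capture threshold, and the element is deselected only after the cursor moves a strictly positive release distance away. A minimal sketch with assumed pixel thresholds, reducing elements to named center points:

```python
import math

class SnapTracker:
    """Snap/release hysteresis sketch for claims 13-14 (assumed thresholds)."""

    def __init__(self, capture_px=24.0, release_px=48.0):
        assert release_px > 0.0  # claim 14 requires a non-zero distance
        self.capture_px = capture_px
        self.release_px = release_px
        self.selected = None     # name of the currently selected element

    def update(self, cursor, elements):
        """cursor: (x, y); elements: {name: (x, y) center}.
        Returns the position at which to draw the cursor icon."""
        if self.selected is not None:
            if math.dist(cursor, elements[self.selected]) > self.release_px:
                self.selected = None              # deselect (claim 14)
        if self.selected is None:
            for name, center in elements.items():
                if math.dist(cursor, center) <= self.capture_px:
                    self.selected = name          # snap to element (claim 13)
                    break
        return elements[self.selected] if self.selected else cursor
```

Making the release threshold larger than the capture threshold keeps the cursor from flickering on and off an element when the fingertip jitters at the boundary.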
15. A computing device, comprising:
a touchscreen;
a memory; and
a processor coupled to the touchscreen and the memory, wherein the processor is configured with processor-executable instructions to perform operations
comprising:
activating a virtual mouse during single-handed use of the computing device by a user;
determining a location of the virtual mouse on the touchscreen by:
identifying a touch area associated with a user touch event;
collecting touch data from the identified touch area;
determining pressure and direction parameters associated with the user touch event; and
calculating a position on the touchscreen based on the pressure and direction parameters associated with the user touch event; and
displaying a cursor icon on the touchscreen at the determined location of the virtual mouse, wherein the projected cursor icon is positioned to extend beyond a reach of a user's thumb or finger during single-handed use.
16. The computing device of claim 15, wherein the processor is configured with processor-executable instructions to perform operations such that the displayed cursor icon is configured to extend beyond a reach of a user's finger during single-handed use.
17. The computing device of claim 15, wherein the processor is configured with processor-executable instructions to perform operations such that activating the virtual mouse comprises detecting a touch event in a predetermined virtual mouse activation area of a touchscreen display of the computing device.
18. The computing device of claim 15, wherein the processor is configured with processor-executable instructions such that activating the virtual mouse comprises automatically initiating activation upon detecting that the computing device is held in a manner consistent with single-handed use by the user.
19. The computing device of claim 17, wherein the processor is configured with processor-executable instructions to perform operations further comprising:
determining, while the virtual mouse is activated, whether a deactivation event is detected on the computing device; and
deactivating the virtual mouse in response to determining that the deactivation event is detected.
20. The computing device of claim 19, wherein the processor is configured with processor-executable instructions such that determining, while the virtual mouse is activated, whether a deactivation event is detected comprises determining whether a touch event is detected in the predetermined virtual mouse activation area.
21. The computing device of claim 15, wherein the processor is configured with processor-executable instructions such that determining the direction parameter associated with the user touch event is based at least in part on an orientation of a major axis of an ellipse fitted to the touch area.
22. The computing device of claim 21, wherein the processor is configured with processor-executable instructions such that:
determining the pressure parameter associated with the user touch event is based on at least one of an area of the ellipse fitted to the touch area, and a touch pressure; and
calculating a location of the virtual mouse comprises calculating a vector representing the location of the virtual mouse, wherein a magnitude of the calculated vector is based at least in part on the determined pressure parameter.
23. The computing device of claim 22, wherein the processor is configured with processor-executable instructions such that calculating the vector representing the location of the virtual mouse comprises calculating a resultant vector of an equation: c + kpf, wherein:
c represents a vector from an initial reference point to a center point of the ellipse fitted to the touch area;
k represents a scaling factor;
p represents the determined pressure parameter; and
f represents a vector corresponding to the orientation of the major axis of the ellipse fitted to the touch area.
24. The computing device of claim 22, wherein the processor is configured with processor-executable instructions such that calculating the vector representing the location of the virtual mouse comprises calculating a resultant vector of an equation: c + kp(c − r), wherein:
c represents a vector from an initial reference point to a center point of the ellipse fitted to the touch area;
r represents a vector from the initial reference point to a corner of the touchscreen display that is closest to the center point of the ellipse;
k represents a scaling factor; and
p represents the determined pressure parameter.
25. The computing device of claim 15, wherein the processor is configured with processor-executable instructions to perform operations further comprising:
determining whether a selection input is received while the projected cursor icon is located within a threshold distance of a Graphical User Interface (GUI) element displayed on the touchscreen; and
executing an operation associated with the GUI element in response to determining that the selection input is received while the projected cursor icon is located within the threshold distance of the GUI element.
26. The computing device of claim 25, wherein the processor is configured with processor-executable instructions to perform operations further comprising
automatically deactivating the virtual mouse after execution of the operation associated with the GUI element.
27. The computing device of claim 15, wherein the processor is configured with processor-executable instructions to perform operations further comprising:
detecting whether the projected cursor icon is positioned within a threshold distance from an operable Graphical User Interface (GUI) element displayed on the touchscreen; and
drawing the projected cursor icon to the operable GUI element in response to detecting that the projected cursor icon is positioned within the threshold distance.
28. The computing device of claim 15, wherein the processor is configured with processor-executable instructions to perform operations further comprising:
detecting whether the projected cursor icon has moved more than a
predetermined non-zero distance away from a currently-selected operable Graphical User Interface (GUI) element; and
deselecting the operable GUI element in response to detecting that the projected cursor icon has moved more than the predetermined non-zero distance from the currently-selected operable GUI element.
29. A computing device, comprising:
a touchscreen;
means for activating a virtual mouse during single-handed use of the computing device by a user;
means for determining a location of the virtual mouse on the touchscreen comprising:
means for identifying a touch area associated with a user touch event; means for collecting touch data from the identified touch area;
means for determining pressure and direction parameters associated with the user touch event; and
means for calculating a position on the touchscreen based on the pressure and direction parameters associated with the user touch event; and
means for displaying a cursor icon on the touchscreen at the determined location of the virtual mouse.
30. A non-transitory processor-readable storage medium having stored thereon processor-executable instructions configured to cause a processor of a computing device to perform operations comprising:
activating a virtual mouse during single-handed use of the computing device by a user;
determining a location of the virtual mouse on a touchscreen by:
identifying a touch area associated with a user touch event;
collecting touch data from the identified touch area;
determining pressure and direction parameters associated with the user touch event; and
calculating a position on the touchscreen based on the pressure and direction parameters associated with the user touch event; and
displaying a cursor icon on the touchscreen at the determined location of the virtual mouse.
EP15801566.9A 2014-11-11 2015-11-11 System and methods for controlling a cursor based on finger pressure and direction Withdrawn EP3218792A1 (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US201462078356P 2014-11-11 2014-11-11
US14/937,306 US20160132139A1 (en) 2014-11-11 2015-11-10 System and Methods for Controlling a Cursor Based on Finger Pressure and Direction
PCT/US2015/060073 WO2016077414A1 (en) 2014-11-11 2015-11-11 System and methods for controlling a cursor based on finger pressure and direction

Publications (1)

Publication Number Publication Date
EP3218792A1 true EP3218792A1 (en) 2017-09-20

Family

ID=55912208

Family Applications (1)

Application Number Title Priority Date Filing Date
EP15801566.9A Withdrawn EP3218792A1 (en) 2014-11-11 2015-11-11 System and methods for controlling a cursor based on finger pressure and direction

Country Status (6)

Country Link
US (1) US20160132139A1 (en)
EP (1) EP3218792A1 (en)
JP (1) JP2017534993A (en)
KR (1) KR20170083545A (en)
CN (1) CN107077297A (en)
WO (1) WO2016077414A1 (en)

Families Citing this family (70)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9417754B2 (en) 2011-08-05 2016-08-16 P4tents1, LLC User interface system, method, and computer program product
WO2013169845A1 (en) 2012-05-09 2013-11-14 Yknots Industries Llc Device, method, and graphical user interface for scrolling nested regions
EP3410287B1 (en) 2012-05-09 2022-08-17 Apple Inc. Device, method, and graphical user interface for selecting user interface objects
WO2013169854A2 (en) 2012-05-09 2013-11-14 Yknots Industries Llc Device, method, and graphical user interface for providing feedback for changing activation states of a user interface object
WO2013169851A2 (en) 2012-05-09 2013-11-14 Yknots Industries Llc Device, method, and graphical user interface for facilitating user interaction with controls in a user interface
KR101683868B1 (en) 2012-05-09 2016-12-07 애플 인크. Device, method, and graphical user interface for transitioning between display states in response to gesture
WO2013169849A2 (en) 2012-05-09 2013-11-14 Yknots Industries Llc Device, method, and graphical user interface for displaying user interface objects corresponding to an application
WO2013169843A1 (en) 2012-05-09 2013-11-14 Yknots Industries Llc Device, method, and graphical user interface for manipulating framed graphical objects
CN105260049B (en) 2012-05-09 2018-10-23 苹果公司 For contacting the equipment for carrying out display additional information, method and graphic user interface in response to user
JP6082458B2 (en) 2012-05-09 2017-02-15 アップル インコーポレイテッド Device, method, and graphical user interface for providing tactile feedback of actions performed within a user interface
WO2013169875A2 (en) 2012-05-09 2013-11-14 Yknots Industries Llc Device, method, and graphical user interface for displaying content associated with a corresponding affordance
EP2847661A2 (en) 2012-05-09 2015-03-18 Apple Inc. Device, method, and graphical user interface for moving and dropping a user interface object
WO2013169865A2 (en) 2012-05-09 2013-11-14 Yknots Industries Llc Device, method, and graphical user interface for moving a user interface object based on an intensity of a press input
WO2013169842A2 (en) 2012-05-09 2013-11-14 Yknots Industries Llc Device, method, and graphical user interface for selecting object within a group of objects
KR101905174B1 (en) 2012-12-29 2018-10-08 애플 인크. Device, method, and graphical user interface for navigating user interface hierachies
CN105144057B (en) 2012-12-29 2019-05-17 苹果公司 For moving the equipment, method and graphic user interface of cursor according to the cosmetic variation of the control icon with simulation three-dimensional feature
JP6093877B2 (en) 2012-12-29 2017-03-08 アップル インコーポレイテッド Device, method, and graphical user interface for foregoing generation of tactile output for multi-touch gestures
EP2939098B1 (en) 2012-12-29 2018-10-10 Apple Inc. Device, method, and graphical user interface for transitioning between touch input to display output relationships
WO2014105279A1 (en) 2012-12-29 2014-07-03 Yknots Industries Llc Device, method, and graphical user interface for switching between user interfaces
JP6097843B2 (en) 2012-12-29 2017-03-15 アップル インコーポレイテッド Device, method and graphical user interface for determining whether to scroll or select content
CN107430430A (en) * 2014-12-22 2017-12-01 英特尔公司 Multi-touch virtual mouse
US10048757B2 (en) 2015-03-08 2018-08-14 Apple Inc. Devices and methods for controlling media presentation
US10095396B2 (en) 2015-03-08 2018-10-09 Apple Inc. Devices, methods, and graphical user interfaces for interacting with a control object while dragging another object
US9645732B2 (en) 2015-03-08 2017-05-09 Apple Inc. Devices, methods, and graphical user interfaces for displaying and using menus
US9632664B2 (en) 2015-03-08 2017-04-25 Apple Inc. Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US9990107B2 (en) 2015-03-08 2018-06-05 Apple Inc. Devices, methods, and graphical user interfaces for displaying and using menus
US9639184B2 (en) 2015-03-19 2017-05-02 Apple Inc. Touch input cursor manipulation
US9785305B2 (en) 2015-03-19 2017-10-10 Apple Inc. Touch input cursor manipulation
US20170045981A1 (en) 2015-08-10 2017-02-16 Apple Inc. Devices and Methods for Processing Touch Inputs Based on Their Intensities
US10152208B2 (en) 2015-04-01 2018-12-11 Apple Inc. Devices and methods for processing touch inputs based on their intensities
US10346030B2 (en) 2015-06-07 2019-07-09 Apple Inc. Devices and methods for navigating between user interfaces
US9860451B2 (en) 2015-06-07 2018-01-02 Apple Inc. Devices and methods for capturing and interacting with enhanced digital images
US10200598B2 (en) 2015-06-07 2019-02-05 Apple Inc. Devices and methods for capturing and interacting with enhanced digital images
US9891811B2 (en) 2015-06-07 2018-02-13 Apple Inc. Devices and methods for navigating between user interfaces
US9830048B2 (en) * 2015-06-07 2017-11-28 Apple Inc. Devices and methods for processing touch inputs with instructions in a web page
US9674426B2 (en) 2015-06-07 2017-06-06 Apple Inc. Devices and methods for capturing and interacting with enhanced digital images
US10409443B2 (en) * 2015-06-24 2019-09-10 Microsoft Technology Licensing, Llc Contextual cursor display based on hand tracking
US10416800B2 (en) 2015-08-10 2019-09-17 Apple Inc. Devices, methods, and graphical user interfaces for adjusting user interface objects
US10235035B2 (en) 2015-08-10 2019-03-19 Apple Inc. Devices, methods, and graphical user interfaces for content navigation and manipulation
US10248308B2 (en) 2015-08-10 2019-04-02 Apple Inc. Devices, methods, and graphical user interfaces for manipulating user interfaces with physical gestures
US9880735B2 (en) 2015-08-10 2018-01-30 Apple Inc. Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
JP6569546B2 (en) * 2016-01-28 2019-09-04 富士通コネクテッドテクノロジーズ株式会社 Display device, display control method, and display control program
CN107145289A (en) * 2016-03-01 2017-09-08 富泰华工业(深圳)有限公司 The electronic installation and its input method switching method of changeable input method, system
CN107526513A (en) * 2016-06-20 2017-12-29 中兴通讯股份有限公司 The method and device that analog mouse operates on a kind of touch screen terminal
CN107728910B (en) * 2016-08-10 2021-02-05 深圳富泰宏精密工业有限公司 Electronic device, display screen control system and method
CN106790994A (en) * 2016-11-22 2017-05-31 努比亚技术有限公司 The triggering method and mobile terminal of control
US10546109B2 (en) * 2017-02-14 2020-01-28 Qualcomm Incorporated Smart touchscreen display
CN109643216A (en) * 2017-03-13 2019-04-16 华为技术有限公司 A kind of icon display method and terminal device
JP2018200494A (en) * 2017-05-25 2018-12-20 シナプティクス・ジャパン合同会社 Touch controller, display system and host device
SE542090C2 (en) * 2017-05-31 2020-02-25 Izettle Merchant Services Ab Touch input device and method
KR102374408B1 (en) * 2017-09-08 2022-03-15 삼성전자주식회사 Method for controlling a pointer in a screen of virtual reality and electronic device
US11567627B2 (en) 2018-01-30 2023-01-31 Magic Leap, Inc. Eclipse cursor for virtual content in mixed reality displays
US10540941B2 (en) 2018-01-30 2020-01-21 Magic Leap, Inc. Eclipse cursor for mixed reality displays
US11216160B2 (en) 2018-04-24 2022-01-04 Roku, Inc. Customizing a GUI based on user biometrics
US11157159B2 (en) 2018-06-07 2021-10-26 Magic Leap, Inc. Augmented reality scrollbar
CN109164950B (en) * 2018-07-04 2020-07-07 珠海格力电器股份有限公司 Method, device, medium and equipment for setting system interface of mobile terminal
US11157152B2 (en) * 2018-11-05 2021-10-26 Sap Se Interaction mechanisms for pointer control
JP7309466B2 (en) * 2019-06-11 2023-07-18 キヤノン株式会社 Electronic equipment and its control method
CN112445406A (en) * 2019-08-29 2021-03-05 中兴通讯股份有限公司 Terminal screen operation method, terminal and storage medium
CN112558825A (en) * 2019-09-26 2021-03-26 华为技术有限公司 Information processing method and electronic equipment
CN113168246B (en) * 2019-10-10 2024-09-27 微软技术许可有限责任公司 Configuring a mouse device by pressure detection
CN110825242B (en) * 2019-10-18 2024-02-13 亮风台(上海)信息科技有限公司 Method and device for inputting
CN113093973B (en) * 2019-12-23 2023-09-26 鹤壁天海电子信息系统有限公司 Mobile terminal operation method, storage medium and mobile terminal
CN111443860B (en) * 2020-03-25 2021-06-22 维沃移动通信有限公司 Touch control method and electronic equipment
US11481069B2 (en) * 2020-09-15 2022-10-25 International Business Machines Corporation Physical cursor control in microfluidic display devices
CN112162631B (en) * 2020-09-18 2023-05-16 聚好看科技股份有限公司 Interactive device, data processing method and medium
CN112351324A (en) * 2020-10-27 2021-02-09 深圳Tcl新技术有限公司 Analog mouse control method, device, equipment and computer readable storage medium
CN113703571B (en) * 2021-08-24 2024-02-06 梁枫 Virtual reality man-machine interaction method, device, equipment and medium
US11644972B2 (en) * 2021-09-24 2023-05-09 Htc Corporation Virtual image display device and setting method for input interface thereof
US11693555B1 (en) * 2022-01-12 2023-07-04 Adobe Inc. Pressure value simulation from contact area data

Family Cites Families (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7170496B2 (en) * 2003-01-24 2007-01-30 Bruce Peter Middleton Zero-front-footprint compact input system
US7499058B2 (en) * 2005-04-22 2009-03-03 Microsoft Corporation Programmatical access to handwritten electronic ink in a tree-based rendering environment
JP2010102474A (en) * 2008-10-23 2010-05-06 Sony Ericsson Mobile Communications Ab Information display device, personal digital assistant, display control method, and display control program
US20100214218A1 (en) * 2009-02-20 2010-08-26 Nokia Corporation Virtual mouse
US8289316B1 (en) * 2009-04-01 2012-10-16 Perceptive Pixel Inc. Controlling distribution of error in 2D and 3D manipulation
WO2011048839A1 (en) * 2009-10-22 2011-04-28 シャープ株式会社 Display device and display device driving method
US9619056B1 (en) * 2010-03-26 2017-04-11 Open Invention Network Llc Method and apparatus for determining a valid touch event on a touch sensitive device
US8328378B2 (en) * 2010-07-20 2012-12-11 National Changhua University Of Education Package, light uniformization structure, and backlight module using same
US9052772B2 (en) * 2011-08-10 2015-06-09 Lester F. Ludwig Heuristics for 3D and 6D touch gesture touch parameter calculations for high-dimensional touch parameter (HDTP) user interfaces
JP5576571B2 (en) * 2011-10-11 2014-08-20 インターナショナル・ビジネス・マシーンズ・コーポレーション Object indication method, apparatus, and computer program
US9671880B2 (en) * 2011-12-22 2017-06-06 Sony Corporation Display control device, display control method, and computer program
US9195502B2 (en) * 2012-06-29 2015-11-24 International Business Machines Corporation Auto detecting shared libraries and creating a virtual scope repository
KR20140033839A (en) * 2012-09-11 2014-03-19 삼성전자주식회사 Method??for user's??interface using one hand in terminal having touchscreen and device thereof
US9483146B2 (en) * 2012-10-17 2016-11-01 Perceptive Pixel, Inc. Input classification for multi-touch systems
JP6137453B2 (en) * 2013-02-08 2017-05-31 インターナショナル・ビジネス・マシーンズ・コーポレーションInternational Business Machines Corporation Control device and control program
KR102056316B1 (en) * 2013-05-03 2020-01-22 삼성전자주식회사 Method of operating touch screen and electronic device thereof
US9207772B2 (en) * 2013-05-13 2015-12-08 Ohio University Motion-based identity authentication of an individual with a communications device

Also Published As

Publication number Publication date
US20160132139A1 (en) 2016-05-12
WO2016077414A1 (en) 2016-05-19
KR20170083545A (en) 2017-07-18
CN107077297A (en) 2017-08-18
JP2017534993A (en) 2017-11-24

Similar Documents

Publication Publication Date Title
US20160132139A1 (en) System and Methods for Controlling a Cursor Based on Finger Pressure and Direction
US11573631B2 (en) System for gaze interaction
US10540008B2 (en) System for gaze interaction
KR102141099B1 (en) Rapid screen segmentation method and apparatus, electronic device, display interface, and storage medium
US20160109947A1 (en) System for gaze interaction
EP2988202A1 (en) Electronic device and method for providing input interface
US20140282278A1 (en) Depth-based user interface gesture control
US20150324000A1 (en) User input method and portable device
US20140160035A1 (en) Finger-specific input on touchscreen devices
EP2840478B1 (en) Method and apparatus for providing user interface for medical diagnostic apparatus
EP2958006A1 (en) Electronic device and method for controlling display
WO2010032268A2 (en) System and method for controlling graphical objects
KR102297473B1 (en) Apparatus and method for providing touch inputs by using human body
EP3187977A1 (en) System for gaze interaction
US10747362B2 (en) Touch device with suppression band
EP2677413B1 (en) Method for improving touch recognition and electronic device thereof
WO2016147498A1 (en) Information processing device, information processing method, and program
CN110799933A (en) Disambiguating gesture input types using multi-dimensional heat maps
WO2018160258A1 (en) System and methods for extending effective reach of a user's finger on a touchscreen user interface
KR20130102670A (en) Method and system for setting user-specific finger and touch pen point contact locations for detailed operation of a touchscreen handset
EP3457269B1 (en) Electronic device and method for one-handed operation
JP2017102676A (en) Portable terminal device, operation device, information processing method, and program
CN106557157B (en) Contact action method, touch-screen equipment and touch screen control system
KR20240011834A (en) Rear user interface for handheld devices

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20170323

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

DAV Request for validation of the european patent (deleted)
DAX Request for extension of the european patent (deleted)
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20200603