WO2017131728A1 - Cursor movement based on a context - Google Patents

Cursor movement based on a context

Info

Publication number
WO2017131728A1
Authority
WO
WIPO (PCT)
Prior art keywords
touch input
computing system
contextual
cursor
sensor
Prior art date
Application number
PCT/US2016/015557
Other languages
English (en)
Inventor
Kas Kasravi
Oleg NIKOLSKY
Original Assignee
Hewlett Packard Enterprise Development Lp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hewlett Packard Enterprise Development Lp filed Critical Hewlett Packard Enterprise Development Lp
Priority to PCT/US2016/015557
Publication of WO2017131728A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/04812 Interaction techniques based on cursor appearance or behaviour, e.g. being affected by the presence of displayed objects
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 1/00 Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F 1/16 Constructional details or arrangements
    • G06F 1/1613 Constructional details or arrangements for portable computers
    • G06F 1/1626 Constructional details or arrangements for portable computers with a single-body enclosure integrating a flat display, e.g. Personal Digital Assistants [PDAs]
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 1/00 Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F 1/16 Constructional details or arrangements
    • G06F 1/1613 Constructional details or arrangements for portable computers
    • G06F 1/1633 Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
    • G06F 1/1684 Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675
    • G06F 1/1694 Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675, the I/O peripheral being a single or a set of motion sensors for pointer control or gesture input obtained by sensing movements of the portable computer
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 2200/00 Indexing scheme relating to G06F1/04 - G06F1/32
    • G06F 2200/16 Indexing scheme relating to G06F1/16 - G06F1/18
    • G06F 2200/163 Indexing scheme relating to constructional details of the computer
    • G06F 2200/1636 Sensing arrangement for detection of a tap gesture on the housing

Definitions

  • Computing devices such as laptops, smart phones, and tablets have increased in popularity. Many individuals own at least one (if not multiple) of these types of devices, which may frequently be used for personal tasks such as checking email, browsing the Internet, taking photos, playing games, and other such activities. Additionally, these devices are also being used to perform basic business-related tasks, such as email, accessing business web services, and internet browsing.
  • Figure 1 is a diagram illustrating a computing system having a touch input analysis module and a cursor movement module for providing contextual and non-contextual cursor movement according to examples of the present disclosure.
  • Figure 2 is a block diagram illustrating a computing system having touch input analysis instructions and cursor movement instructions for providing contextual and non-contextual cursor movement according to examples of the present disclosure.
  • Figure 3 is a plot diagram illustrating a signal corresponding to a touch input generated by a sensor in a computing system having a touch input analysis module and a cursor movement module according to examples of the present disclosure.
  • Figure 4 is a diagram illustrating a method for using contextual finger taps in a computing system according to examples of the present disclosure.
  • Figures 5A-5E are diagrams illustrating a text editing example with contextual cursor movement according to examples of the present disclosure.
  • Figure 6 is a flow diagram illustrating a method for performing a discrete cursor movement based on a contextual touch input according to examples of the present disclosure.
  • In some examples, a user may enter text on a physical keyboard attached to such a computing system, or on a "soft" keyboard that appears on a touch display of such a computing system.
  • For example, a user of a mobile smart phone may wish to compose an email or a text message. The user may select the appropriate application (e.g., an email application or a text messaging application) by clicking or tapping on the mobile smart phone's touch screen, and may then proceed to input the desired text using the soft keyboard displayed on the touch screen by selecting or tapping the appropriate characters.
  • Users may perform other tasks on their computing systems that utilize user inputs, such as office productivity software, gaming software, image editing software, computer aided design software, and the like.
  • To provide such inputs on touch screen devices, users face the limitations of touch screen implementations. For instance, a user may frequently mistype a word because the on-screen keyboard is small in comparison to the user's fingers. That is, a user may mean to press one key and instead press an adjacent key. To correct this error, the user moves the cursor back to the position of the mistake and makes the appropriate correction. However, moving the cursor to a particular location can be difficult on such touch screen devices. More generally, touch screen devices lack precise and discrete input ability, specifically as it relates to the position and movement of a cursor. This limits and negatively affects the manner in which applications are implemented and used, limits the usefulness of the computing system, and causes user frustration.
  • Figures 1 and 2 relate to components and modules of a computing system, such as computing system 100 of Figure 1 and computing system 200 of Figure 2.
  • The computing systems 100 and 200 may include any appropriate type of computing system and/or computing device, including, for example, smartphones, watches, glasses, tablets, desktops, laptops, workstations, servers, smart monitors, smart televisions, digital signage, scientific instruments, retail point of sale devices, video walls, imaging devices, peripherals, networking equipment, or the like.
  • FIG. 1 is a diagram illustrating a computing system 100 having a touch input analysis module 120 and cursor movement module 124 for providing contextual and non-contextual cursor movement according to examples of the present disclosure.
  • In operation, the computing system 100 may detect a series of touch inputs (or "taps") from a user's hand 130 (or input in another appropriate way, such as by a user's finger, head, arm, etc.) via a sensor 106, analyze signals generated by the sensor 106 corresponding to the touch inputs, and generate a touch input signature based on the analysis of the signals corresponding to the series of touch inputs.
  • A discrete cursor movement may then be implemented on the device based on the touch input signature.
  • The discrete cursor movement causes the cursor to move a discrete amount (or to a particular location), to move to a discrete menu item or button, to discretely select an object, menu item, or button, or to perform another similar action.
  • Figure 1 includes particular components, modules, etc. according to various examples. However, in different implementations, more, fewer, and/or other components, modules, arrangements of components/modules, etc. may be used according to the teachings described herein. In addition, various components, modules, etc. described herein may be implemented as software modules, hardware modules, special-purpose hardware (e.g., application specific hardware, application specific integrated circuits (ASICs), embedded controllers, hardwired circuitry, etc.), or some combination of these.
  • The computing system 100 may include any appropriate type of computing device, including, for example, smartphones, tablets, desktops, laptops, workstations, servers, smart monitors, smart televisions, digital signage, scientific instruments, retail point of sale devices, video walls, imaging devices, peripherals, wearable computing devices, or the like.
  • In the example of Figure 1, the computing system 100 represents a mobile device, such as a smart phone or tablet computer, although other suitable devices are also possible.
  • The computing system 100 includes a sensor 106, a touch input analysis module 120, a touch input signature generation module 122, a cursor movement module 124, and a display 110.
  • The sensor 106, the touch input analysis module 120, the touch input signature generation module 122, and the cursor movement module 124 are shown with dashed lines to represent that these components are partially or wholly within the computing system 100 and may not be visible externally.
  • The computing system 100 may also include additional components, such as processing resources, memory resources, additional sensors, and the like.
  • The sensor 106 may represent a variety of different sensors, including accelerometers, gyroscopes, and the like.
  • The touch input analysis module 120 of the computing system 100 analyzes signals generated by the sensor 106.
  • The signals correspond to a series of touch inputs detected by the sensor 106.
  • For example, the hand 130 may "tap" or similarly touch a surface of the computing system 100 so as to create a touch input.
  • The touch input is registered by the sensor 106, which generates a signal responsive to the touch input being detected.
  • The touch input analysis module 120 then analyzes the signal generated by the sensor 106.
  • In examples, a series of touch inputs may be received on the computing system 100 and recognized by the sensor 106.
  • The sensor 106 may then generate a plurality of signals corresponding to each of the touch inputs.
  • The plurality of signals are then analyzed by the touch input analysis module 120.
  • The touch input analysis module 120 may apply a discrete wavelet transform procedure to de-noise the signals generated by the sensor 106. Any noise present in the signal generated by the sensor 106 is reduced and/or removed by the de-noising procedure.
  • Figure 3 illustrates a signal generated by the sensor 106 and corresponding to a touch input received on the computing system 100. The signal includes noise, which may be undesirable. Consequently, the de-noising procedure may remove the noise from the signal. Figure 3 is discussed in more detail below.
  • Other de-noising procedures may be applied instead of the discrete wavelet transform procedure, such as other types of appropriate wavelet transforms, digital signal processing for time-frequency analysis, or any other suitable procedure such as Kalman filters, recursive least squares filters, a Bayesian mean square error procedure, etc.
  • Moreover, in some examples, a custom data filtering procedure may be implemented.
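  • To make the de-noising step concrete, the following is a minimal sketch of such a wavelet de-noising routine, assuming the PyWavelets library and a soft universal threshold; the wavelet family, decomposition level, and threshold rule are illustrative assumptions, not details taken from the disclosure.

```python
import numpy as np
import pywt

def denoise_tap_signal(signal, wavelet="db4", level=3):
    """De-noise a raw accelerometer trace with a discrete wavelet transform."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    # Universal threshold, with the noise scale estimated from the finest
    # detail coefficients (median absolute deviation estimator).
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745
    thresh = sigma * np.sqrt(2.0 * np.log(len(signal)))
    # Soft-threshold every detail level; leave the approximation untouched.
    coeffs[1:] = [pywt.threshold(c, thresh, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)[: len(signal)]
```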
  • In examples, the touch input analysis module 120 may also analyze whether a multi-touch is detected (that is, two or more consecutive touches or taps). The noise present in the signals generated by the sensor 106 responsive to the multi-taps is likewise reduced and/or removed by the de-noising procedure. Additionally, the touch input analysis module 120 analyzes which surface of the computing system 100 received the touch. For example, although Figure 1 illustrates the hand 130 touching the right surface of the computing system 100, any of the left, right, top, and/or bottom surfaces may be similarly tapped or touched, as may the front surface (such as the display 110) and the rear surface (not shown).
  • The touch input signature generation module 122 may generate a touch input signature based on the analysis of the signals corresponding to the detected series of touch inputs. For example, the touch input signature generation module 122 may compare training touch input signals, for example, by plotting the signals to find maximum, minimum, average, etc. values for the training touch input signals. From these values, a touch input signature may be generated. In examples, the touch input signature may be represented as a tolerance band with an outer bound and an inner bound.
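  • As a rough illustration of the tolerance-band idea, the sketch below aggregates aligned training tap signals into inner and outer bounds; the margin factor and the dictionary layout are assumptions made for the example, not details from the disclosure.

```python
import numpy as np

def build_signature(training_signals, margin=0.2):
    """Aggregate aligned training tap signals into a tolerance band.

    training_signals: 2-D array, one de-noised training tap per row.
    margin widens the band; its value here is an assumption.
    """
    stack = np.asarray(training_signals)
    spread = stack.max(axis=0) - stack.min(axis=0)
    return {
        "outer": stack.max(axis=0) + margin * spread,  # outer bound
        "inner": stack.min(axis=0) - margin * spread,  # inner bound
        "mean": stack.mean(axis=0),
    }

def matches_signature(signal, signature):
    """A candidate tap matches if it stays inside the tolerance band."""
    return bool(np.all((signal <= signature["outer"]) &
                       (signal >= signature["inner"])))
```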
  • The cursor movement module 124 determines which cursor movement from a set of cursor movements to cause to be implemented, based at least in part on the analysis of the signal generated by the sensor 106.
  • For example, the cursor movement module 124 may select from a list or table of predefined cursor movements.
  • In examples, the cursor movement module 124 may be (or may include) an application programming interface (API) to third-party applications (e.g., text entry applications, gaming applications, computer aided design applications, etc.). The API may output specific commands (e.g., T1, B2, K1, F3, etc.) to the appropriate third-party application, which receives the commands and performs the corresponding actions.
  • In other examples, the cursor movement module 124 may determine a discrete cursor movement based at least in part on one or more of predictive discrete cursor movements, previous discrete cursor movements, key words or phrases within text proximate to the current cursor position, and other factors. In some examples, the cursor movement module 124 is configured to provide contextual cursor movement and non-contextual cursor movement, as described in further detail below.
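  • One way such an API layer might look is sketched below. The command codes T1, B2, K1, and F3 appear in the text, but the table pairing tap events with codes did not survive extraction, so the pairings and the callback interface here are purely illustrative assumptions.

```python
# Hypothetical command table; the (surface, tap count) -> code pairings are
# illustrative stand-ins for the table omitted from this text.
COMMAND_TABLE = {
    ("top", 1): "T1",
    ("bottom", 2): "B2",
    ("left", 1): "K1",
    ("right", 3): "F3",
}

def dispatch_cursor_command(surface, tap_count, send_to_app):
    """Look up a predefined command and forward it to the third-party app."""
    command = COMMAND_TABLE.get((surface, tap_count))
    if command is not None:
        send_to_app(command)  # the application interprets the command code
    return command
```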
  • FIG. 2 is a block diagram illustrating a computing system 200 having touch input analysis instructions 220 and cursor movement instructions 228 for providing contextual and non-contextual cursor movement according to examples of the present disclosure.
  • The computing system 200 may include a processing resource 202 that represents generally any suitable type or form of processing unit or units capable of processing data or interpreting and executing instructions.
  • The processing resource 202 may be one or more central processing units (CPUs), microprocessors, and/or other hardware devices suitable for retrieval and execution of instructions, such as instructions 220, 222, 224, 226, and 228.
  • The instructions, such as instructions 220, 222, 224, 226, and 228, may be stored, for example, on a memory resource, such as computer-readable storage medium 204, which may include any electronic, magnetic, optical, or other physical storage device that stores executable instructions.
  • The memory resource may be, for example, random access memory (RAM), electrically-erasable programmable read-only memory (EEPROM), a storage drive, an optical disk, or any other suitable type of volatile or non-volatile memory that stores instructions to cause a programmable processor to perform the techniques described herein.
  • In examples, the memory resource includes a main memory, such as a RAM in which the instructions may be stored during runtime, and a secondary memory, such as a nonvolatile memory in which a copy of the instructions is stored.
  • The computing system 200 may also include dedicated hardware, such as one or more integrated circuits, Application Specific Integrated Circuits (ASICs), Application Specific Special Processors (ASSPs), Field Programmable Gate Arrays (FPGAs), or any combination of the foregoing examples of dedicated hardware, for performing the techniques described herein.
  • Moreover, the computing system 200 may include multiple processing resources, or processing resources utilizing multiple processing cores.
  • The computing system 200 may include a sensor 206, which may represent one or more of a variety of different sensors, including accelerometers and gyroscopes.
  • In examples, the sensor 206 may be a single-axis or multi-axis accelerometer.
  • The multi-axis accelerometer may detect acceleration along the x-axis, along the y-axis, and/or along the z-axis.
  • In examples, detecting acceleration along an axis relates to detecting touch inputs on one of the six surfaces of the computing system (i.e., top, bottom, left, right, front, rear).
  • The sensor 206 generates a signal responsive to detecting the touch input (i.e., detecting an acceleration).
  • The touch input results in an impulse or sudden acceleration over a very short timeframe (a high amplitude, short duration pulse).
  • The impulse, as a signal, is distinct from typical movements experienced by such computing systems (e.g., jolting a phone while jogging, tilting the computing system to manipulate a feature of a game, etc.).
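  • A simple way to separate such an impulse from ordinary device motion is to threshold on both amplitude and duration, as in the hypothetical detector below; the threshold values and the sampling-rate handling are assumptions for illustration only.

```python
import numpy as np

def detect_tap(accel, fs, amp_thresh=2.0, max_duration_s=0.05):
    """Return the sample index of a tap-like impulse, or None.

    accel: 1-D acceleration trace for one axis (in g); fs: sample rate in Hz.
    """
    above = np.abs(accel) > amp_thresh
    if not above.any():
        return None
    start = int(np.argmax(above))          # first sample over the threshold
    end = start
    while end < len(accel) and above[end]:
        end += 1
    # A jog or a tilt stays above the threshold far longer than a tap does.
    return start if (end - start) / fs <= max_duration_s else None
```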
  • The computer-readable storage medium 204 is non-transitory in the sense that it does not encompass a transitory signal but instead is made up of one or more memory components configured to store the instructions 220, 222, 224, 226, and 228.
  • The instructions may include touch input analysis instructions 220, touch input signature generation instructions 222, de-noising instructions 224, statistical significance instructions 226, and cursor movement instructions 228.
  • The instructions of the computer-readable storage medium 204 may be executable so as to perform the techniques described herein.
  • The touch input analysis instructions 220 analyze signals generated by the sensor 206.
  • The signals correspond to a series of touch inputs detected by the sensor 206.
  • Each touch input is registered by the sensor 206, which generates a signal responsive to the touch input being detected.
  • The touch input analysis instructions 220 then analyze the signal generated by the sensor 206.
  • In examples, a series of touch inputs may be received on the computing system 200 and recognized by the sensor 206.
  • The sensor 206 may then generate a plurality of signals corresponding to each of the touch inputs.
  • The plurality of signals are then analyzed by the touch input analysis instructions 220.
  • The touch input signature generation instructions 222 generate a touch input signature based on the analysis of the signals corresponding to a detected series of training touch inputs. For example, the touch input signature generation instructions 222 compare training touch input signals, for example, by plotting the signals to find maximum, minimum, average, etc. values for the training touch input signals. From these values, a touch input signature may be generated.
  • In examples, the touch input signature may be represented as a tolerance band with an outer bound and an inner bound.
  • The de-noising instructions 224 may apply a discrete wavelet transform procedure to de-noise the signals generated by the sensor 206. Any noise present in the signals is reduced and/or removed by the de-noising procedure.
  • Figure 3 illustrates a signal generated by the sensor 206 and corresponding to a touch input received on the computing system 200. The signal includes noise, which may be undesirable. Consequently, the de-noising procedure may remove the noise from the signal. Figure 3 is discussed in more detail below.
  • The de-noising instructions 224 may also apply de-noising procedures other than the discrete wavelet transform procedure, such as other types of appropriate wavelet transforms, digital signal processing for time-frequency analysis, or any other suitable procedure such as Kalman filters, recursive least squares filters, a Bayesian mean square error procedure, etc. Moreover, in some examples, a custom data filtering procedure may be implemented.
  • The statistical significance instructions 226 determine whether a touch input signature is statistically significant. For example, statistical significance techniques may be applied to test the touch input signature and determine whether to accept or reject it. If the touch input signature is statistically significant, the generated touch input signature is stored in a data store, such as a touch input signature profiles database. The touch input signatures stored in the touch input signature profiles database may be useful for detecting touch inputs in the future, such as when determining whether to perform a discrete cursor movement. However, if the touch input signature is not statistically significant, new and/or additional training touch inputs may be utilized.
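  • The disclosure does not name a specific test, so the following acceptance check is only one plausible sketch: it requires the training taps themselves to sit inside the tolerance band (the signature dictionary from the earlier sketch) at an assumed 95% level.

```python
import numpy as np

def signature_is_significant(training_signals, signature, level=0.95):
    """Accept the signature only if the training taps lie inside its band."""
    stack = np.asarray(training_signals)
    # Fraction of samples inside the band, computed per training signal.
    inside = ((stack <= signature["outer"]) &
              (stack >= signature["inner"])).mean(axis=1)
    # Require `level` of the signals to be at least `level` inside the band.
    return bool(np.mean(inside >= level) >= level)
```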
  • The cursor movement instructions 228 determine which discrete cursor movement from a set of discrete cursor movements to cause to be implemented, based at least in part on the analysis of the signal generated by the sensor 206.
  • For example, the cursor movement instructions 228 may select from a list or table of predefined discrete cursor movements.
  • In other examples, the cursor movement instructions 228 may determine a discrete cursor movement based at least in part on one or more of predictive discrete cursor movements, previous discrete cursor movements, key words or phrases within text proximate to the current cursor position, and other factors. In some examples, the cursor movement instructions 228 are configured to provide contextual cursor movement and non-contextual cursor movement, as described in further detail below.
  • Figure 3 is a plot diagram illustrating a signal 300 corresponding to a touch input generated by a sensor in a computing system having a touch input analysis module and a cursor movement module according to examples of the present disclosure.
  • Figure 3 illustrates a typical impulse signal induced by a touch input (or "tap"), where "a" represents the amplitude of the touch input, "a′" represents the rebounding effect of the computing system in the opposite direction as a result of the touch input, "t" represents the duration of the touch input, and "t′" represents the duration of the rebounding effect after the touch input.
  • Each axis will behave similarly.
  • The time at which "a" is detected is the indication of the touch input. Due to ambient conditions, certain noise may be introduced in the impulse signal, as illustrated.
  • The values of "a", "a′", "t", and "t′" are evaluated against suitable thresholds to avoid false-positive and/or false-negative taps. It should be understood that the signal illustrated in Figure 3 is merely one possible response signal responsive to a touch input and that many variations on the illustrated signal are possible. The variety of signals produced may depend on, among other factors, the material the computing system is made from, the manner in which the user initiates and completes the touch input, the type of sensor used in the computing system, and environmental variables.
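  • As an illustration of how these four parameters might be estimated from a de-noised trace, consider the sketch below; the zero-crossing logic assumes a pulse shaped like the one in Figure 3 and is not taken from the disclosure.

```python
import numpy as np

def pulse_parameters(signal, fs):
    """Rough estimates of a, a', t, and t' from one de-noised tap trace."""
    peak = int(np.argmax(np.abs(signal)))
    sign = np.sign(signal[peak])
    a = float(np.abs(signal[peak]))
    after = signal[peak:]
    # The first sign change after the peak marks the end of the main pulse.
    crossings = np.nonzero(np.sign(after) != sign)[0]
    if crossings.size == 0:
        return a, 0.0, len(after) / fs, 0.0
    end_main = int(crossings[0])
    rebound = after[end_main:]
    a_prime = float(np.max(np.abs(rebound)))
    # The rebound ends when the trace returns to the original sign.
    back = np.nonzero(np.sign(rebound) == sign)[0]
    end_rebound = int(back[0]) if back.size else len(rebound)
    return a, a_prime, end_main / fs, end_rebound / fs
```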
  • Some examples disclosed herein are directed to controlling the position of a cursor on a computing system, such as computing system 100 or 200, in a contextual manner when the computing system is tapped.
  • One example is directed to a method for capturing the current content being displayed on the computing system, and moving the cursor to a location or object of interest if a contextual finger tap is detected.
  • An aspect of some examples is determining the context of the computing system (e.g., the user wants to correct a typographical error, click a button, or select an on-screen object). Subsequently, the finger taps against the computing system are processed as discussed above, and the location of the cursor is mapped to the screen location determined by the current context. This helps to simplify the user's interaction with the computing system by allowing the user to rapidly and precisely move a cursor to a contextually relevant location on the screen.
  • In some examples, a simple tap is not used for the purpose of a contextual cursor, as it may interfere with normal non-contextual finger tap navigation.
  • Instead, a contextual tap is a specific type of tap (e.g., a triple tap, or a tap at a certain location on the device) that results in seeking a contextual target for the cursor.
  • Such contextual taps may be application-specific, and may be stored in a database.
  • Each application on a given computing device may have one or more focus objects, i.e., objects that the user is most likely to seek.
  • Focus objects are application-specific. For example, in a text application, a typographical error may be a focus object, because the user wishes to correct it; in a typical user interface, a widget (e.g., a button) may be the focus object; in a game, a character may become the focus object.
  • Such focus objects indicate the most likely cursor position for the user's next action.
  • When multiple focus objects are displayed, the closest one may be identified as the most likely one that the user wishes to target.
  • However, the current location of the cursor may simply be too far from any focus object, and therefore out of context. Therefore, an application-specific database is used in some examples to define measures of proximity and to help infer whether an object is nearby. If the object is nearby, then the tap will move the cursor to the object; if the object is not nearby, the tap may be ignored.
  • If no application-specific database is available, default values may be used. If application-specific databases are used, some examples may simplify such databases by categorizing applications by application type.
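  • A hypothetical rendering of this proximity lookup follows: it picks the nearest focus object but ignores the tap when that object exceeds the application-specific distance limit. The database schema, the pixel units, and the default value are assumptions.

```python
import math

# Hypothetical application-specific proximity limits (pixels) and a default.
PROXIMITY_DB = {"text_editor": 120, "game": 300}
DEFAULT_PROXIMITY = 150

def find_focus_object(cursor_pos, focus_objects, app_type):
    """Pick the nearest focus object, or None if nothing is nearby.

    cursor_pos is an (x, y) tuple; focus_objects yields (object, (x, y)) pairs.
    """
    limit = PROXIMITY_DB.get(app_type, DEFAULT_PROXIMITY)
    best, best_dist = None, float("inf")
    for obj, pos in focus_objects:
        d = math.dist(cursor_pos, pos)
        if d < best_dist:
            best, best_dist = obj, d
    # An object beyond the limit is out of context, so the tap is ignored.
    return best if best_dist <= limit else None
```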
  • FIG. 4 is a diagram illustrating a method 400 for using contextual finger taps in a computing system according to examples of the present disclosure.
  • In examples, method 400 may be performed by computing system 100 or 200.
  • A finger tap application of the computing system is in a listen mode until the condition "Finger Tap Detected" is true.
  • When a finger tap is detected, the computing system determines at 406 whether the detected finger tap is a contextual finger tap. In some examples, the computing system uses a database of contextual tap patterns 404 to help in making the determination at 406. If it is determined at 406 that the detected finger tap is non-contextual, the method 400 moves to 414 to resume normal operation. If it is determined at 406 that the detected finger tap is contextual, the method 400 moves to 408 to search for a contextual focus object near the current cursor position.
  • At 412, the computing system determines if there is a contextual focus object near the current cursor position. In some examples, the computing system uses a database of proximity measures 410 to help in making the determination at 412. If it is determined at 412 that there is not a contextual focus object near the current cursor position, the method 400 moves to 414 to resume normal operation. If it is determined at 412 that there is a contextual focus object near the current cursor position, the method 400 moves to 418 to move the cursor to the nearby contextual focus object. Next, at 416, the computing system automatically invokes an action at the nearby contextual focus object. In some examples, the computing system uses a database of contextual actions 420 to help in identifying an appropriate action to invoke based on the current context. After the action is invoked at 416, the method 400 moves to 414 to resume normal operation.
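  • Method 400 can be rendered compactly as the following Python-flavored pseudocode, reusing the hypothetical find_focus_object helper from the earlier sketch; the tap, app, and database objects are stand-ins, and the reference numerals in the comments map back to the flow just described.

```python
def handle_finger_tap(tap, app, cursor_pos, contextual_patterns,
                      contextual_actions):
    # 406: is this tap one of this application's contextual tap patterns?
    if tap.pattern not in contextual_patterns.get(app.type, set()):
        return "resume"                                   # 414: normal operation
    # 408/412: look for a contextual focus object near the cursor.
    target = find_focus_object(cursor_pos, app.focus_objects(), app.type)
    if target is None:
        return "resume"                                   # 414: nothing nearby
    app.move_cursor_to(target)                            # 418: move the cursor
    action = contextual_actions.get((app.type, type(target).__name__))
    if action is not None:
        action(target)                                    # 416: auto-invoke action
    return "resume"                                       # 414
```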
  • Figures 5A-5E are diagrams illustrating a text editing example with contextual cursor movement according to examples of the present disclosure.
  • Figures 5A-5E show the content of five display screens 500(1)-500(5), respectively, of a computing system, such as mobile phone 502, at various stages of text editing performed by a user.
  • The content displayed on each of the display screens 500(1)-500(5) includes a text entry region 506, which allows a user to enter and edit text using on-screen keyboard 508.
  • The text editing process begins with display screen 500(1) in Figure 5A, in which the cursor is positioned at the right end of the last (i.e., third) line of text in the text entry region 506.
  • A user then performs a contextual tap on the right side of the mobile phone 502, as indicated by hand 504 in the figure.
  • Mobile phone 502 detects the tap, determines that the tap is a contextual tap, identifies a focus object near the current cursor position, and moves the cursor to the identified focus object.
  • In this case, the identified focus object is the misspelled word "nrw".
  • In examples, the cursor may be moved to the rightmost side of the word containing the error, or a prediction step to predict the correct spelling may be performed to more precisely place the cursor at the exact location of the error, such as between two letters within the word.
  • In Figure 5B, the cursor has been moved to the right edge of the identified focus object (i.e., to the right edge of "nrw"), so that the user can use the backspace button and correct the misspelled word.
  • In Figure 5C, the misspelled word "nrw" has been corrected by the user.
  • The user then performs another contextual tap on the right side of the mobile phone 502, as indicated by hand 504.
  • Mobile phone 502 again detects the tap, determines that it is a contextual tap, identifies a focus object near the current cursor position, and moves the cursor to the identified focus object.
  • This time, the identified focus object is the misspelled word "tobihht".
  • In Figure 5D, the cursor has been moved to the right edge of the identified focus object (i.e., to the right edge of "tobihht"), so that the user can use the backspace button and correct the misspelled word.
  • Figure 5E shows an example in which an action is automatically invoked in response to the contextual tap shown in Figure 5C. Specifically, in response to the contextual tap, the cursor is moved to the misspelled word "tobihht", this word is highlighted, and a dropdown list of potential corrections 510 is automatically displayed.
  • In other examples, each button in a user interface can be a focus object, and contextual taps may be used to move between buttons.
  • In some examples, a contextual touch input is not a simple tap, as a simple tap may interfere with normal non-contextual finger tap navigation.
  • Instead, a contextual touch input is a specific type of tap (e.g., a triple tap, or a tap at a certain location on the device) that results in seeking a contextual target for the cursor.
  • Such contextual touch inputs may be application-specific and may be stored in a database.
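  • For instance, the triple-tap pattern mentioned above could be recognized with something as small as the check below, where the time window is an assumed value.

```python
def is_contextual_triple_tap(tap_times, window_s=0.6):
    """tap_times: ascending timestamps (seconds) of recently detected taps."""
    # Three taps landing within the window count as the contextual pattern.
    return len(tap_times) >= 3 and (tap_times[-1] - tap_times[-3]) <= window_s
```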
  • Figure 6 is a flow diagram illustrating a method 600 for performing a discrete cursor movement based on a contextual touch input according to examples of the present disclosure.
  • In examples, computing system 100 or 200 is configured to perform method 600.
  • At 602, the computing system generates a signal corresponding to a touch input received on a surface of the computing system. At 604, the computing system determines if the touch input is a contextual touch input intended to move a cursor of the computing system based on a current context of an application being executed by the computing system. At 606, the computing system generates a discrete cursor movement to move the cursor to a displayed object based on the current context when it is determined that the touch input is a contextual touch input.
  • In some examples, method 600 further includes automatically invoking an action, by the computing system, when it is determined that the touch input is a contextual touch input; and accessing, by the computing system, a database of contextual actions to facilitate identification of the action to be automatically invoked.
  • In some examples, the database of contextual actions defines application-specific actions for multiple applications.
  • In some examples, the method 600 further includes accessing, by the computing system, a database of contextual tap patterns to facilitate the determining, by the computing system, if the touch input is a contextual touch input, wherein the database of contextual tap patterns defines application-specific tap patterns for multiple applications.
  • In some examples, the method 600 further includes accessing, by the computing system, a database of proximity measures to facilitate identification of the displayed object, wherein the database of proximity measures defines application-specific proximity measures for multiple applications.
  • In some examples, the touch input in method 600 is a multiple touch input.
  • In some examples, determining if the touch input is a contextual touch input includes determining upon which surface of the computing system the touch input was received.
  • Another example of the present disclosure is directed to a computing system that includes a sensor to generate a signal corresponding to a detected touch input on the computing system, wherein the detected touch input is a contextual touch input intended to move a cursor of the computing system based on a current context of an application being executed by the computing system.
  • The computing system also includes a touch input analysis module to analyze the signal generated by the sensor, and a cursor movement module to move the cursor to a displayed object based on the current context and based on the analysis of the signal generated by the sensor.
  • In some examples, the sensor comprises an accelerometer.
  • In some examples, the computing system further includes a database of contextual actions to facilitate identification of an action to be automatically invoked by the computing system in response to the contextual touch input.
  • Yet another example of the present disclosure is directed to a non-transitory computer-readable storage medium storing instructions that, when executed by a processor, cause the processor to: analyze a signal generated by a sensor of a computing system, the signal corresponding to a touch input received on a surface of the computing system; determine if the touch input is a contextual touch input intended to move a cursor of the computing system based on a current context of an application being displayed by the computing system; and generate a discrete cursor movement to move the cursor to a displayed object based on the current context and based on the analysis of the signal generated by the sensor.
  • In some examples, the instructions, when executed by the processor, further cause the processor to access a database of contextual tap patterns to facilitate determining if the touch input is a contextual touch input.
  • Examples disclosed herein provide context-sensitive touch input for computing systems, such as mobile devices. Examples disclosed herein save the user time and frustration where a discrete or high precision input is desired in a specific context (e.g., a typographical error), leading to the availability of a broader class of applications, higher productivity, and less frustration. Examples disclosed herein enhance the mobile user experience, increase a mobile device's functionality without any hardware changes, and support a new class of applications that leverage precise and discrete input.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Examples of the invention relate to a method in which a computing system: generates a signal corresponding to a touch input received on a surface of the computing system; determines if the touch input is a contextual touch input intended to move a cursor of the computing system based on a current context of an application being executed by the computing system; and generates a discrete cursor movement to move the cursor to a displayed object based on the current context when it is determined that the touch input is a contextual touch input.
PCT/US2016/015557 2016-01-29 2016-01-29 Cursor movement based on a context WO2017131728A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/US2016/015557 WO2017131728A1 (fr) 2016-01-29 2016-01-29 Cursor movement based on a context

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/US2016/015557 WO2017131728A1 (fr) 2016-01-29 2016-01-29 Cursor movement based on a context

Publications (1)

Publication Number Publication Date
WO2017131728A1 (fr) 2017-08-03

Family

ID=59398400

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2016/015557 WO2017131728A1 (fr) 2016-01-29 2016-01-29 Cursor movement based on a context

Country Status (1)

Country Link
WO (1) WO2017131728A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100287486A1 (en) * 2009-05-07 2010-11-11 Microsoft Corporation Correction of typographical errors on touch displays
US20110169731A1 (en) * 2007-08-23 2011-07-14 Kyocera Corporation Input apparatus
US20110179353A1 (en) * 2010-01-19 2011-07-21 Research In Motion Limited Mobile Electronic Device and Associated Method Providing Proposed Spelling Corrections Based Upon a Location of Cursor At or Adjacent a Character of a Text Entry
US20150046865A1 (en) * 2010-12-22 2015-02-12 Xiaorui Xu Touch screen keyboard design for mobile devices
WO2015123835A1 (fr) * 2014-02-20 2015-08-27 Nokia Technologies Oy Cursor positioning


Similar Documents

Publication Publication Date Title
JP6429981B2 (ja) Classification of user input intent
JP6602372B2 (ja) Inactive region of a touch surface based on contextual information
US9594504B2 (en) User interface indirect interaction
US10235039B2 (en) Touch enhanced interface
US10331219B2 (en) Identification and use of gestures in proximity to a sensor
US8922489B2 (en) Text input using key and gesture information
CN105359083B (zh) Dynamic management of edge inputs by a user on a touch device
US20140282269A1 (en) Non-occluded display for hover interactions
US20170090749A1 (en) Systems and Methods for Disambiguating Intended User Input at an Onscreen Keyboard Using Dual Strike Zones
JP2011248888A (ja) Method for user gestures on dual screens and dual-screen device
US20090256803A1 (en) System and method for providing simulated mouse drag and click functions for an electronic device
US20160077631A1 (en) Information processing apparatus, method for controlling information processing apparatus, and storage medium
US20130038552A1 (en) Method and system for enhancing use of touch screen enabled devices
US9430035B2 (en) Interactive drawing recognition
US10175779B2 (en) Discrete cursor movement based on touch input
US20100245266A1 (en) Handwriting processing apparatus, computer program product, and method
US20170336881A1 (en) Discrete cursor movement based on touch input region
WO2017131728A1 (fr) 2017-08-03 Cursor movement based on a context
EP3195097B1 (fr) Generating a touch input signature for discrete cursor movement
KR102205235B1 (ko) Method of operating a favorites mode and device including a touch screen for performing the same
CN110945470A (zh) Programmable multi-touch on-screen keyboard

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16888458

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16888458

Country of ref document: EP

Kind code of ref document: A1