WO2015135123A1 - System and method for correcting the position of a touch input - Google Patents

System and method for correcting the position of a touch input

Info

Publication number: WO2015135123A1
Authority: WO (WIPO/PCT)
Prior art keywords: window size, data points, touch, raw data, error
Application number: PCT/CN2014/073180
Other languages: French (fr)
Inventors: Yibo Jiang, Junchen Du, Jian Li, William Yee-Ming Huang
Original assignee: Qualcomm Incorporated
Application filed by Qualcomm Incorporated
Priority to PCT/CN2014/073180 (WO2015135123A1)
Priority to PCT/CN2014/091363 (WO2015135336A1)
Publication of WO2015135123A1

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03: Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041: Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0416: Control or interface arrangements specially adapted for digitisers
    • G06F3/04166: Details of scanning methods, e.g. sampling time, grouping of sub areas or time sharing with display driving
    • G06F3/0418: Control or interface arrangements specially adapted for digitisers for error correction or compensation, e.g. based on parallax, calibration or alignment
    • G06F3/04182: Filtering of noise external to the device and not generated by digitiser components
    • G06F3/04186: Touch location disambiguation
    • G06F3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488: Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, e.g. input of commands through traced gestures

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

System and method for correcting the position of a touch input. In one aspect, the system and method (200) include: receiving a plurality of raw data points corresponding to a plurality of touch events (202); determining a set of fitted data points based on a linear fitting method, the plurality of raw data points, and a window size (204), the window size defining the number of the plurality of raw data points used in the linear fitting method; determining an error between at least one of the fitted data points and its corresponding raw data point (206); and updating the window size based on the error (208, 210, 212).

Description

SYSTEMS AND METHODS FOR IMPROVED TOUCH SENSOR LINEARITY
Field
[0001] The present application relates generally to touch devices, and more specifically to systems, methods, and devices for improving the linearity of touch sensors.
BACKGROUND
[0002] Advances in technology have resulted in smaller and more powerful computing devices. For example, there currently exist a variety of portable computing devices, including wireless computing devices such as wireless telephones, personal digital assistants (PDAs), and tablet computers that are small, lightweight, and easily carried by users. In order to simplify user interfaces and to avoid pushbuttons and complex menu systems, such portable computing devices may use touch screen displays that detect user gestures on the touch screen and translate the detected gestures into commands to be performed by the device. Such gestures may be performed using one or more fingers or a stylus type pointing implement. Multi-touch screens (touch screens having multi-touch capability) are designed to recognize and track several simultaneous touches. For example, when a user moves two fingers on a screen, information indicating touch/movement for both fingers is provided by a multi-touch screen.
[0003] One drawback of implementing multi-touch technology on portable computing devices is the processing overhead typically required for recognizing multi-touch. Processing overhead refers to the share of the total work capacity of the device's central processing unit (CPU) that is consumed by individual computing tasks, such as touch detection; collectively, these tasks must require less than the processor's overall capacity. Simple touch gestures may typically be handled by a touchscreen controller, which is a separate processor associated with the touch screen, but more complex touch gestures require the use of a secondary processor, often the mobile device's CPU, to process large amounts of touch data. Typically, large amounts of touch data must be processed to determine the nature of the touch, sometimes only to conclude that a touch was a "false positive," consuming large amounts of CPU capacity and device power. The processing overhead required for complex touch recognition may require a large percentage of the overall CPU capacity, impairing device performance.
[0004] The current generation of mobile processors is not well adapted to deal with increasing touch complexity and corresponding CPU overhead, especially in conjunction with the many other common high performance uses of mobile devices. Increasing the size of the mobile processor core or cache delivers performance increases only up to a certain level, beyond which heat dissipation issues make any further increase in core and cache size impractical. Overall processing capacity is further limited by the smaller size of many mobile devices, which limits the number of processors that can be included in the device. Additionally, because mobile computing devices are generally battery-powered, high-performance uses also shorten battery life.
[0005] Despite mobile processing limitations, many common mobile applications such as maps, games, email clients, web browsers, etc., are making increasingly complex use of touch recognition. Further, touch processing complexity increases proportional to touch-node capacity, which in turn increases proportional to display size. Therefore, because there is a trend in many portable computing devices toward increasing display size and touch complexity, touch processing is increasingly reducing device performance and threatening battery life. Further, user interaction with a device through touch events is highly sensitive to latency, and user experience can suffer from low throughput interfaces between the touchscreen panel and the host processor resulting in processing delay and response lag.
SUMMARY
[0006] The systems, methods, devices, and computer program products discussed herein each have several aspects, no single one of which is solely responsible for its desirable attributes. Without limiting the scope of this invention as expressed by the claims which follow, some features are discussed briefly below. After considering this discussion, and particularly after reading the section entitled "Detailed Description," it will be understood how the features of the disclosed methods and systems provide advantages that include improved touch sensor linearity.
[0007] One aspect of the disclosure provides a method of correcting the position of a touch input. The method includes receiving a plurality of raw data points corresponding to a plurality of touch events; determining a set of fitted data points based on a linear fitting method, the plurality of raw data points, and a window size, the window size defining the number of the plurality of raw data points used in the linear fitting method; determining an error between at least one of the fitted data points and its corresponding raw data point; and updating the window size based on the error.
[0008] In some aspects, the method also includes receiving a second plurality of raw data points corresponding to a second touch input; and determining a second set of fitted data points based on the linear fitting method, the second plurality of raw data points, and the updated window size. In some aspects, updating the window size comprises reducing the window size if the error is greater than an error threshold and increasing the window size if the error is less than the error threshold. In some aspects, increasing the window size comprises adding a step size to the window size. In some aspects, decreasing the window size comprises updating the window size to be a percentage of the window size. In some aspects, the plurality of raw data points represent a sequence of touch inputs across a touch screen. In some aspects, each of the plurality of data points comprises an x and y value. In some aspects, the method also includes determining a touch input position of one of the plurality of touch events based on the set of fitted data points.
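For illustration only, the method summarized above can be sketched in a few lines of Python. The function name, the NumPy-based least-squares fit, and the constants ERROR_THRESHOLD, GROW_STEP, and SHRINK_FACTOR are assumptions introduced for this sketch and are not taken from the application, which does not prescribe a particular implementation or tuning values.

```python
# Minimal sketch, assuming NumPy: fit the last `window_size` raw points with a
# line, measure the fit error, then grow or shrink the window for the next pass.

import numpy as np

ERROR_THRESHOLD = 4.0   # pixels; illustrative value only
GROW_STEP = 1           # "step size" added when the fit tracks the input well
SHRINK_FACTOR = 0.5     # "percentage" of the window kept when the fit is poor


def correct_touch_points(raw_points, window_size):
    """raw_points: (N, 2) array of (x, y) touch samples; window_size: M, with 2 <= M <= N."""
    pts = np.asarray(raw_points, dtype=float)[-window_size:]
    t = np.arange(len(pts))
    # Independent least-squares line fits of x(t) and y(t) over the window -> fitted points L(n).
    fitted = np.column_stack(
        [np.polyval(np.polyfit(t, pts[:, axis], 1), t) for axis in range(2)]
    )
    # Error between the newest raw point P(N) and its fitted point L(N).
    error = float(np.linalg.norm(pts[-1] - fitted[-1]))
    if error > ERROR_THRESHOLD:
        window_size = max(2, int(window_size * SHRINK_FACTOR))
    else:
        window_size = window_size + GROW_STEP
    return fitted, window_size
```

A caller would invoke correct_touch_points once per reporting interval, report the fitted points as the corrected positions, and feed the returned window size back in when the second plurality of raw data points arrives.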
[0009] Another aspect disclosed is an apparatus for correcting the position of a touch input. The apparatus includes a processor, a touch device, and a memory operably connected to the processor and configured to store processor instructions that configure the processor to: receive a plurality of raw data points corresponding to a plurality of touch events on the touch device; determine a set of fitted data points based on a linear fitting method, the plurality of raw data points, and a window size, the window size defining the number of the plurality of raw data points used in the linear fitting method; determine an error between at least one of the fitted data points and its corresponding raw data point; and update the window size by reducing the window size if the error is greater than an error threshold and increasing the window size if the error is less than the error threshold.
[0010] In some aspects, the memory stores instructions that further configure the processor to receive a second plurality of raw data points corresponding to a second touch input, and determine a second set of fitted data points based on the linear fitting method, the second plurality of raw data points, and the updated window size. In some aspects, the memory stores instructions that further configure the processor to update the window size by reducing the window size if the error is greater than an error threshold and increasing the window size if the error is less than the error threshold. In some aspects, the memory stores instructions that further configure the processor to increase the window size by adding a step size to the window size. In some aspects, the memory stores instructions that further configure the processor to decrease the window size by updating the window size to be a percentage of the window size. In some aspects, the plurality of raw data points represent a sequence of touch inputs across a touch screen. In some aspects, each of the plurality of data points comprises an x and y value. In some aspects, the memory stores instructions that further configure the processor to determine a touch input position of one of the plurality of touch events based on the set of fitted data points.
BRIEF DESCRIPTION OF THE DRAWINGS
[0011] FIG. 1A illustrates a functional block diagram of a wireless device that may be employed within a wireless communication system, in accordance with one embodiment of the methods and systems disclosed.
[0012] FIG. 1B illustrates a block diagram of a mobile computing device 300 in accordance with one embodiment of the methods and systems disclosed.
[0013] FIG. 2 is a flowchart of a method for improving touch linearity on a touch device.
DETAILED DESCRIPTION
[0014] The word "exemplary" is used herein to mean "serving as an example, instance, or illustration." Any embodiment described herein as "exemplary" is not necessarily to be construed as preferred or advantageous over other embodiments. Various aspects of the novel systems, apparatuses, and methods are described more fully hereinafter with reference to the accompanying drawings. This disclosure may, however, be embodied in many different forms and should not be construed as limited to any specific structure or function presented throughout this disclosure. Rather, these aspects are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the disclosure to those skilled in the art. Based on the teachings herein one skilled in the art should appreciate that the scope of the disclosure is intended to cover any aspect of the novel systems, apparatuses, and methods disclosed herein, whether implemented independently of, or combined with, any other aspect of the invention. For example, an apparatus may be implemented or a method may be practiced using any number of the aspects set forth herein. In addition, the scope of the invention is intended to cover such an apparatus or method which is practiced using other structure, functionality, or structure and functionality in addition to or other than the various aspects of the invention set forth herein. It should be understood that any aspect disclosed herein may be embodied by one or more elements of a claim.
[0015] Although particular aspects are described herein, many variations and permutations of these aspects fall within the scope of the disclosure. Although some benefits and advantages of the preferred aspects are mentioned, the scope of the disclosure is not intended to be limited to particular benefits, uses, or objectives. Rather, aspects of the disclosure are intended to be broadly applicable to different wireless technologies, system configurations, networks, and transmission protocols, some of which are illustrated by way of example in the figures and in the following description of the preferred aspects. The detailed description and drawings are merely illustrative of the disclosure rather than limiting, the scope of the disclosure being defined by the appended claims and equivalents thereof.
[0016] FIG. 1A illustrates various components that may be utilized in a wireless device 202 that may be employed within a wireless communication system. The wireless device 202 is an example of a device that may be configured to implement the various methods described herein.
[0017] The wireless device 202 may include a processor 204 which controls operation of the wireless device 202. The processor 204 may also be referred to as a central processing unit (CPU). Memory 206, which may include both read-only memory (ROM) and random access memory (RAM), may provide instructions and data to the processor 204. A portion of the memory 206 may also include non-volatile random access memory (NVRAM). The processor 204 typically performs logical and arithmetic operations based on program instructions stored within the memory 206. The instructions in the memory 206 may be executable to implement the methods described herein.
[0018] The processor 204 may comprise or be a component of a processing system implemented with one or more processors. The one or more processors may be implemented with any combination of general-purpose microprocessors, microcontrollers, digital signal processors (DSPs), field programmable gate array (FPGAs), programmable logic devices (PLDs), controllers, state machines, gated logic, discrete hardware components, dedicated hardware finite state machines, or any other suitable entities that can perform calculations or other manipulations of information.
[0019] The processing system may also include machine-readable media for storing software. Software shall be construed broadly to mean any type of instructions, whether referred to as software, firmware, middleware, microcode, hardware description language, or otherwise. Instructions may include code (e.g., in source code format, binary code format, executable code format, or any other suitable format of code). The instructions, when executed by the one or more processors, cause the processing system to perform the various functions described herein.
[0020] The wireless device 202 may also include a housing 208 that may include a transmitter 210 and/or a receiver 212 to allow transmission and reception of data between the wireless device 202 and a remote location. The transmitter 210 and receiver 212 may be combined into a transceiver 214. An antenna 216 may be attached to the housing 208 and electrically coupled to the transceiver 214. The wireless device 202 may also include (not shown) multiple transmitters, multiple receivers, multiple transceivers, and/or multiple antennas.
[0021] The transmitter 210 may be configured to wirelessly transmit packets having different packet types or functions. For example, the transmitter 210 may be configured to transmit packets of different types generated by the processor 204. When the wireless device 202 is implemented or used as an access point or station, the processor 204 may be configured to process packets of a plurality of different packet types. For example, the processor 204 may be configured to determine the type of packet and to process the packet and/or fields of the packet accordingly.
[0022] The receiver 212 may be configured to wirelessly receive packets having different packet types. In some aspects, the receiver 212 may be configured to detect a type of a packet used and to process the packet accordingly.
[0023] The wireless device 202 may also include a signal detector 218 that may be used in an effort to detect and quantify the level of signals received by the transceiver 214. The signal detector 218 may detect such signals as total energy, energy per subcarrier per symbol, power spectral density and other signals. The wireless device 202 may also include a digital signal processor (DSP) 220 for use in processing signals. The DSP 220 may be configured to generate a packet for transmission. In some aspects, the packet may comprise a physical layer data unit (PPDU).
[0024] The wireless device 202 may further comprise a user interface 222 in some aspects. The user interface 222 may comprise a keypad, a microphone, a speaker, and/or a display, including a touch display in some aspects. The user interface 222 may include any element or component that conveys information to a user of the wireless device 202 and/or receives input from the user.
[0025] The various components of the wireless device 202 may be coupled together by a bus system 226. The bus system 226 may include a data bus, for example, as well as a power bus, a control signal bus, and a status signal bus in addition to the data bus. The components of the wireless device 202 may be coupled together or accept or provide inputs to each other using some other mechanism.
[0026] Although a number of separate components are illustrated in FIG. 1A, one or more of the components may be combined or commonly implemented. For example, the processor 204 may be used to implement not only the functionality described above with respect to the processor 204, but also to implement the functionality described above with respect to the signal detector 218 and/or the DSP 220. Further, each of the components illustrated in FIG. 1A may be implemented using a plurality of separate elements.
[0027] FIG. 1B illustrates a block diagram of a mobile computing device 300 in accordance with one embodiment of the methods and systems disclosed. The device 300 comprises a display 310, a touch screen subsystem 320, and a host processor 340. The illustrated embodiment is not meant to be limitative and device 300 may include a variety of other components as required for other functions.
[0028] The display 310 of device 300 may include a touch screen panel 312 and a display component 314. In certain embodiments, display component 314 may be any flat panel display technology, such as an LED, LCD, plasma, or projection screen. Display component 314 may be coupled to the host processor 340 for receiving information for visual display to a user. Such information includes, but is not limited to, visual representations of files stored in a memory of device 300, software applications installed on device 300, user interfaces, and network-accessible content objects.
[0029] Touch screen panel 312 may employ one or a combination of many touch sensing technologies, for instance capacitive, resistive, surface acoustic wave, or optical touch sensing. In some embodiments, touch screen panel 312 may overlay or be positioned over display component 314 such that visibility of the display component 314 is not impaired. In other embodiments, the touch screen panel 312 and display component 314 may be integrated into a single panel or surface. The touch screen panel 312 may be configured to cooperate with display component 314 such that a user touch on the touch screen panel 312 is associated with a portion of the content displayed on display component 314 corresponding to the location of the touch on touch screen panel 312. Display component 314 may also be configured to respond to a user touch on the touch screen panel 312 by displaying, for a limited time, a visual representation of the touch.
[0030] Touch screen panel 312 may be coupled to a touch screen subsystem 320, the touch screen subsystem 320 comprising a touch detection module 322 and a processing module 324. The touch screen panel 312 may cooperate with touch screen subsystem 320 to enable device 300 to sense the location, pressure, direction and/or shape of a user touch or touches on display 310. The touch detection module 322 may include instructions that, when executed, scan the area of the touch screen panel 312 for touch events and provide the coordinates of those events to the processing module 324. In some embodiments, the touch detection module 322 may be an analog touch screen front end module comprising a plurality of software drivers.
[0031] The processing module 324 of the touch screen subsystem 320 may be configured to analyze touch events and to communicate touch data to host processor 340. The processing module 324 may, in some embodiments, include instructions that when executed act as a touch screen controller (TSC). The specific type of TSC employed will depend upon the type of touch technology used in panel 312. The processing module 324 may be configured to start up when the touch detection module 322 indicates that a user has touched touch screen panel 312 and to power down after release of the touch. This feature may be useful for power conservation in battery-powered devices such as mobile computing device 300.
[0032] Processing module 324 may be configured to perform filtering on touch event data received from the touch detection module 322. For example, in a display 310 where the touch screen panel 312 is placed on top of a display component 314 comprising an LCD screen, the LCD screen may contribute noise to the coordinate position measurement of the touch event. This noise is a combination of impulse noise and Gaussian noise. The processing module 324 may be configured with median and averaging filters to reduce this noise. Instead of using only a single sample for the coordinate measurement of the touch event, the processing module 324 may be programmed to instruct the touch detection module 322 to provide two, four, eight, or 16 samples. These samples may then be sorted, median filtered, and averaged to give a lower noise, more accurate result of the touch coordinates.
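A minimal sketch of this multi-sample filtering, assuming Python and a median-centered trim followed by averaging, is shown below. The function name, the number of samples kept, and the example coordinates are illustrative assumptions rather than details taken from the application.

```python
# Hedged sketch of the multi-sample noise reduction described above: take several
# raw samples per touch, keep the samples closest to the median, and average them.

from statistics import median


def filter_touch_samples(samples, keep=4):
    """samples: list of (x, y) readings for one touch event (e.g. 8 or 16 of them)."""
    def robust_axis(values):
        m = median(sorted(values))
        # Keep the `keep` readings closest to the median, then average them.
        closest = sorted(values, key=lambda v: abs(v - m))[:keep]
        return sum(closest) / len(closest)

    xs, ys = zip(*samples)
    return robust_axis(xs), robust_axis(ys)


# Example: 8 noisy samples near (100, 240), including one impulse outlier.
print(filter_touch_samples([(99, 241), (101, 239), (100, 240), (160, 300),
                            (98, 242), (102, 238), (100, 241), (99, 240)]))
# -> (99.5, 240.5); the (160, 300) outlier is rejected by the median trim.
```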
[0033] The processing module 324 is a processor specifically configured for use with the touch screen subsystem 320, while the host processor 340 may be configured to handle the general processing requirements of device 300. The processing module 324 and the host processor 340 may be in communication with each other.
[0034] FIG. 2 is a flowchart of a method for improving touch linearity on a touch device. The method 200 may be performed in some aspects on either the wireless device 202 of FIG. 1A or the device 300 of FIG. 1B. For example, the method 200 may be performed by the processor 204 or the host processor 340.
[0035] In block 202, input is received. In some aspects, the input may comprise a plurality of raw data points. For example, the input may comprise a sequence of N points, P(1), P(2), ..., P(N). In some aspects, the plurality of points represent locations of sequential touch events on a touch screen.
[0036] In block 204, a linear fit is performed on the input. The linear fit is performed based on a window size. In some aspects, an initial condition will set the window size to a value of zero (0). The window size may represent a number of points received in block 202 that are used in the linear fit. For example, while the input may comprise N points, the window size may be M points, where M < N.
[0037] In some aspects, the results of the linear fit operation include a series of points, such as L(1), L(2), L(3), ..., L(M). Each of the points L(n) may correspond to a point P(n) within the plurality of raw data points.
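One way block 204 might be realized, assuming NumPy and treating the sample index as the regressor for independent x and y line fits, is sketched below; the function name is illustrative only and the application does not mandate this particular fitting routine.

```python
# A possible realization of block 204: fit x(n) and y(n) of the last M raw points
# with independent least-squares lines and evaluate the lines at each sample index
# to obtain the fitted points L(1)..L(M).

import numpy as np


def linear_fit_window(raw_points, window_size):
    pts = np.asarray(raw_points, dtype=float)[-window_size:]   # P(1)..P(M)
    n = np.arange(len(pts))                                    # sample index as regressor
    fitted = np.empty_like(pts)
    for axis in (0, 1):                                        # 0: x values, 1: y values
        slope, intercept = np.polyfit(n, pts[:, axis], 1)
        fitted[:, axis] = slope * n + intercept                # L(n) lies on the fitted line
    return pts, fitted                                         # fitted[n] corresponds to pts[n]
```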
[0038] In block 206, an error in the linear fit is determined. In some aspects, the error is determined based on a difference between one or more of the points P(1...M) and their corresponding points L(1...M) generated by the linear fit. For example, in some aspects, an error E(M) is determined as shown below in equation 1:
E(M) = Euclid(P(1), L(1)) + Euclid(P(2), L(2)) + ... + Euclid(P(M), L(M))     (1)
Where: Euclid() is a function that determines the Euclidean distance between two parameters,
P(n) is a raw data point received in block 202 corresponding to time (n), and
L(n) is a fitted data point provided by the linear fit method that corresponds to the raw data point P(n).
[0039] Other embodiments may use a different equation for determining E(M). For example, in some aspects, E(M) may represent an error between only the most recently received touch event in the plurality of raw data points, for example, P(N), and its corresponding fitted data point L(N). In some aspects, this error is:
[0040] E(M) = Euclid(P(N), L(N)) (2)
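Both error definitions can be sketched as follows, assuming Python; the helper names euclid, window_error, and latest_point_error are illustrative assumptions and not terms used by the application.

```python
# Sketch of block 206 under the two error definitions above: equation (1) sums the
# Euclidean distances over the whole window, equation (2) uses only the newest point.

import math


def euclid(p, f):
    """Euclidean distance between a raw point p and its fitted point f."""
    return math.hypot(p[0] - f[0], p[1] - f[1])


def window_error(raw, fitted):
    """Equation (1): accumulate Euclid(P(n), L(n)) over the M points in the window."""
    return sum(euclid(p, f) for p, f in zip(raw, fitted))


def latest_point_error(raw, fitted):
    """Equation (2): distance between the newest raw point P(N) and its fitted point L(N)."""
    return euclid(raw[-1], fitted[-1])
```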
[0041] In decision block 208, E(M) is compared to an error threshold. If E(M) is greater than the error threshold, process 200 moves to block 212, where the window size is reduced. In some aspects, the decreased window size may be determined by the following equation:
W(n+1) = W(n) * a (3) where:
W(n) is the window size at a time n
a is a value in the range [0, 1]
[0042] If the error, for example E(M), is less than the error threshold, process 200 moves to block 210, where the window size is increased. In some aspects, the window size may be increased by a value delta (Δ).
[0043] Various embodiments may move to either block 212 or block 210 when the error equals the threshold. After blocks 210 or 212 are performed, process 200 returns to block 202.
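Blocks 208, 210, and 212 can be sketched as a small update function, assuming Python; the threshold, the factor a, and the step delta shown are example values only, as the application does not fix them.

```python
# Sketch of decision block 208 and blocks 210/212: shrink the window by the factor
# `a` when the error exceeds the threshold, otherwise grow it by `delta`.

def update_window(window, error, threshold, a=0.5, delta=1):
    """Return W(n+1) given W(n) and the fit error E(M)."""
    if error > threshold:
        new_window = int(window * a)        # block 212: W(n+1) = W(n) * a
    else:
        new_window = window + delta         # block 210: W(n+1) = W(n) + delta
    return max(new_window, 2)               # keep at least two points for a line fit


# Example: a large error halves a 16-point window; a small error grows it again.
print(update_window(16, error=9.0, threshold=4.0))   # -> 8
print(update_window(8, error=1.5, threshold=4.0))    # -> 9
```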
[0044] It should be understood that any reference to an element herein using a designation such as "first," "second," and so forth does not generally limit the quantity or order of those elements. Rather, these designations may be used herein as a convenient method of distinguishing between two or more elements or instances of an element. Thus, a reference to first and second elements does not mean that only two elements may be employed there or that the first element must precede the second element in some manner. Also, unless stated otherwise, a set of elements may include one or more elements.
[0045] A person/one having ordinary skill in the art would understand that information and signals may be represented using any of a variety of different technologies and techniques. For example, data, instructions, commands, information, signals, bits, symbols, and chips that may be referenced throughout the above description may be represented by voltages, currents, electromagnetic waves, magnetic fields or particles, optical fields or particles, or any combination thereof.
[0046] A person/one having ordinary skill in the art would further appreciate that any of the various illustrative logical blocks, modules, processors, means, circuits, and algorithm steps described in connection with the aspects disclosed herein may be implemented as electronic hardware (e.g., a digital implementation, an analog implementation, or a combination of the two, which may be designed using source coding or some other technique), various forms of program or design code incorporating instructions (which may be referred to herein, for convenience, as "software" or a "software module"), or combinations of both. To clearly illustrate this interchangeability of hardware and software, various illustrative components, blocks, modules, circuits, and steps have been described above generally in terms of their functionality. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the overall system. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present disclosure.
[0047] The various illustrative logical blocks, modules, and circuits described in connection with the aspects disclosed herein and in connection with the figures may be implemented within or performed by an integrated circuit (IC), an access terminal, or an access point. The IC may include a general purpose processor, a digital signal processor (DSP), an application specific integrated circuit (ASIC), a field programmable gate array (FPGA) or other programmable logic device, discrete gate or transistor logic, discrete hardware components, electrical components, optical components, mechanical components, or any combination thereof designed to perform the functions described herein, and may execute codes or instructions that reside within the IC, outside of the IC, or both. The logical blocks, modules, and circuits may include antennas and/or transceivers to communicate with various components within the network or within the device. A general purpose processor may be a microprocessor, but in the alternative, the processor may be any conventional processor, controller, microcontroller, or state machine. A processor may also be implemented as a combination of computing devices, e.g., a combination of a DSP and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a DSP core, or any other such configuration. The functionality of the modules may be implemented in some other manner as taught herein. The functionality described herein (e.g., with regard to one or more of the accompanying figures) may correspond in some aspects to similarly designated "means for" functionality in the appended claims.
[0048] If implemented in software, the functions may be stored on or transmitted over as one or more instructions or code on a computer-readable medium. The steps of a method or algorithm disclosed herein may be implemented in a processor-executable software module which may reside on a computer-readable medium. Computer-readable media includes both computer storage media and communication media including any medium that can be enabled to transfer a computer program from one place to another. A storage medium may be any available medium that may be accessed by a computer. By way of example, and not limitation, such computer-readable media may include RAM, ROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium that may be used to store desired program code in the form of instructions or data structures and that may be accessed by a computer. Also, any connection can be properly termed a computer-readable medium. Disk and disc, as used herein, include compact disc (CD), laser disc, optical disc, digital versatile disc (DVD), floppy disk, and Blu-ray disc, where disks usually reproduce data magnetically, while discs reproduce data optically with lasers. Combinations of the above should also be included within the scope of computer-readable media. Additionally, the operations of a method or algorithm may reside as one or any combination or set of codes and instructions on a machine-readable medium and computer-readable medium, which may be incorporated into a computer program product.
[0049] It is understood that any specific order or hierarchy of steps in any disclosed process is an example of a sample approach. Based upon design preferences, it is understood that the specific order or hierarchy of steps in the processes may be rearranged while remaining within the scope of the present disclosure. The accompanying method claims present elements of the various steps in a sample order, and are not meant to be limited to the specific order or hierarchy presented.
[0050] Various modifications to the implementations described in this disclosure may be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other implementations without departing from the spirit or scope of this disclosure. Thus, the disclosure is not intended to be limited to the implementations shown herein, but is to be accorded the widest scope consistent with the claims, the principles and the novel features disclosed herein. The word "exemplary" is used exclusively herein to mean "serving as an example, instance, or illustration." Any implementation described herein as "exemplary" is not necessarily to be construed as preferred or advantageous over other implementations.
[0051] Certain features that are described in this specification in the context of separate implementations also can be implemented in combination in a single implementation. Conversely, various features that are described in the context of a single implementation also can be implemented in multiple implementations separately or in any suitable sub-combination. Moreover, although features may be described above as acting in certain combinations and even initially claimed as such, one or more features from a claimed combination can in some cases be excised from the combination, and the claimed combination may be directed to a sub-combination or variation of a sub-combination.
[0052] Similarly, while operations are depicted in the drawings in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order, or that all illustrated operations be performed, to achieve desirable results. In certain circumstances, multitasking and parallel processing may be advantageous. Moreover, the separation of various system components in the implementations described above should not be understood as requiring such separation in all implementations, and it should be understood that the described program components and systems can generally be integrated together in a single software product or packaged into multiple software products. Additionally, other implementations are within the scope of the following claims. In some cases, the actions recited in the claims can be performed in a different order and still achieve desirable results.

Claims

WHAT IS CLAIMED IS:
1. A method of correcting the position of a touch input, comprising:
receiving a plurality of raw data points corresponding to a plurality of touch events;
determining a set of fitted data points based on a linear fitting method, the plurality of raw data points, and a window size, the window size defining the number of the plurality of raw data points used in the linear fitting method;
determining an error between at least one of the fitted data points and its corresponding raw data point; and
updating the window size based on the error.
2. The method of claim 1, further comprising:
receiving a second plurality of raw data points corresponding to a second touch input; and
determining a second set of fitted data points based on the linear fitting method, the second plurality of raw data points, and the updated window size.
3. The method of claim 1, wherein updating the window size comprises reducing the window size if the error is greater than an error threshold and increasing the window size if the error is less than the error threshold.
4. The method of claim 1, wherein increasing the window size comprises adding a step size to the window size.
5. The method of claim 1, wherein decreasing the window size comprises updating the window size to be a percentage of the window size.
6. The method of claim 1, wherein the plurality of raw data points represent a sequence of touch inputs across a touch screen.
7. The method of claim 1, wherein each of the plurality of data points comprises an x and y value.
8. The method of claim 1, further comprising determining a touch input position of one of the plurality of touch events based on the set of fitted data points.
9. An apparatus for correcting the position of a touch input, comprising:
a processor;
a touch device;
a memory, operably connected to the processor, and configured to store processor instructions that configure the processor to:
receive a plurality of raw data points corresponding to a plurality of touch events on the touch device,
determine a set of fitted data points based on a linear fitting method, the plurality of raw data points, and a window size, the window size defining the number of the plurality of raw data points used in the linear fitting method,
determine an error between at least one of the fitted data points and its corresponding raw data point, and
update the window size by reducing the window size if the error is greater than an error threshold and increasing the window size if the error is less than the error threshold.
10. The apparatus of claim 9, wherein the memory stores instructions that further configure the processor to receive a second plurality of raw data points corresponding to a second touch input, and determine a second set of fitted data points based on the linear fitting method, the second plurality of raw data points, and the updated window size.
11. The apparatus of claim 9, wherein the memory stores instructions that further configure the processor to update the window size by reducing the window size if the error is greater than an error threshold and increasing the window size if the error is less than the error threshold.
12. The apparatus of claim 9, wherein the memory stores instructions that further configure the processor to increase the window size by adding a step size to the window size.
13. The apparatus of claim 9, wherein the memory stores instructions that further configure the processor to decrease the window size by updating the window size to be a percentage of the window size.
14. The apparatus of claim 9, wherein the plurality of raw data points represent a sequence of touch inputs across a touch screen.
15. The apparatus of claim 9, wherein each of the plurality of data points comprises an x and y value.
16. The apparatus of claim 9, wherein the memory stores instructions that further configure the processor to determine a touch input position of one of the plurality of touch events based on the set of fitted data points.
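
The claims above describe an adaptive smoothing scheme: fit a line to the most recent raw touch samples, compare the fitted point with its corresponding raw point, and shrink or grow the fitting window according to that error. The following Python sketch is illustrative only and is not the claimed implementation; the constants and helper names (ERROR_THRESHOLD, STEP_SIZE, SHRINK_FACTOR, fit_window, update_window_size, correct_positions) are assumptions chosen for the example rather than values taken from this disclosure.

```python
import numpy as np

# Assumed tuning constants for illustration only (not from the disclosure).
ERROR_THRESHOLD = 2.0   # pixels; error above this shrinks the window
STEP_SIZE = 1           # samples added when the fit is accurate (cf. claim 4)
SHRINK_FACTOR = 0.5     # window reduced to a percentage of itself (cf. claim 5)
MIN_WINDOW = 3
MAX_WINDOW = 32

def fit_window(raw_points, window_size):
    """Least-squares line fit over the last `window_size` raw (x, y) samples;
    returns the fitted point corresponding to the newest raw sample."""
    pts = np.asarray(raw_points[-window_size:], dtype=float)
    t = np.arange(len(pts))
    # Fit x(t) and y(t) independently with first-order polynomials.
    fitted_x = np.polyval(np.polyfit(t, pts[:, 0], 1), t[-1])
    fitted_y = np.polyval(np.polyfit(t, pts[:, 1], 1), t[-1])
    return np.array([fitted_x, fitted_y])

def update_window_size(window_size, error):
    """Shrink the window when the error exceeds the threshold, grow it otherwise."""
    if error > ERROR_THRESHOLD:
        return max(MIN_WINDOW, int(window_size * SHRINK_FACTOR))
    return min(MAX_WINDOW, window_size + STEP_SIZE)

def correct_positions(raw_points, window_size=8):
    """Return fitted touch positions for a stream of raw samples while adapting
    the window size from the per-sample fitting error."""
    corrected = []
    for i, raw in enumerate(raw_points):
        raw = np.asarray(raw, dtype=float)
        if i + 1 < MIN_WINDOW:
            corrected.append(raw)          # not enough history to fit a line yet
            continue
        w = min(window_size, i + 1)
        fitted = fit_window(raw_points[: i + 1], w)
        error = float(np.linalg.norm(fitted - raw))
        window_size = update_window_size(window_size, error)
        corrected.append(fitted)
    return corrected

# Example: smooth a noisy diagonal swipe across the screen.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    swipe = [(5.0 * t + rng.normal(0, 1.5), 3.0 * t + rng.normal(0, 1.5))
             for t in range(40)]
    print(correct_positions(swipe)[-1])
```

In this sketch, shrinking the window by a fixed percentage lets the filter react quickly when the trace deviates from a straight line, while growing it by a small step size restores heavier smoothing once the raw samples are again well modeled by the linear fit.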
PCT/CN2014/073180 2014-03-11 2014-03-11 System and method for correcting the position of a touch input WO2015135123A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/CN2014/073180 WO2015135123A1 (en) 2014-03-11 2014-03-11 System and method for correcting the position of a touch input
PCT/CN2014/091363 WO2015135336A1 (en) 2014-03-11 2014-11-18 Systems and methods for improved touch sensor linearity

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2014/073180 WO2015135123A1 (en) 2014-03-11 2014-03-11 System and method for correcting the position of a touch input

Publications (1)

Publication Number Publication Date
WO2015135123A1 true WO2015135123A1 (en) 2015-09-17

Family

ID=54070763

Family Applications (2)

Application Number Title Priority Date Filing Date
PCT/CN2014/073180 WO2015135123A1 (en) 2014-03-11 2014-03-11 System and method for correcting the position of a touch input
PCT/CN2014/091363 WO2015135336A1 (en) 2014-03-11 2014-11-18 Systems and methods for improved touch sensor linearity

Family Applications After (1)

Application Number Title Priority Date Filing Date
PCT/CN2014/091363 WO2015135336A1 (en) 2014-03-11 2014-11-18 Systems and methods for improved touch sensor linearity

Country Status (1)

Country Link
WO (2) WO2015135123A1 (en)

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8049740B2 (en) * 2007-10-26 2011-11-01 Tyco Electronics Corporation Method and apparatus for laplace constrained touchscreen calibration
CN101980107A (en) * 2010-10-20 2011-02-23 陆钰明 Method for realizing gesture code based on straight basic gesture
TWI460626B (en) * 2011-12-23 2014-11-11 Cando Corp Ltd Touch control electronic device and calibration method of trajectory

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0854986A (en) * 1994-08-15 1996-02-27 Casio Comput Co Ltd Display device with touch panel
US20080170046A1 (en) * 2007-01-16 2008-07-17 N-Trig Ltd. System and method for calibration of a capacitive touch digitizer system
CN101907966A (en) * 2010-07-29 2010-12-08 薛家祥 Touch screen correction method and touch screen human-computer interface system of digitized arc welding power supply
US20120131515A1 (en) * 2010-11-22 2012-05-24 Amx, Llc Method and apparatus of error correction in resistive touch panels
CN102156579A (en) * 2011-03-31 2011-08-17 华为终端有限公司 Touch screen coordinates calibration method, device thereof and terminal device
CN103246418A (en) * 2012-02-10 2013-08-14 三星电子株式会社 Apparatus and method for compensating touch error in electronic device with touch screen
US20130222247A1 (en) * 2012-02-29 2013-08-29 Eric Liu Virtual keyboard adjustment based on user input offset
CN103105975A (en) * 2013-02-26 2013-05-15 华为终端有限公司 Touch identification method and device

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
WANG XIN ET AL.: "Data Rejection in Linear Fitting Using Origin", JOURNAL OF SHANXI TEACHER'S UNIVERSITY NATURAL SCIENCE EDITION, vol. 17, no. 3, 31 March 2003 (2003-03-31), ISSN: 1009-4490 *

Also Published As

Publication number Publication date
WO2015135336A1 (en) 2015-09-17

Similar Documents

Publication Publication Date Title
US20150242115A1 (en) Systems and methods for improved signal to noise ratio in touch systems
EP2825944B1 (en) Touch screen hover input handling
US20150242053A1 (en) Systems and methods for improved touch screen accuracy
KR20180081133A (en) Rapid screen segmentation method and apparatus, electronic device, display interface, and storage medium
AU2017203910B2 (en) Glove touch detection
CN103164067B (en) Judge the method and the electronic equipment that touch input
JP5659254B2 (en) Input operation receiving device and threshold adjustment method
JP2019508759A5 (en)
EP2929423A1 (en) Multi-touch symbol recognition
US9983731B2 (en) System and method for reducing shadow effects in touch systems
US9588607B2 (en) Method for improving touch recognition and electronic device thereof
US9213457B2 (en) Driving method for touch panel and touch control system
CN104750292A (en) Touch device and touch mode switching method thereof
WO2015135123A1 (en) System and method for correcting the position of a touch input
WO2012115647A1 (en) Key input error reduction
CN106484285B (en) A kind of display methods and mobile terminal of mobile terminal
CN104007850A (en) Signal processing method and electronic device
EP3317754B1 (en) Position-filtering for land-lift events
KR20150060475A (en) Method and apparatus for controlling an input on a touch-screen

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 14885429

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 14885429

Country of ref document: EP

Kind code of ref document: A1