WO2018216760A1 - Touch controller, host device and method - Google Patents

Touch controller, host device and method

Info

Publication number
WO2018216760A1
WO2018216760A1 (PCT/JP2018/019961)
Authority
WO
WIPO (PCT)
Prior art keywords
force
touch
data
touch detection
detection surface
Prior art date
Application number
PCT/JP2018/019961
Other languages
English (en)
Japanese (ja)
Inventor
学雍 楊
哲夫 種村
Original Assignee
シナプティクス・ジャパン合同会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by シナプティクス・ジャパン合同会社 filed Critical シナプティクス・ジャパン合同会社
Publication of WO2018216760A1 publication Critical patent/WO2018216760A1/fr

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03: Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/033: Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F 3/0354: Pointing devices displaced or positioned by the user, with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
    • G06F 3/041: Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F 3/044: Digitisers characterised by capacitive transducing means
    • G06F 3/0447: Position sensing using the local deformation of sensor cells
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487: Interaction techniques based on GUIs using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488: Interaction techniques based on GUIs using a touch-screen or digitiser, e.g. input of commands through traced gestures

Definitions

  • the present disclosure relates to a touch controller, a display system, and a host device, and more particularly to touch detection that senses contact of an object with a touch detection surface (for example, the surface of a touch panel).
  • a display module that combines a display panel and a touch panel is one of the most widely used user interfaces. Combining a touch panel, which is an input device, with a display panel, which is an output device, realizes a highly convenient user interface.
  • a typical touch panel is configured to detect contact of an object, such as a human finger, with the touch detection surface. In recent years, technology has been developed that senses not only the contact itself but also the pressure that the contact exerts on the touch detection surface. Sensing the pressure acting on the touch detection surface is well suited to providing an advanced user interface.
  • US Patent Application Publication No. 2016/0334917 discloses such a technique.
  • in such a technique, a normal force acting in the direction perpendicular to the touch detection surface is detected, and an operation based on the magnitude of the detected normal force is performed. If information on the shear force acting in the plane of the touch detection surface could also be acquired and an operation performed in response to it, an even more advanced user interface could be provided.
  • US Patent Application Publication No. 2016/0077638 discloses detection of shear force using a capacitive sensor.
  • in one embodiment, the touch controller includes an analog front end that obtains a detection signal from a touch detection device having a touch detection surface, and an arithmetic unit configured to detect, based on the detection signal, the position of the force application point at which a force acts on the touch detection surface and the force that the object applies to the touch detection surface at the force application point.
  • the touch controller is set to the push-in detection state when the magnitude of the force acting on the touch detection surface exceeds a predetermined threshold.
  • the arithmetic unit is configured to generate, when the touch controller is in the push-in detection state, direction data corresponding to the direction of the shear force acting on the touch detection surface, in accordance with the change in the position of the force application point.
  • in one embodiment, the host device includes an interface for receiving touch detection data generated based on a detection signal acquired from a touch detection device having a touch detection surface, and a processor for performing user interface processing in response to the touch detection data and generating image data corresponding to an image displayed on the display module.
  • the touch detection data includes action force data indicating the position of the force application point at which the object acts on the touch detection surface through contact and the magnitude of the force acting at the force application point.
  • the processor is configured to generate direction data indicating the direction of the shear force acting on the touch detection surface, in accordance with the change in the position of the force application point described in the action force data, and to generate image data in response to the action force data and the direction data.
  • in one embodiment, the method includes: obtaining a detection signal from a touch detection device having a touch detection surface; generating, based on the detection signal, action force data indicating the position of the force application point at which a force acts on the touch detection surface and the force that the object applies to the touch detection surface at the force application point; setting an arithmetic unit to a push-in detection state in response to the magnitude of the force that the object applies to the touch detection surface exceeding a predetermined threshold; and generating direction data corresponding to the direction of the shear force acting on the touch detection surface in accordance with a change in the position of the force application point while the arithmetic unit is in the push-in detection state.
  • FIG. 1 is a block diagram showing the configuration of the display system in one embodiment. FIG. 2 illustrates the configuration of the display panel in the display area. FIG. 3 shows the configuration of the touch panel in this embodiment. FIG. 4 conceptually shows touch detection in this embodiment. FIG. 5 explains the calculation of the x coordinate of the force application point. FIG. 6 shows the state transitions of the touch controller in the touch detection of this embodiment. A further figure shows an example of operation.
  • the display system 100 includes a display module 1, a driver IC 2 with a built-in touch controller, and a host device 3.
  • the display module 1 includes a display panel 11 and a touch panel 12.
  • the display panel 11 includes a display area 13 and a scan driver circuit 14.
  • the display area 13 is an area where an image is displayed. As shown in FIG. 2, the display area 13 is provided with scanning lines 15, data lines 16, and pixel circuits 17.
  • the scan driver circuit 14 drives the scanning lines 15 provided in the display area 13.
  • the scan driver circuit 14 may be integrated on the display panel 11 using SOG (system-on-glass) technology.
  • the touch panel 12 is a touch detection device having a touch detection surface. When an object touches the touch detection surface, the touch panel 12 detects not only the fact of contact but also the force acting on the touch detection surface. As shown in FIG. 3, the touch panel 12 includes detection capacitors 18 arranged in a matrix. The detection capacitors 18 are arranged close to the touch detection surface 12a, and each detection capacitor 18 is configured such that its capacitance changes when a force is applied to the touch detection surface 12a at the position near that detection capacitor 18. In one embodiment, each detection capacitor 18 has a pair of capacitor electrodes, and the distance between the capacitor electrodes changes when a force is applied to the nearby position on the touch detection surface 12a, so that the capacitance changes. As described later, in the present embodiment, the force acting on the touch detection surface 12a is detected from the capacitance of each detection capacitor 18.
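The capacitive detection principle just described can be sketched with the parallel-plate capacitor formula. This is an illustrative physics sketch, not circuitry from the patent; the function name and values are assumptions:

```python
def plate_capacitance(eps: float, area: float, gap: float) -> float:
    """Parallel-plate capacitance C = eps * area / gap.

    A force applied to the touch detection surface compresses the gap
    between the pair of capacitor electrodes, so the capacitance rises:
    this is the change that each detection capacitor 18 reports.
    """
    return eps * area / gap
```

For example, halving the electrode gap doubles the capacitance, which is why a harder press produces a larger signal.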
  • An xy orthogonal coordinate system is defined on the touch detection surface 12a of the touch panel 12.
  • the x axis is defined in the horizontal direction of the touch panel 12
  • the y axis is defined in the vertical direction.
  • the position on the touch detection surface 12a of the touch panel 12 can be specified by the x coordinate and the y coordinate.
  • although FIG. 1 shows the display module 1 in which the display panel 11 and the touch panel 12, formed separately, are combined, the display panel 11 and the touch panel 12 may be formed integrally.
  • the display module 1 in which the detection capacitor 18 is integrated on the display panel 11 may be used.
  • the driver IC 2 with a built-in touch controller is a semiconductor device configured to drive the display panel 11 so as to display an image in the display area 13 of the display panel 11, and to perform touch detection based on detection signals obtained from the detection capacitors 18 of the touch panel 12.
  • hereinafter, the driver IC 2 with a built-in touch controller is simply referred to as "driver IC 2".
  • the driver IC 2 includes a display driver 21 and a touch controller 22.
  • the display driver 21 and the touch controller 22 are monolithically integrated, that is, integrated on the same semiconductor chip.
  • the display driver 21 and the touch controller 22 may be integrated on separate semiconductor chips.
  • the display driver 21 includes a data driver circuit 23 and a panel interface circuit 24.
  • the data driver circuit 23 drives the data lines 16 of the display panel 11 in response to the image data received from the host device 3.
  • the panel interface circuit 24 generates a scan control signal for controlling the scan driver circuit 14 of the display panel 11 and supplies the scan control signal to the scan driver circuit 14.
  • the touch controller 22 includes an analog front end 25 and an arithmetic unit 26.
  • the analog front end 25 acquires an analog detection signal from the detection capacitor 18 of the touch panel 12 and performs analog-digital conversion on the acquired analog detection signal to generate ADC data that is digital data.
  • the ADC data includes capacitance data indicating the capacitance of each detection capacitor 18.
  • the generated ADC data is supplied to the arithmetic unit 26.
  • the arithmetic unit 26 performs arithmetic processing for touch detection on the ADC data received from the analog front end 25, and generates touch detection data indicating the result of the touch detection.
  • the arithmetic unit 26 may be configured as an MCU (micro control unit).
  • the touch detection data generated by the arithmetic unit 26 is transmitted to the host device 3.
  • the host device 3 supplies image data to the display driver 21 of the driver IC 2 and performs user interface processing based on the touch detection data received from the touch controller 22.
  • the user interface process includes, for example, a process for recognizing an operation performed by the user on the touch panel 12 and a process for generating an image to be presented to the user on the display panel 11.
  • the host device 3 includes a processor 31, a storage device 32, and an interface 33.
  • the processor 31 executes control software 34 stored in the storage device 32, and performs various operations for controlling the display system 100, for example, generation of image data supplied to the display driver 21.
  • the storage device 32 stores control software 34.
  • the control software 34 includes a UI control module 34a, and user interface processing is realized by the processor 31 executing the UI control module 34a.
  • the interface 33 transmits and receives data between the driver IC 2 and the host device 3. Specifically, the interface 33 transmits image data generated by the processor 31 to the driver IC 2 and receives touch detection data from the driver IC 2.
  • the display system 100 is configured to perform touch detection that senses contact of an object 4, for example a human finger, with the touch detection surface 12a of the touch panel 12.
  • in the touch detection, the magnitude of the force that the object 4 applies by touching the touch detection surface 12a, and the coordinates of the force application point, which is the point on the touch detection surface 12a at which the force acts, are detected based on the capacitance data included in the ADC data generated by the analog front end 25.
  • the coordinates of the force application point may be detected by function fitting. As shown in FIG. 5, in one embodiment, the x coordinate of the force application point may be calculated by performing function fitting on the capacitance data of a predetermined number of adjacent detection capacitors 18, including the detection capacitor 18 whose capacitance data value is the maximum.
  • FIG. 5 illustrates a technique for calculating the x coordinate of the force application point from the (n-1)th, nth, and (n+1)th detection capacitors 18 from the left in a specific row of the detection capacitors 18, where the capacitance data value of the nth detection capacitor 18 takes the maximum value.
  • here, A is a constant corresponding to the range of the capacitance data values, and w_x is a constant corresponding to the peak width of the fitting function f(x).
  • the number of detection capacitors 18 used for calculating the x coordinate of the force application point is not limited to three and may be four or more.
  • the x coordinate x_F of the force application point is calculated by performing fitting with the function f(x) using a suitable technique.
  • the y coordinate of the force application point can be calculated by the same method.
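A minimal sketch of such a three-capacitor fit follows. The patent names the constants A and w_x but not the exact form of f(x); here a Gaussian f(x) = A·exp(-(x - x_F)²/w_x²) is assumed, for which the logarithms of the three capacitance values lie on a parabola whose vertex gives x_F in closed form. The function name and the pitch parameter are illustrative:

```python
import math

def force_point_x(c_prev: float, c_max: float, c_next: float,
                  n: int, pitch: float = 1.0) -> float:
    """Estimate the x coordinate x_F of the force application point from
    the capacitance data of capacitors n-1, n, n+1, where capacitor n
    holds the maximum value.  Assumes a Gaussian-shaped peak, so the
    log-values lie on a parabola; the parabola's vertex is x_F.
    """
    y_prev, y0, y_next = math.log(c_prev), math.log(c_max), math.log(c_next)
    denom = y_prev - 2.0 * y0 + y_next
    if denom == 0.0:
        # perfectly flat triple: fall back to the centre capacitor
        return n * pitch
    offset = 0.5 * (y_prev - y_next) / denom   # vertex of the log-parabola
    return (n + offset) * pitch
```

For a noiseless Gaussian peak the recovery is exact; with four or more capacitors, a least-squares fit of f(x) would be used instead.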
  • in the present embodiment, not only information on the normal force F_N acting in the direction perpendicular to the touch detection surface 12a of the touch panel 12 but also information on the shear force F_S acting in the in-plane direction of the touch detection surface 12a is acquired.
  • the analog detection signals obtained from the detection capacitors 18 of the touch panel 12 depend on the capacitance of each detection capacitor 18, and the capacitance of each detection capacitor 18 depends on the normal force F_N. Since touch detection is performed based on the capacitance data included in the ADC data generated from the analog detection signals obtained from the detection capacitors 18, it basically yields information about the normal force F_N.
  • in the present embodiment, information on the shear force F_S is acquired from the change in the coordinates of the force application point. If the coordinates of the force application point change over time, the change generally represents the direction in which the shear force F_S acts, and the force acting on the touch detection surface 12a in this case may be considered to include the shear force F_S. Accordingly, the change in the coordinates of the force application point, and the force acting on the touch detection surface 12a when those coordinates change, are acquired as information about the shear force F_S.
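Since the capacitance data basically yields only the normal force F_N, a toy illustration of recovering F_N from it can be sketched as follows. The linear gain k_f, and the linearity itself, are assumptions for illustration; the patent does not specify this conversion:

```python
def normal_force(cap_delta, k_f: float = 0.01) -> float:
    """Estimate the normal force F_N (newtons) from a 2-D grid of
    per-capacitor capacitance changes, assuming each change is
    proportional to the local force with gain k_f.  Illustrative only.
    """
    return k_f * sum(sum(row) for row in cap_delta)
```

Note that this sum is direction-blind: no arrangement of capacitance values reveals the in-plane shear force F_S, which is why the shear direction is instead derived from the motion of the force application point.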
  • when the object 4, for example a human finger, pushes the touch detection surface 12a and its contact position moves, information on the shear force F_S acting on the touch detection surface 12a is thus obtained.
  • in a state where no object is in contact with the touch panel 12, the touch controller 22 is set to a touch non-detection state.
  • in the touch non-detection state, the arithmetic unit 26 monitors contact of an object with the touch detection surface 12a of the touch panel 12 based on the ADC data received from the analog front end 25. During this time, the touch controller 22 transmits touch detection data indicating that it is in the touch non-detection state to the host device 3.
  • when contact of an object with the touch detection surface 12a is detected, the touch controller 22 shifts to a touch detection state.
  • in the touch detection state, the magnitude F of the force acting while the object 4 is in contact with the touch detection surface 12a and the coordinates (x, y) of the force application point at which the force acts are detected based on the ADC data.
  • the arithmetic unit 26 generates action force data describing the magnitude F of the force and the coordinates (x, y) of the force application point, and transmits touch detection data including the action force data to the host device 3.
  • the touch detection data transmitted to the host device 3 at this time may include data indicating that the touch controller 22 has shifted to the touch detection state.
  • in the illustrated example, the magnitude F is detected as 2.86 N, and the coordinates (x, y) of the force application point at which the force acts are detected as (506, 823).
  • next, detection of an operation of pushing the touch detection surface 12a further is described.
  • when the arithmetic unit 26 detects that the magnitude F of the force acting on the touch detection surface 12a while the touch controller 22 is in the touch detection state exceeds a predetermined threshold Th1, the touch controller 22 shifts to the push-in detection state.
  • in the push-in detection state, the arithmetic unit 26 generates, in response to changes in the coordinates (x, y) of the force application point, information on the shear force F_S acting on the touch detection surface 12a, more specifically, direction data indicating the direction of the shear force F_S, and generates touch detection data including the generated direction data.
  • in one embodiment, touch detection in the push-in detection state is performed as follows.
  • let the coordinates of the force application point at the current time t_i be (x_i, y_i), and the coordinates of the force application point at the time t_{i-1} preceding the current time t_i be (x_{i-1}, y_{i-1}).
  • the current time t_i is a time at which the touch controller 22 is in the push-in detection state; the time t_{i-1}, as long as it precedes the current time t_i, may be a time at which the touch controller 22 was in the push-in detection state or in the touch detection state.
  • the arithmetic unit 26 calculates the change in the position of the force application point, more specifically, the difference (Δx, Δy) between the coordinates (x_i, y_i) of the force application point at the current time t_i and the coordinates (x_{i-1}, y_{i-1}) of the force application point at the time t_{i-1}.
  • ⁇ x x i ⁇ x i ⁇ 1 (2a)
  • ⁇ y y i ⁇ y i ⁇ 1 (2b) It is.
  • the arithmetic unit 26 generates direction data in addition to the action force data describing the magnitude F of the force and the coordinates (x, y) of the force application point.
  • the direction data includes information corresponding to the direction of the shear force F_S and is generated based on the difference (Δx, Δy). In one embodiment, the direction data describes an angle θ calculated as:
  • θ = arctan(Δy / Δx) (3)
  • where arctan is the arc tangent function.
  • alternatively, the direction data may be generated as data describing the difference (Δx, Δy) itself.
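The computation of the difference and the direction angle can be sketched as follows. math.atan2 is used instead of a bare arc tangent so that all four quadrants are distinguished; that choice, like the function name, is an implementation assumption:

```python
import math

def direction_data(p_prev: tuple, p_curr: tuple):
    """Direction data from two successive force-application-point
    coordinates (x_{i-1}, y_{i-1}) and (x_i, y_i).

    Returns the raw difference (dx, dy) and the angle theta in radians.
    """
    dx = p_curr[0] - p_prev[0]        # Eq. (2a)
    dy = p_curr[1] - p_prev[1]        # Eq. (2b)
    theta = math.atan2(dy, dx)        # Eq. (3), quadrant-aware arctan
    return dx, dy, theta
```

Returning the raw difference alongside theta mirrors the two forms of direction data the text allows: the angle, or the difference (Δx, Δy) itself.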
  • the arithmetic unit 26 generates touch detection data including the generated action force data and direction data, and transmits the generated touch detection data to the host device 3.
  • the touch detection data transmitted to the host device 3 at this time may include data indicating that the touch controller 22 has shifted to the push-in detection state.
  • when the touch controller 22 is in the touch detection state rather than the push-in detection state, the arithmetic unit 26 does not generate direction data; it generates only the action force data describing the magnitude F of the force and the coordinates (x, y) of the force application point, and transmits touch detection data that includes the action force data and does not include the direction data to the host device 3.
  • in the illustrated example, the touch controller 22 shifts from the touch detection state to the push-in detection state as follows.
  • in this example, the threshold Th1 is 5.00 N and the threshold Th2 is 15.
  • when the magnitude F of the acting force exceeds the threshold Th1, the touch controller 22 shifts to the push-in detection state.
  • in the push-in detection state, the difference (Δx, Δy) of the coordinates of the force application point is calculated.
  • in this example, the absolute value |Δx| of Δx is larger than the threshold Th2.
  • accordingly, direction data is generated by the arithmetic unit 26 in addition to the action force data, and touch detection data including the generated action force data and direction data is transmitted to the host device 3.
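The state transitions described above (touch non-detection, touch detection, push-in detection, with the example thresholds Th1 = 5.00 N and Th2 = 15) can be sketched as a small state machine. The class and method names are illustrative, not from the patent:

```python
from enum import Enum, auto

class State(Enum):
    NO_TOUCH = auto()   # touch non-detection state
    TOUCH = auto()      # touch detection state
    PUSH_IN = auto()    # push-in detection state

class TouchStateMachine:
    def __init__(self, th1: float = 5.00, th2: float = 15):
        self.th1, self.th2 = th1, th2   # force / coordinate-change thresholds
        self.state = State.NO_TOUCH
        self.prev = None                # previous force-point coordinates

    def update(self, touching: bool, force: float = 0.0, point=None):
        """Feed one frame of touch data; returns (dx, dy) direction data
        when the push-in state sees a coordinate change exceeding th2,
        otherwise None."""
        if not touching:
            self.state, self.prev = State.NO_TOUCH, None
            return None
        if self.state is State.NO_TOUCH:
            self.state = State.TOUCH
        if self.state is State.TOUCH and force > self.th1:
            self.state = State.PUSH_IN
        direction = None
        if self.state is State.PUSH_IN and self.prev is not None:
            dx, dy = point[0] - self.prev[0], point[1] - self.prev[1]
            if abs(dx) > self.th2 or abs(dy) > self.th2:
                direction = (dx, dy)
        self.prev = point
        return direction
```

Feeding the example frames from the text (2.86 N at (506, 823), then a press above Th1, then a sideways slide of more than 15 coordinate units) yields direction data only on the last frame.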
  • the host device 3 receives touch detection data from the touch controller 22 and performs user interface processing based on the touch detection data. As described above, the user interface process is executed by the processor 31 executing the UI control module 34a.
  • when touch detection data including direction data is transmitted to the host device 3, the direction data is used in the user interface processing in the host device 3.
  • one useful application of the direction data included in the touch detection data is moving a pointer displayed in the display area 13 of the display panel 11. Referring to FIG. 8, which illustrates an example of the pointer 13a displayed in the display area 13, when the user performs an operation of pushing the touch detection surface 12a, the pointer 13a is moved in response to the action force data and the direction data. In this way, a highly convenient user interface can be realized.
  • in one embodiment, the moving direction in which the pointer 13a moves is determined according to the direction data, and the moving speed at which the pointer 13a moves may be determined by the magnitude F of the force described in the action force data.
  • in this case, the processor 31 of the host device 3 generates image data such that the image element, for example the pointer 13a, moves in the moving direction determined according to the direction data at the moving speed determined according to the magnitude F of the force described in the action force data, and transmits the image data to the display driver 21 of the driver IC 2.
  • the moving speed v of the pointer 13a may be determined so as to increase monotonically with the magnitude F of the force described in the action force data.
  • for example, the moving speed v of the pointer 13a may be determined to be proportional to the magnitude F of the force.
  • the moving speed v may be determined according to the following formula (4).
  • v = K_V · F (4)
  • where K_V is a constant.
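Putting equation (4) together with the direction data, a pointer-update step might look like the following sketch; the gain k_v, the time-step handling, and the use of an angle theta are assumptions for illustration:

```python
import math

def move_pointer(pos, theta: float, force: float,
                 dt: float, k_v: float = 10.0):
    """Advance the pointer position (x, y) by one frame.

    Speed follows Eq. (4), v = K_V * F, and the heading theta comes
    from the direction data; k_v is an assumed gain (pixels per second
    per newton).
    """
    v = k_v * force                      # Eq. (4)
    return (pos[0] + v * math.cos(theta) * dt,
            pos[1] + v * math.sin(theta) * dt)
```

Because v scales with F, pressing harder moves the pointer faster in the direction of the shear force, which is the behaviour the text describes.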
  • in the embodiment described above, the touch controller 22 detects the operation of pushing the touch detection surface 12a and, when the push operation is detected, generates touch detection data including action force data and direction data.
  • alternatively, the detection of the operation of pushing the touch detection surface 12a and the subsequent generation of direction data may be performed in the user interface processing in the host device 3.
  • in this case, the state transitions illustrated in FIG. 6 are performed in the user interface processing executed in the host device 3. More specifically, the display system 100 operates as follows.
  • while no object is in contact, the touch controller 22 transmits touch detection data indicating that no object is in contact to the host device 3. In this case, the user interface processing is set to a touch non-detection state.
  • when the touch controller 22 detects contact of an object with the touch detection surface 12a of the touch panel 12, it generates action force data describing the magnitude F of the force and the coordinates (x, y) of the force application point, and transmits touch detection data including the action force data to the host device 3. In response to the touch detection data including the action force data, the user interface processing is set to the touch detection state. Thereafter, as long as contact of an object with the touch detection surface 12a of the touch panel 12 is detected, the touch controller 22 continues to transmit touch detection data including action force data to the host device 3.
  • based on the action force data included in the touch detection data, the processor 31 of the host device 3 detects an operation of pushing the touch detection surface 12a after an object, for example a human finger 5, comes into contact with the touch detection surface 12a. Specifically, when the processor 31 detects that the magnitude F of the force acting on the touch detection surface 12a while the user interface processing is in the touch detection state exceeds the predetermined threshold Th1, the user interface processing transitions to the push-in detection state.
  • in the push-in detection state, the processor 31 generates, in response to changes in the coordinates (x, y) of the force application point, information on the shear force F_S acting on the touch detection surface 12a, more specifically, direction data indicating the direction of the shear force F_S.
  • in the push-in detection state, the user interface processing is performed as follows.
  • let the coordinates of the force application point at the current time t_i be (x_i, y_i), and the coordinates of the force application point at the time t_{i-1} preceding the current time t_i be (x_{i-1}, y_{i-1}).
  • the current time t_i is a time at which the user interface processing is in the push-in detection state; the time t_{i-1}, as long as it precedes the current time t_i, may be a time at which the user interface processing was in the push-in detection state or in the touch detection state.
  • the processor 31 calculates the change in the coordinates (x, y) of the force application point, more specifically, the difference (Δx, Δy) between the coordinates (x_i, y_i) of the force application point at the current time t_i and the coordinates (x_{i-1}, y_{i-1}) of the force application point at the time t_{i-1}. As described above, the difference (Δx, Δy) is calculated according to equations (2a) and (2b).
  • if at least one of the absolute values |Δx| and |Δy| exceeds the threshold Th2, the processor 31 generates direction data based on the difference (Δx, Δy).
  • the direction data may be generated as data describing ⁇ calculated according to the above equation (3) from the difference ( ⁇ x, ⁇ y).
  • the direction data may be generated as data describing the difference ( ⁇ x, ⁇ y) itself. As described above, the generated direction data may be used to move the pointer 13a displayed on the display panel 11, for example.

Abstract

A touch controller includes an analog front end for acquiring a detection signal from a touch detection device having a touch detection surface, and an arithmetic unit configured to detect, based on the detection signal, the position of a force application point at which a force acts on the touch detection surface and the force that an object applies to the touch detection surface at the force application point. The touch controller is set to a push-in detection state when the magnitude of the force that the object applies to the touch detection surface exceeds a predetermined threshold. The arithmetic unit is configured such that, when the touch controller is in the push-in detection state, direction data corresponding to the direction of the shear force acting on the touch detection surface is generated in accordance with a change in the position of the force application point.
PCT/JP2018/019961 2017-05-25 2018-05-24 Touch controller, host device and method WO2018216760A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2017-103450 2017-05-25
JP2017103450A JP2018200494A (ja) Touch controller, display system and host device

Publications (1)

Publication Number Publication Date
WO2018216760A1 true WO2018216760A1 (fr) 2018-11-29

Family

ID=64396480

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/019961 WO2018216760A1 (fr) Touch controller, host device and method

Country Status (2)

Country Link
JP (1) JP2018200494A (fr)
WO (1) WO2018216760A1 (fr)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2015135648A (ja) * 2014-01-20 2015-07-27 シャープ株式会社 Input operation device and digital broadcast receiver
WO2016077414A1 (fr) * 2014-11-11 2016-05-19 Qualcomm Incorporated System and methods for controlling a cursor based on finger pressure and direction
US20170083135A1 (en) * 2015-09-18 2017-03-23 Synaptics Incorporated Controlling user interface force

Also Published As

Publication number Publication date
JP2018200494A (ja) 2018-12-20

Similar Documents

Publication Publication Date Title
CN107111400B (zh) Method and device for estimating touch force
CN111630480B (zh) Touch panel device
US9601085B2 (en) Device and method for synchronizing display and touch controller with host polling
US10156938B2 (en) Information processing apparatus, method for controlling the same, and storage medium
US20150193037A1 (en) Input Apparatus
CA2481396A1 (fr) Gesture recognition method and touch system incorporating the same
US10409489B2 (en) Input apparatus
US11422660B2 (en) Input device, input method and program
US20100088595A1 (en) Method of Tracking Touch Inputs
JP2014199493A (ja) Electronic device and method for controlling electronic device
JP2007052639A (ja) Hand gesture detection method for touch panel
JP5524937B2 (ja) Input device including touch pad, and portable computer
CN110799933A (zh) Disambiguating gesture input types using multi-dimensional heatmaps
JP2008165575A (ja) Touch panel device
KR102198596B1 (ko) Technique for disambiguating indirect input
US20130027342A1 (en) Pointed position determination apparatus of touch panel, touch panel apparatus, electronics apparatus including the same, method of determining pointed position on touch panel, and computer program storage medium
KR101714302B1 (ko) Three-dimensional touch sensing apparatus and method
WO2018216760A1 (fr) Touch controller, host device and method
US20010017617A1 (en) Coordinate detection device with improved operability and method of detecting coordinates
CN109284057B (zh) 手持装置及其控制方法
US9134843B2 (en) System and method for distinguishing input objects
US10802650B2 (en) Coordinate input device
US20190113999A1 (en) Touch motion tracking and reporting technique for slow touch movements
JP2020060930A (ja) Input device
KR101546966B1 (ko) Gesture determination method and touch sensing method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18806865

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18806865

Country of ref document: EP

Kind code of ref document: A1