JP5237980B2 - Coordinate input device, coordinate input method, and computer executable program - Google Patents


Info

Publication number
JP5237980B2
Authority
JP
Japan
Prior art keywords
pen
finger
detection signal
detection
input
Prior art date
Legal status
Active
Application number
JP2010048281A
Other languages
Japanese (ja)
Other versions
JP2011186550A (en)
Inventor
良太 野村
正大 築山
Original Assignee
レノボ・シンガポール・プライベート・リミテッド
Priority date
Filing date
Publication date
Application filed by レノボ・シンガポール・プライベート・リミテッド filed Critical レノボ・シンガポール・プライベート・リミテッド
Priority to JP2010048281A
Publication of JP2011186550A
Application granted
Publication of JP5237980B2
Status: Active

Description

  The present invention relates to a coordinate input device, a coordinate input method, and a computer-executable program.

  Conventionally, coordinate input devices using a resistive film method, an electromagnetic induction method, a capacitive coupling method, an ultrasonic method, an infrared method, and the like have been known.

  In an information processing apparatus with a conventional coordinate input device, characters and figures are input by hand with a pen, while operations such as window manipulation and scrolling are performed with a finger; the two cannot be performed at the same time, so the speed and efficiency of operation input cannot be increased.

  FIG. 11 is a timing chart illustrating an example of the timing at which a conventional coordinate input device detects pen input and hand/finger input. As shown in FIG. 11, a conventional coordinate input device turns finger detection off while the pen is in use, to prevent erroneous input by the hand holding the pen.

  To address this problem, Patent Document 1 proposes a method of performing pen input and finger input simultaneously. In Patent Document 1, a pressed-coordinate detection layer whose detection unit areas are smaller than the width of a finger is arranged in the x and y directions on the upper surface of a liquid crystal display unit, and detects every coordinate position pressed by the pen, by the hand holding the pen, or by a finger. An invalid-area (cancellation-area) setting unit sets, in advance, the area pressed by the hand holding the pen as an invalid area whose position is defined relative to the pen coordinate value. An identification determining unit determines the pen coordinate value among the pressed coordinate values calculated by a pressed-coordinate calculating unit, computes the invalid area for that pen coordinate value, determines that coordinate values inside the invalid area other than the pen coordinate value were pressed by the hand holding the pen, and determines that coordinate values outside the invalid area were pressed by a finger for operation input.
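For concreteness, the invalid-area classification of Patent Document 1 can be sketched as follows. This is an assumed rendering for illustration only: the function name, the rectangle representation of the invalid area, and the coordinate values are not taken from the patent.

```python
# Hypothetical sketch of the invalid-area classification in Patent
# Document 1. "invalid_rect" gives the invalid area as offsets
# (dx0, dy0, dx1, dy1) relative to the detected pen coordinate.
def classify_presses(pressed, pen, invalid_rect):
    px, py = pen
    dx0, dy0, dx1, dy1 = invalid_rect
    result = {}
    for (x, y) in pressed:
        if (x, y) == pen:
            result[(x, y)] = "pen"        # the pen coordinate itself
        elif px + dx0 <= x <= px + dx1 and py + dy0 <= y <= py + dy1:
            result[(x, y)] = "hand"       # inside the invalid area: ignored
        else:
            result[(x, y)] = "finger"     # outside: treated as operation input
    return result
```

As the sketch makes visible, a hand that lands outside the preset rectangle is misclassified as a finger, which is exactly the weakness discussed next.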

  However, the method of Patent Document 1 has several problems. If the pen is held in a way not assumed at design time (for example, when the rear end of the pen is held, or when another operation is performed with a finger of the hand holding the pen), the hand may touch the touch surface outside the invalid area, causing erroneous input. Initial settings for right- and left-handed use are required, making setup cumbersome. Finger input is impossible inside the invalid area. Furthermore, if a command is input with the hand holding the pen, there is a risk of erroneous input.

JP 2000-172441 A

  The present invention has been made in view of the above, and an object thereof is to provide a coordinate input device, a coordinate input method, and a computer-executable program that, when pen input and finger input are performed simultaneously, can reduce erroneous input and improve operation efficiency.

  In order to solve the above problems and achieve the object, the present invention provides: finger detection means, provided on a display screen, for detecting coordinates touched by a finger or hand and outputting a first detection signal; pen detection means, provided on the display screen, for detecting coordinates touched by a pen and outputting a second detection signal; and signal processing means for, while the pen is in use, extracting a pointing from the second detection signal and issuing a pointing instruction, and extracting only a predetermined gesture from the first detection signal and issuing a corresponding command.

  According to a preferred aspect of the present invention, the predetermined gesture is desirably at least one of a one-finger single tap, a one-finger double tap, a two-finger double tap, a two-finger swipe, a fine drag with one finger, a finger pinch, and a three-finger tap.

  According to a preferred aspect of the present invention, the pen detection means is desirably an electromagnetic induction type detection device arranged on the lower side of the display screen, and the finger detection means is desirably a capacitance type detection device arranged on the upper side of the display screen.

  In order to solve the above problems and achieve the object, the present invention provides a coordinate input method including: a step of detecting, by finger detection means provided on a display screen, coordinates touched by a finger or hand and outputting a first detection signal; a step of detecting, by pen detection means provided on the display screen, coordinates touched by a pen and outputting a second detection signal; and a step of, while the pen is in use, extracting a pointing from the second detection signal and issuing a pointing instruction, and extracting only a predetermined gesture from the first detection signal and issuing a corresponding command.

  In order to solve the above problems and achieve the object, the present invention also provides a computer-executable program causing a computer to execute: a step of detecting, by finger detection means provided on a display screen, coordinates touched by a finger or hand and outputting a first detection signal; a step of detecting, by pen detection means provided on the display screen, coordinates touched by a pen and outputting a second detection signal; and a step of, while the pen is in use, extracting a pointing from the second detection signal and issuing a pointing instruction, and extracting only a predetermined gesture from the first detection signal and issuing a corresponding command.

  According to the present invention, there are provided: finger detection means, provided on a display screen, for detecting coordinates touched by a finger or hand and outputting a first detection signal; pen detection means, provided on the display screen, for detecting coordinates touched by a pen and outputting a second detection signal; and signal processing means for, while the pen is in use, extracting a pointing from the second detection signal and issuing a pointing instruction, and extracting only a predetermined gesture from the first detection signal and issuing a corresponding command. It is therefore possible to provide a coordinate input device that, when pen input and finger input are performed simultaneously, reduces erroneous input and improves operation efficiency.

FIG. 1-1 is a schematic external view of a tablet PC to which a coordinate input device according to an embodiment of the present invention is applied.
FIG. 1-2 is a schematic external view of the tablet PC to which the coordinate input device according to the embodiment of the present invention is applied.
FIG. 1-3 is a schematic external view of the tablet PC to which the coordinate input device according to the embodiment of the present invention is applied.
FIG. 1-4 is a partial cross-sectional view of the display-side casing of the tablet PC.
FIG. 2 is a schematic diagram showing the configuration of hardware built into the main-body-side casing and the display-side casing of the tablet PC.
FIG. 3 is a timing chart for explaining signal detection by the detection signal processing unit.
FIG. 4 is a diagram for explaining an operation example of a one-finger single tap.
FIG. 5 is a diagram for explaining an operation example of a one-finger double tap.
FIG. 6 is a diagram for explaining an operation example of a two-finger double tap.
FIG. 7 is a diagram for explaining an operation example of a two-finger swipe.
FIG. 8 is a diagram for explaining an operation example of a fine drag with one finger; when a fine drag with one finger is detected, the drawing objects in that range are deleted.
FIG. 9 is a diagram for explaining an operation example of a finger pinch.
FIG. 10 is a diagram for explaining an operation example of a three-finger tap.
FIG. 11 is a timing chart for explaining an example of the timing at which a conventional coordinate input device detects pen input and hand/finger input.

  Embodiments of a coordinate input device, a coordinate input method, and a computer-executable program according to the present invention will be described below in detail with reference to the drawings. Note that the present invention is not limited to these embodiments. In addition, the constituent elements in the following embodiments include those that can be easily conceived by those skilled in the art and those that are substantially the same.

(Embodiment)
FIGS. 1-1 to 1-3 are schematic external views of a tablet PC to which a coordinate input device according to an embodiment of the present invention is applied. FIG. 1-4 is a partial cross-sectional view of the display-side casing of the tablet PC. As illustrated in FIG. 1-1, the tablet PC 1 is a convertible type and includes a main-body-side casing 14 and a display-side casing 15, each of which is substantially a rectangular parallelepiped. The main-body-side casing 14 includes an input unit 10 comprising a keyboard and a slide pad.

  As shown in FIG. 1-4, the display-side casing 15 includes a display device 11, a finger detection unit 21 disposed on the upper surface of the display device 11, and a pen detection unit 22 disposed on the lower surface side of the display device 11. The finger detection unit 21 allows an operator to input information by touching the touch surface with a finger or hand. The pen detection unit 22 allows an operator to input information by touching the touch surface with the pen 30.

  Furthermore, the main-body-side casing 14 and the display-side casing 15 are connected by a connecting portion 13 at the center of their respective ends, and the connecting portion 13 allows the casings to rotate in the direction of opening and closing relative to each other. The display-side casing 15 can also be rotated at least 180 degrees while open with respect to the main-body-side casing 14. In the PC use mode shown in FIG. 1-1, the tablet PC 1 can be used like a normal notebook PC, through operation of the input unit 10 and operation of the touch surface with an indicator. When the connecting portion 13 is rotated as shown in FIG. 1-2 and the display-side casing 15 is folded over the main-body-side casing 14 with the display device 11 facing the front, the tablet PC 1 enters the tablet use mode shown in FIG. 1-3. In the tablet use mode, the touch surface can be operated with an indicator such as the pen 30 or a finger in the same manner as with the input unit 10.

  In the present embodiment, pen input and hand/finger input can be performed simultaneously. When the pen 30 is in use, only the pen 30 is used for pointing input, and finger/hand input is used for command input. Specifically, commands are entered with the free hand using gestures that cannot be produced by the hand holding the pen, which eliminates erroneous input caused by resting the pen-holding hand on the touch surface and improves operation efficiency when pen input and hand/finger input are performed simultaneously. For example, in FIG. 1-3, pointing input is performed with the pen 30 held in the right hand, and a gesture (command) that cannot be input by the right hand holding the pen 30 is input with the left hand.

  FIG. 2 is a schematic diagram illustrating the configuration of hardware built into the main-body-side casing 14 and the display-side casing 15 of the tablet PC 1. As shown in the figure, the tablet PC 1 includes a CPU 101, a ROM 102, a memory 103, an HDD (hard disk) 104, the display device 11, a graphics adapter 105, the coordinate input device 20, the input unit 10, a disk drive 106, and a power supply circuit 108, each unit being connected directly or indirectly through a bus.

  The CPU 101 controls the tablet PC 1 as a whole through the OS 111 stored in the HDD 104 connected via the bus, and controls functions based on various programs stored in the HDD 104. The ROM 102 stores a BIOS (Basic Input/Output System) 102a, data, and the like.

  The memory 103 is composed of a cache memory and a RAM, and is a writable memory used as a read area for the execution programs of the CPU 101 and as a work area into which processing data of those programs is written.

  The HDD (hard disk) 104 stores, for example, the OS 111 (such as a Windows (registered trademark) OS) for controlling the entire tablet PC 1; various drivers 112 for operating peripheral devices (a display driver 112a, a coordinate detection device driver 112b, and so on); an application program (hereinafter, "application") 113 that executes display processing on the display screen of the display device 11 and other processing in accordance with input instructions from the coordinate input device 20; and other applications 114 for specific business purposes.

  The graphics adapter 105 converts display information into a video signal under the control of the CPU 101 and outputs the converted video signal to the display device 11. The display device 11 is a flat-panel display such as a liquid crystal display or an organic EL display, and displays various types of information under the control of the CPU 101.

  The input unit 10 is a user interface through which the user performs input operations, and includes a keyboard composed of various keys for inputting characters, commands, and the like, and a slide pad for moving the on-screen cursor and selecting from various menus.

  The disk drive 106 accepts a disk 107 such as a CD-ROM or DVD, and reads and writes data on the disk 107.

  The power supply circuit 108 includes an AC adapter, an intelligent battery, a charger for charging the intelligent battery, a DC / DC converter, and the like, and supplies power to each device under the control of the CPU 101.

  The coordinate input device 20 is provided on the display screen of the display device 11 and allows an operator to input information such as pointing and commands by touching the touch surface with an indicator such as a finger, a hand, or the pen 30; it detects the coordinate position touched by the indicator and outputs pointing instructions and commands to the CPU 101. Each function of the coordinate input device 20 may instead be realized by a computer executing a program. In the following, functions realized by the CPU 101 executing a program are described with the program as the acting subject.

  The coordinate input device 20 includes the finger detection unit 21, which detects coordinates touched by a finger or hand and outputs a first detection signal; the pen detection unit 22, which detects coordinates touched by the pen and outputs a second detection signal; and a detection signal processing unit 23, which detects pointings and predetermined gestures from the first detection signal and the second detection signal and outputs pointing instructions, and commands corresponding to the predetermined gestures, to the application 113.

  The pen detection unit 22 is an electromagnetic induction type detection device that detects the coordinate position designated by the electromagnetic induction type pen 30. In the pen detection unit 22, loop coils are arranged crossing in the X and Y directions; a current is passed through the loop coils, the induced voltage produced in the LC circuit of the pen 30 is sensed, the coordinate position designated by the pen 30 is detected, and the result is output to the detection signal processing unit 23 as the second detection signal. The pen detection unit 22 also outputs to the detection signal processing unit 23 a pen detection signal indicating whether the pen 30 is close to the tablet PC 1 (for example, "1" when the pen 30 is close and "0" when it is not). Because an induced voltage is sensed whenever the pen 30 is near the pen detection unit 22, the unit can detect whether or not the pen 30 is in a close position.
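As a minimal sketch, the pen detection signal could be derived by thresholding the sensed induced voltage. The threshold value and the names below are assumptions for illustration, not values from the patent.

```python
# Assumed sketch: derive the pen detection signal ("1" = pen close,
# "0" = pen not close) by thresholding the induced voltage sensed in
# the pen's LC circuit. The threshold is illustrative only.
PROXIMITY_THRESHOLD_V = 0.05  # assumed value, in volts

def pen_detection_signal(induced_voltage_v):
    # An induced voltage is only sensed while the pen is near the
    # loop coils, so a simple threshold suffices.
    return 1 if induced_voltage_v >= PROXIMITY_THRESHOLD_V else 0
```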

  The finger detection unit 21 is a capacitance type detection device. The finger detection unit 21 includes transparent line-shaped conductors arranged in the X and Y directions, and detects every coordinate position of an indicator touching the touch surface, including positions pressed by a finger or hand for operation input, positions pressed by the pen 30, and positions pressed by the hand holding the pen 30, and outputs them to the detection signal processing unit 23 as the first detection signal.

  The detection signal processing unit 23 detects pointing inputs and predetermined gesture inputs from the first detection signal input from the finger detection unit 21 and the second detection signal input from the pen detection unit 22, and outputs pointing instructions and commands corresponding to the predetermined gestures to the CPU 101.

  In response to a pointing instruction input from the detection signal processing unit 23, the application 113 displays, on the display device 11, the lines, dots, and the like that constitute characters and figures, thereby processing handwriting input. In response to a command input from the detection signal processing unit 23, the application 113 processes the display, editing, and modification of the characters and figures displayed on the display device 11, such as enlargement/reduction, screen scrolling, and mode switching.

  FIG. 3 is a timing chart for explaining signal detection by the detection signal processing unit 23. During a period in which the pen 30 is not detected (pen detection signal "0"), the detection signal processing unit 23 detects pointings and predetermined gestures from the first detection signal input from the finger detection unit 21, and outputs pointing instructions, and commands corresponding to the predetermined gestures, to the application 113. That is, while no pen is detected, both pointing and the predetermined gestures can be input with the fingers.

  During a period in which the pen 30 is detected (pen detection signal "1"), the detection signal processing unit 23 detects the pointing from the second detection signal input from the pen detection unit 22 and outputs a pointing instruction to the application 113. In addition, during this period the detection signal processing unit 23 invalidates hand/finger pointing input, detects only the predetermined gestures from the first detection signal input from the finger detection unit 21, and outputs the corresponding commands to the application 113. That is, while the pen 30 is detected, only the predetermined gestures can be input with the fingers.
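The gating behaviour of FIG. 3 can be sketched as follows. The event representation and the names are assumptions; only the gating logic follows the description above.

```python
# Sketch of the detection signal processing described above: while the
# pen detection signal is 1, finger pointing is invalidated and only
# predetermined gestures are forwarded; while it is 0, finger pointing
# and gestures are both accepted. Event names are assumed.
def process_event(pen_detected, event):
    kind = event["kind"]  # "pen_point", "finger_point", or "gesture"
    if kind == "pen_point":
        # Pen pointing is only produced while the pen is detected.
        return "pointing" if pen_detected == 1 else None
    if kind == "finger_point":
        # While the pen is in use, finger pointing is invalidated, so a
        # hand resting on the touch surface causes no erroneous input.
        return None if pen_detected == 1 else "pointing"
    if kind == "gesture":
        # Predetermined gestures are accepted in both periods.
        return "command:" + event["name"]
    return None
```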

  Here, the predetermined gestures are, for example, (1) a one-finger single tap, (2) a one-finger double tap, (3) a two-finger double tap, (4) a two-finger swipe, (5) a fine drag with one finger, (6) a finger pinch, and (7) a three-finger tap. The predetermined gestures are defined as inputs that cannot be produced by the hand holding the pen 30, so no erroneous input occurs even if the hand holding the pen 30 touches the surface. That is, a touch by the hand holding the pen 30 is not detected as a predetermined gesture; erroneous input is thus prevented, and operation efficiency can be improved by combining pen input with hand/finger input.

FIGS. 4 to 10 illustrate the relationship between the predetermined gestures and operation examples.
When the detection signal processing unit 23 detects a predetermined gesture, it issues the corresponding command to the application 113, and the application 113 performs processing according to the command.

(1) One-finger single tap
FIG. 4 is a diagram for explaining an operation example of a one-finger single tap. When the detection signal processing unit 23 detects a one-finger single tap, it outputs a menu display ON/OFF command to the application 113. When the menu display ON/OFF command is input, the application 113 toggles the menu display. For example, the user normally keeps the menu hidden to make full use of the screen, performs a one-finger single tap to show the menu when it is needed, and performs another one-finger single tap to hide it when it is no longer needed.

(2) One-finger double tap
FIG. 5 is a diagram for explaining an operation example of a one-finger double tap. When the detection signal processing unit 23 detects a one-finger double tap, it outputs a canvas reduction command to the application 113. When the canvas reduction command is input, the application 113 reduces the canvas (for example, to 50%).

(3) Two-finger double tap
FIG. 6 is a diagram for explaining an operation example of a two-finger double tap. When the detection signal processing unit 23 detects a two-finger double tap, it outputs a canvas enlargement command to the application 113. When the canvas enlargement command is input, the application 113 enlarges the canvas (for example, to 200%). The user can alternate reduction and enlargement by repeating the one-finger double tap and the two-finger double tap.

(4) Two-finger swipe
FIG. 7 is a diagram for explaining an operation example of a two-finger swipe. When the detection signal processing unit 23 detects a two-finger swipe, it outputs a scroll command to the application 113. When the scroll command is input, the application 113 scrolls the canvas in the swipe direction.

(5) Fine drag with one finger
FIG. 8 is a diagram for explaining an operation example of a fine drag with one finger. When the detection signal processing unit 23 detects a fine drag with one finger, it outputs an erase command to the application 113. When the erase command is input, the application 113 deletes the drawing objects in the dragged range.

(6) Finger pinch
FIG. 9 is a diagram for explaining an operation example of a finger pinch. When the detection signal processing unit 23 detects a finger pinch, it outputs an enlargement/reduction command to the application 113. When the enlargement/reduction command is input, the application 113 enlarges or reduces according to the pinch direction. The figure shows enlargement; pinching in the opposite direction reduces.

(7) Three-finger tap
FIG. 10 is a diagram for explaining an operation example of a three-finger tap. When the detection signal processing unit 23 detects a three-finger tap, it outputs a mode switching command to the application 113. When the mode switching command is input, the application 113 switches between a freehand mode (a mode that draws the written object as it is) and a ruler mode (a mode that draws a figure approximating the drawn object). The user can thus switch between the freehand mode and the ruler mode with a three-finger tap.
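The seven bindings above can be collected into a single lookup table. This is a sketch only: the gesture and command identifiers are invented for illustration and do not appear in the patent.

```python
# Assumed mapping from the seven predetermined gestures to the commands
# issued to the application; all identifiers are illustrative.
GESTURE_COMMANDS = {
    "one_finger_single_tap": "toggle_menu",     # (1) menu display ON/OFF
    "one_finger_double_tap": "reduce_canvas",   # (2) e.g. to 50%
    "two_finger_double_tap": "enlarge_canvas",  # (3) e.g. to 200%
    "two_finger_swipe":      "scroll",          # (4) in the swipe direction
    "one_finger_fine_drag":  "erase_range",     # (5) delete objects in range
    "finger_pinch":          "zoom",            # (6) enlarge/reduce
    "three_finger_tap":      "switch_mode",     # (7) freehand <-> ruler
}

def command_for(gesture):
    # Anything not in the table (e.g. a resting hand) yields no command.
    return GESTURE_COMMANDS.get(gesture)
```

Because the hand holding the pen cannot produce any touch pattern in this table, a lookup miss simply yields no command, matching the behaviour described above.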

  As described above, according to the present embodiment, there are provided the finger detection unit 21, which is provided on the display screen, detects coordinates touched by a finger or hand, and outputs the first detection signal; the pen detection unit 22, which detects coordinates touched by the pen 30 and outputs the second detection signal; and the detection signal processing unit 23, which, while the pen 30 is in use, extracts the pointing from the second detection signal and issues a pointing instruction, and extracts only the predetermined gestures from the first detection signal and issues the corresponding commands. When pen input and finger input are performed simultaneously, erroneous input is therefore reduced and operation efficiency is improved. In other words, even when the hand holding the pen touches the touch surface, no pointing input or command input results, so erroneous input is prevented. Since no initial settings for right- or left-handed use are required, user operation is simplified. Moreover, a command can be input with the hand not holding the pen anywhere on the touch surface, even in the vicinity of the pen.

  In the present embodiment, the predetermined gestures are defined as inputs that cannot be produced by the hand holding the pen 30, for example, (1) a one-finger single tap, (2) a one-finger double tap, (3) a two-finger double tap, (4) a two-finger swipe, (5) a fine drag with one finger, (6) a finger pinch, and (7) a three-finger tap, so erroneous input by the hand holding the pen 30 can be eliminated.

  In the above embodiment, the coordinate input device is applied to a tablet PC; however, the present invention is not limited to this and can be applied to any display device capable of both finger detection and pen detection, such as a notebook PC with a touch panel, a slate PC, or an all-in-one PC. Likewise, although the tablet PC is given as an example of an information input device equipped with the coordinate input device according to the present invention, the information input device is not limited to a tablet PC, and the invention can be applied to various information input devices such as a desktop PC, a mobile phone, a PDA, and a digital camera.

  As described above, the coordinate input device, the coordinate input method, and the computer-executable program according to the present invention are useful when pen input and finger input are performed simultaneously.

Explanation of reference numerals
1 Tablet PC
10 Input unit
11 Display device
14 Main-body-side casing
15 Display-side casing
20 Coordinate input device
21 Finger detection unit
22 Pen detection unit
23 Detection signal processing unit
101 CPU
102 ROM
103 Memory
104 HDD (hard disk)
105 Graphics adapter
106 Disk drive
107 Disk
108 Power supply circuit
111 OS
112 Drivers
113 Application
114 Other applications

Claims (5)

  1. A finger detection means provided on a display screen for detecting coordinates touched by a finger / hand and outputting a first detection signal;
    Pen detection means provided on the display screen for detecting coordinates touched by a pen and outputting a second detection signal;
    Pen position detection means for outputting a pen detection signal indicating whether or not the pen is in a position close to the device;
    signal processing means for, when the pen detection signal indicates that the pen is in a position close to the device, extracting a pointing from the second detection signal and issuing a pointing instruction, and extracting only a predetermined gesture from the first detection signal and issuing a corresponding command;
    A coordinate input device comprising the above.
  2.   The coordinate input device according to claim 1, wherein the predetermined gesture is at least one of a one-finger single tap, a one-finger double tap, a two-finger double tap, a two-finger swipe, a fine drag with one finger, a finger pinch, and a three-finger tap.
  3.   The coordinate input device according to claim 1 or 2, wherein the pen detection means is an electromagnetic induction type detection device arranged on the lower side of the display screen, and the finger detection means is a capacitance type detection device arranged on the upper side of the display screen.
  4. A step of detecting a coordinate touched by a finger / hand and outputting a first detection signal by a finger detection unit provided on the display screen;
    A step of detecting a coordinate touched by a pen and outputting a second detection signal by a pen detection unit provided on the display screen;
    A pen position detection step for outputting a pen detection signal indicating whether or not the pen is in a position close to the device;
    a step of, when the pen detection signal indicates that the pen is in a position close to the device, extracting a pointing from the second detection signal and issuing a pointing instruction, and extracting only a predetermined gesture from the first detection signal and issuing a corresponding command;
    A coordinate input method characterized by including the above steps.
  5. A step of detecting a coordinate touched by a finger / hand and outputting a first detection signal by a finger detection unit provided on the display screen;
    A step of detecting a coordinate touched by a pen and outputting a second detection signal by a pen detection unit provided on the display screen;
    A pen position detection step for outputting a pen detection signal indicating whether or not the pen is in a position close to the device;
    a step of, when the pen detection signal indicates that the pen is in a position close to the device, extracting a pointing from the second detection signal and issuing a pointing instruction, and extracting only a predetermined gesture from the first detection signal and issuing a corresponding command;
    A computer-executable program characterized by causing a computer to execute the above steps.
JP2010048281A 2010-03-04 2010-03-04 Coordinate input device, coordinate input method, and computer executable program Active JP5237980B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2010048281A JP5237980B2 (en) 2010-03-04 2010-03-04 Coordinate input device, coordinate input method, and computer executable program


Publications (2)

Publication Number Publication Date
JP2011186550A (en) 2011-09-22
JP5237980B2 (en) 2013-07-17

Family

ID=44792768

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2010048281A Active JP5237980B2 (en) 2010-03-04 2010-03-04 Coordinate input device, coordinate input method, and computer executable program

Country Status (1)

Country Link
JP (1) JP5237980B2 (en)

Families Citing this family (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5284448B2 (en) * 2011-11-25 2013-09-11 株式会社東芝 Information processing apparatus and display control method
JP5778592B2 (en) * 2012-01-30 2015-09-16 株式会社ジャパンディスプレイ Display device, touch detection device, and electronic device
JP5984439B2 (en) 2012-03-12 2016-09-06 キヤノン株式会社 Image display device and image display method
JP6127401B2 (en) * 2012-07-24 2017-05-17 カシオ計算機株式会社 Information processing apparatus, program, and information processing method
CN102819374B (en) * 2012-08-24 2015-07-29 北京壹人壹本信息科技有限公司 The touch control method of electric capacity and electromagnetic double-mode touch screen and hand-held electronic equipment
US9389737B2 (en) 2012-09-14 2016-07-12 Samsung Display Co., Ltd. Display device and method of driving the same in two modes
US9448684B2 (en) * 2012-09-21 2016-09-20 Sharp Laboratories Of America, Inc. Methods, systems and apparatus for setting a digital-marking-device characteristic
WO2014049671A1 (en) * 2012-09-26 2014-04-03 パナソニック株式会社 Display device and pen input erasing method
KR102092132B1 (en) 2012-11-30 2020-04-14 삼성전자주식회사 Electronic apparatus providing hovering input effect and control method thereof
WO2014147724A1 (en) * 2013-03-18 2014-09-25 株式会社 東芝 Electronic device and input method
JP6115221B2 (en) * 2013-03-21 2017-04-19 カシオ計算機株式会社 Information processing apparatus, information processing method, and program
JP5728592B1 (en) * 2013-05-30 2015-06-03 株式会社東芝 Electronic device and handwriting input method
JP6196101B2 (en) * 2013-09-02 2017-09-13 株式会社東芝 Information processing apparatus, method, and program
JP2015072670A (en) * 2013-09-04 2015-04-16 日東電工株式会社 Input device
CN103473012A (en) 2013-09-09 2013-12-25 华为技术有限公司 Screen capturing method, device and terminal equipment
JP5982345B2 (en) * 2013-10-29 2016-08-31 京セラドキュメントソリューションズ株式会社 Display device, electronic device, and computer program
KR101581672B1 (en) * 2014-06-09 2015-12-31 주식회사 더한 Multiple input pad and input system capable of detecting electrostatic touch and induced electromagnetic field
JP2016035706A (en) * 2014-08-04 2016-03-17 パナソニックIpマネジメント株式会社 Display device, display control method and display control program
JP2016035705A (en) * 2014-08-04 2016-03-17 パナソニックIpマネジメント株式会社 Display device, display control method and display control program
JP2016105264A (en) * 2014-11-19 2016-06-09 セイコーエプソン株式会社 Display device, display control method and display system
CN107710112A (en) 2014-12-24 2018-02-16 株式会社施乐库 Coordinate detecting device
US20170052631A1 (en) * 2015-08-20 2017-02-23 Futurewei Technologies, Inc. System and Method for Double Knuckle Touch Screen Control

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2599019B2 (en) * 1990-06-28 1997-04-09 三洋電機株式会社 Pen input device
JP3758865B2 (en) * 1998-12-01 2006-03-22 富士ゼロックス株式会社 Coordinate input device
US8997015B2 (en) * 2006-09-28 2015-03-31 Kyocera Corporation Portable terminal and control method therefor

Also Published As

Publication number Publication date
JP2011186550A (en) 2011-09-22

Similar Documents

Publication Publication Date Title
US9996176B2 (en) Multi-touch uses, gestures, and implementation
US10152948B2 (en) Information display apparatus having at least two touch screens and information display method thereof
US20180314400A1 (en) Pinch Gesture to Navigate Application Layers
US20200110513A1 (en) Swipe-based confirmation for touch sensitive devices
US9671893B2 (en) Information processing device having touch screen with varying sensitivity regions
US10268367B2 (en) Radial menus with bezel gestures
US20180225021A1 (en) Multi-Finger Gestures
US20160062467A1 (en) Touch screen control
US10031604B2 (en) Control method of virtual touchpad and terminal performing the same
JP2014102850A (en) User interface methods providing continuous zoom functionality
US8994674B2 (en) Information viewing apparatus, control program and controlling method
JP5490508B2 (en) Device having touch sensor, tactile sensation presentation method, and tactile sensation presentation program
US8850360B2 (en) Skipping through electronic content on an electronic device
JP5295328B2 (en) User interface device capable of input by screen pad, input processing method and program
US8547244B2 (en) Enhanced visual feedback for touch-sensitive input device
RU2505848C2 (en) Virtual haptic panel
JP6009454B2 (en) Enhanced interpretation of input events that occur when interacting with a computing device that uses the motion of the computing device
CA2788137C (en) Off-screen gestures to create on-screen input
US9292111B2 (en) Gesturing with a multipoint sensing device
KR101844366B1 (en) Apparatus and method for recognizing touch gesture
US6335725B1 (en) Method of partitioning a touch screen for data input
DE60029888T2 (en) Method and device for graphic feedback during time-dependent user input
EP1569075B1 (en) Pointing device for a terminal having a touch screen and method for using same
US8427445B2 (en) Visual expander
TWI443553B (en) Touch input across touch-sensitive display devices

Legal Events

Date Code Title Description
A621 Written request for application examination

Free format text: JAPANESE INTERMEDIATE CODE: A621

Effective date: 20111212

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20121023

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20130118

TRDD Decision of grant or rejection written
A01 Written decision to grant a patent or to grant a registration (utility model)

Free format text: JAPANESE INTERMEDIATE CODE: A01

Effective date: 20130326

A61 First payment of annual fees (during grant procedure)

Free format text: JAPANESE INTERMEDIATE CODE: A61

Effective date: 20130329

R150 Certificate of patent or registration of utility model

Free format text: JAPANESE INTERMEDIATE CODE: R150

Ref document number: 5237980

Country of ref document: JP

Free format text: JAPANESE INTERMEDIATE CODE: R150

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20160405

Year of fee payment: 3

R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250

R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250

R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250

R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250

S111 Request for change of ownership or part of ownership

Free format text: JAPANESE INTERMEDIATE CODE: R313113

R360 Written notification for declining of transfer of rights

Free format text: JAPANESE INTERMEDIATE CODE: R360

R371 Transfer withdrawn

Free format text: JAPANESE INTERMEDIATE CODE: R371

R360 Written notification for declining of transfer of rights

Free format text: JAPANESE INTERMEDIATE CODE: R360

S111 Request for change of ownership or part of ownership

Free format text: JAPANESE INTERMEDIATE CODE: R313113

R350 Written notification of registration of transfer

Free format text: JAPANESE INTERMEDIATE CODE: R350

R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250