CA2855064A1 - Touch input system and input control method - Google Patents


Info

Publication number
CA2855064A1
Authority
CA
Canada
Prior art keywords
touch
input
detected
operations
touch operation
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
CA2855064A
Other languages
French (fr)
Inventor
Kenji Izumi
Yuji SHINOMURA
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shimane Prefecture
Original Assignee
Shimane Prefecture
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shimane Prefecture filed Critical Shimane Prefecture
Priority to CA2855064A priority Critical patent/CA2855064A1/en
Publication of CA2855064A1 publication Critical patent/CA2855064A1/en
Abandoned legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0416Control or interface arrangements specially adapted for digitisers
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/041Indexing scheme relating to G06F3/041 - G06F3/045
    • G06F2203/04104Multi-touch detection in digitiser, i.e. details about the simultaneous detection of a plurality of touching locations, e.g. multiple fingers or pen and finger
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/048Indexing scheme relating to G06F3/048
    • G06F2203/04808Several contacts: gestures triggering a specific function, e.g. scrolling, zooming, right-click, when the user establishes several contacts with the surface simultaneously; e.g. using several fingers or a combination of fingers and pen

Abstract

A touch input system has an operation input device having a sensor and a CPU. The CPU is configured to display a mark as a selection object on a display based on a first touch operation.
The CPU is further configured to determine an operation mode of a series of touch operations including the first touch operation and a second touch operation following it. In this determination, the CPU classifies the touch operations into groups of first touch operations and second touch operations based on a time difference between the detected touch operations.

Description

TITLE OF THE INVENTION
TOUCH INPUT SYSTEM AND INPUT CONTROL METHOD
BACKGROUND OF THE INVENTION
Field of the invention [0001]
The present invention relates to a touch input system and an input control method, and more particularly, to a touch input system and input control method for performing a touch operation with a finger.
Description of the Related Art
[0002]
Many smartphones are provided with a touch panel to enable touch operation by a user, and a click operation or a double-click operation is determined when a finger is released from the touch panel. As a device with a touch panel, for example, an input device is disclosed in Japanese Patent Laid-Open No. 2012-173749.
SUMMARY OF THE INVENTION
[0003]
However, if the click operation or the double-click operation is determined when the finger used for the touch operation is released from the touch panel, operation content may be erroneously determined depending on the situation because the finger may be released by accident.
[0004]
Also, in the input device of Japanese Patent Laid-Open No. 2012-173749, the touch panel is configured integrally with a display unit, and therefore has to be used together with the display unit at all times.
[0005]
In view of the above, it is an object of the present invention to provide a touch input system and input control method, whereby a multi-touch operation can be made using a touch sensor without a display function and an appropriate touch operation can be made based on a user's decision instruction.
[0006]
A touch input system for solving the aforementioned problem includes: a display part; at least one operation input unit for receiving a touch operation, the at least one operation input unit having a sensor; a display control part for displaying a mark as a selection object on the display part based on a first touch operation, the first touch operation being first detected based on a signal from the sensor; an operation mode determination part for determining, when a second touch operation following the first touch operation is detected in a state where the first touch operation is detected, an operation mode of a series of touch operations including the first touch operation and the second touch operation; and an input operation performing part for performing an input process corresponding to the operation mode, wherein the operation mode determination part classifies the touch operations into groups of first touch operations and second touch operations based on a time difference between the detected touch operations, and thereby determines the touch operations.
[0007]
Here, the aforementioned operation mode determination part may determine the operation mode on the basis of a correspondence relationship between the first touch operation and the second touch operation.
[0008]
Alternatively, the aforementioned operation mode may include at least one of a click operation, a double-click operation, a zoom in/out operation, a drag operation, a flip operation, a scroll operation, a swipe operation, and a rotating operation.
[0009]
When the aforementioned operation input unit has two operation input units, display areas on a right side and a left side of the display part may be preliminarily related to each operation input unit, and the display area on the right side and the left side may be set as a selection screen area in the order in which a touch first occurs.
[0010]
An input control method performed by a computer for solving the aforementioned problem includes: displaying a mark as a selection object on a display part based on a first touch operation that is first detected on the basis of a signal from a sensor provided in an operation input unit; determining, when a second touch operation following the first touch operation is detected in a state where the first touch operation is detected, an operation mode of a series of touch operations including the first touch operation and the second touch operation; and performing an input process corresponding to the operation mode, wherein the determining includes classifying the touch operations into groups of first touch operations and second touch operations based on a time difference between the detected touch operations, and thereby determining the touch operations.
[0011]
Further, a program for solving the aforementioned problem is a program for causing a computer to perform the aforementioned input control method.
[0012]
According to the present invention, a multi-touch operation can be made using a touch sensor without a display function and an appropriate touch operation can be made based on a user's decision instruction.
[0013]
Further features of the present invention will become apparent from the following description of exemplary embodiments with reference to the attached drawings.
BRIEF DESCRIPTION OF THE DRAWINGS
[0014]
Fig. 1 is a diagram showing an example of a hardware configuration of a touch input system according to a first embodiment;

Fig. 2A is a diagram showing an example of an appearance of an operation input unit;
Fig. 2B is a diagram showing the example of the appearance of the operation input unit;
Fig. 3A is a diagram showing a usage mode of the operation input unit;
Fig. 3B is a diagram showing a usage mode of the operation input unit;
Fig. 4 is a timing chart showing an example of a first touch operation, a second touch operation and operation images;
Fig. 5 is a diagram showing an example of an operation mode;
Fig. 6A is a diagram showing an example of a touch operation;
Fig. 6B is a diagram showing an example of a touch operation;
Fig. 6C is a diagram showing an example of a touch operation;
Fig. 6D is a diagram showing an example of a touch operation;
Fig. 7 is a diagram showing an example of a functional configuration of the touch input system according to the first embodiment;
Fig. 8 is a flowchart showing an outline of an example of the overall actions of the touch input system in the first embodiment;

Fig. 9 is a flowchart for explaining an input process of the first embodiment in detail;
Fig. 10 is a diagram showing a usage mode in the case where a user performs touch operations with one hand;
Fig. 11 is a diagram for explaining a correspondence relationship between an operation area of a first operation input unit and display areas of a display part in a touch input system of a variation 2;
Fig. 12 is a diagram for explaining a correspondence relationship between an operation area of a second operation input unit and display areas of the display part in the touch input system of the variation 2;
Fig. 13 is a diagram for explaining a correspondence relationship between operation areas of two operation input units and a display area of a display part realized by a touch input system of a variation 3;
Fig. 14 is a diagram showing usage modes of a touch input system of a variation 4, which is adapted to be able to perform a touch operation on keyboard screens displayed on a display device;
Fig. 15 is a diagram for explaining a mode where a user sits on a sofa using the operation input unit; and
Fig. 16 is a diagram for explaining a usage mode of the operation input unit.
DESCRIPTION OF THE EMBODIMENTS
(First Embodiment)
[0015]
Explanation will be hereinafter made for a first embodiment of a touch input system of the present invention.
The touch input system (hereinafter referred to as an input system) 1 is a system that receives a touch operation performed with a finger.
[Configuration of input system 1]
[0016]
Fig. 1 is a diagram showing an example of a hardware configuration of the input system 1 according to the first embodiment of the present invention. As illustrated in Fig. 1, the input system 1 is provided with a main body unit 2 and an operation input unit 3. The main body unit 2 has a CPU (Central Processing Unit) 21, ROM (Read Only Memory) 22, RAM (Random Access Memory) 23, speaker 24, display device 25 and interface 26.
[0017]
The CPU 21 is connected to the respective components through a bus to perform a transfer process of a control signal or data, as well as executing various types of programs for realizing the overall actions of the input system 1 and performing processes such as an arithmetic process.
[0018]
The ROM 22 stores the programs and data necessary for the overall actions of the input system 1. The programs are stored in a recording medium such as a DVD-ROM, and are read into the RAM 23 and executed by the CPU 21, whereby the input system 1 of the present embodiment is realized.
[0019]
The RAM 23 temporarily retains data or a program.
[0020]
The speaker 24 outputs sound corresponding to sound data generated by the CPU 21, such as data on a sound effect.
[0021]
The display device 25 can be a flat panel display such as a liquid crystal display or an EL (Electro-Luminescence) display.
[0022]
The operation input unit 3 connected to the interface 26 is provided with a sensor adapted to, when inputting information, detect a touch operation performed by a user, and the output of the sensor is transmitted to the CPU 21 through the interface 26 and then processed. This embodiment is configured to connect the interface 26 and the operation input unit 3 to each other wirelessly; however, a wired configuration is also possible.
In this embodiment, a user places a finger on the operation input unit 3 to move a corresponding mark such as a cursor on a display screen of the display device 25, and thereby operates the main body unit 2; however, as long as such an operation is available, any system can be employed as a configuration of the operation input unit 3.
[0023]

The operation input unit 3 has, for example, a touch panel and the like. The sensor provided in the operation input unit 3 is, for example, a pressure sensor adapted to detect a user's touch on the operation input unit 3. Alternatively, the sensor of the present embodiment is, for example, a multi-touch sensor, and in such a case, it is not necessary to, like a track pad, separately provide a hardware structure such as a button.
[0024]
As the above-described mark, an icon, a pointer, a cursor, or the like may be employed.
[0025]
Fig. 2 is a diagram showing an example of an appearance of the operation input unit 3, in which A shows a top view of the operation input unit 3, and B shows a side view of the operation input unit 3. The operation input unit 3 illustrated in Fig. 2A is configured to include an operation surface 31 mounted with the above-described sensor, and an outer frame 32, and when a finger touches the operation surface 31, the touch is detected by the sensor.
[0026]
A menu button 33 plays a role in switching between an input operation through a software keyboard and an input operation through a mark such as a cursor. Note that the present invention may be adapted to switch between the input operations with a preset touch operation (such as a touch operation with five fingers).
[0027]

The operation input unit 3 is configured to be user-friendly by, as illustrated in Fig. 2B, providing a tilt so as to make a front side lower and a rear side higher as viewed from a user.
[Outline of touch operation]
[0028]
Next, the outline of the touch operation realized by the input system 1 is described with reference to Figs. 3, 4 and 5. Fig. 3 is a diagram showing an example of usage modes of the touch operation through the operation input unit 3, in which A illustrates the case of performing the touch operation only with the thumbs of both hands 110 and 120, and B illustrates the case of performing the touch operation with the forefinger of the right hand 110 and the forefinger and thumb of the left hand 120. Fig. 4 is a timing chart showing an example of a first touch operation, a second touch operation, and operation images.
Fig. 5 is a diagram showing an example of an operation mode.
[0029]
For example, as illustrated in Fig. 3A or 3B, a user touches the operation input unit 3 with one finger of each of the hands 110 and 120, or with one finger of the right hand 110 and two fingers of the left hand 120, and the sensor of the operation input unit 3 thereby detects the touch operations by the user. For example, Figs. 4A to 4E illustrate an example where a user touches with two fingers (forefinger and thumb) of the right hand, then touches with one finger (thumb) of the left hand (second touch operation) while performing a pinch out motion of spreading the two fingers of the right hand (first touch operation), whereby an operation mode of a screen enlarging action (Fig. 4C) corresponding to the pinch out motion is fixed, and the display screen of the display device 25 is enlarged. In this case, referring to the operation images in Figs. 4D and 4E, the CPU 21 displays respective marks A1 and A2 at positions in a display area 251 of the display device 25, which correspond to the positions (d1 → d3, d2 → d4) at which the respective fingers touch the operation input unit 3.
[0030]
Note that in this embodiment, the marks A1 and A2 are displayed in response to the first touch operation; however, the present invention may be adapted to delete all the marks A1 and A2 collectively at the time of performing the second touch operation.
[0031]
In the input system 1, the CPU 21 determines the operation mode of the series of touch operations including the first touch operation and the second touch operation by, for example, referring to operation mode patterns illustrated in Fig. 5.
These patterns are recorded in the ROM 22, and when a touch operation is performed on the input system 1, the corresponding pattern is read from the ROM 22.
[0032]
As the example in Fig. 5 illustrates, each of the operation modes is preliminarily related to respective operation contents of the first and second touch operations. Note that the first touch operation and the second touch operation are performed with different fingers, and the second touch operation is performed in the case where the first touch operation is performed.
[0033]
In Fig. 5, for example, in the case where the first touch operation is a "single touch" performed with one finger, and the second touch operation is a "single tap", the operation mode is set as a "click operation", whereas in the case where the first touch operation is a "single touch", and the second touch operation is a "double tap", the operation mode is set as a "double-click operation". A tap operation may be performed with one or more fingers.
[0034]
Also, in Fig. 5, in the case where the first touch operation is a multi-touch "pinch out motion or pinch in motion" performed with two fingers, and the second touch operation is a "single touch", the operation mode is set as a "screen enlarging/reducing operation" (see Fig. 4 for the enlarging operation). In the case where the first touch operation is a "single touch", and the second touch operation is a "slide" that moves fingers (regardless of the number of fingers), the operation mode is set as a "drag operation". In the case where the first touch operation is a "single touch", and the second touch operation is a "flip" that moves a finger after a touch and then immediately releases it, the operation mode is set as a "flip operation".
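The correspondence relationships of Fig. 5 can be viewed as a lookup from the pair (first touch operation, second touch operation) to an operation mode. The following is a minimal sketch of such a table; the string keys and function name are illustrative assumptions, not a format prescribed by the patent:

```python
# Correspondence relationships taken from the Fig. 5 examples.
OPERATION_MODES = {
    ("single touch", "single tap"): "click operation",
    ("single touch", "double tap"): "double-click operation",
    ("pinch out/in", "single touch"): "screen enlarging/reducing operation",
    ("single touch", "slide"): "drag operation",
    ("single touch", "flip"): "flip operation",
}

def determine_mode(first_op, second_op):
    """Fix the operation mode once the second touch operation is detected.

    Returns None when no pattern matches the pair of operations.
    """
    return OPERATION_MODES.get((first_op, second_op))
```

For instance, `determine_mode("single touch", "double tap")` would yield the double-click operation, matching the Fig. 5 example.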
[0035]
Note that a correspondence relationship between the first touch operation and the second touch operation is not limited to any of those exemplified in Fig. 5. For the correspondence relationship, various settings are possible. For example, for the operation mode of the "screen enlarging/reducing operation", a correspondence relationship between the first touch operation (single touch) and the second touch operation (pinch out or pinch in motion performed with two fingers) may be set.
[0036]
Also, for the case where the operation mode is the "drag operation", a correspondence relationship may be set between the first touch operation (single touch) and a drag performed, after the second touch operation (single touch or multi-touch), with the finger used for the first touch operation.
[0037]
Further, for the case where the operation mode is the "flip operation", a correspondence relationship may be set between the first touch operation (single touch) and a flip performed, after the second touch operation (single touch or multi-touch), with the finger used for the first touch operation.
[0038]
Fig. 6 is a diagram showing an example of motions corresponding to basic touch operations, in which A exemplifies a touch motion performed on the operation input unit 3 with fingers, B a slide motion that moves two fingers in an arbitrary direction, C a slide motion that moves three fingers leftward, and D a rotational motion that moves two fingers clockwise.
[0039]
Each of Figs. 6A to 6D exemplifies the case where the first touch operation is performed with one or more fingers of the right hand 110, and the second touch operation is performed with the thumb of the left hand 120.
[0040]
Note that the operation surface of the operation input unit 3 is mounted with the sensor, and therefore when a finger touches the operation surface of the operation input unit 3, the sensor detects the touch operation.
[0041]
Fig. 7 is a diagram showing an example of a functional configuration of the main body unit 2 of the input system 1, which is realized in the hardware configuration illustrated in Fig. 1.
[0042]
As shown in Fig. 7, the main body unit 2 of the input system 1 is provided with a storage part 101, display control part 102, display part 103, operation mode determination part 104, and input operation performing part 105.
[0043]
The storage part 101 is configured to include the ROM 22 and the RAM 23 in Fig. 1, and stores data such as data on an operation mode pattern (see Fig. 5).
[0044]
The display control part 102 is made to function by the CPU 21. The display control part 102 displays a mark (such as a cursor) as a selection object on the display part 103 on the basis of a first touch operation that is first detected on the basis of a signal from the sensor 31 of the operation input unit 3.
[0045]
The display part 103 is configured to include the display device 25 in Fig. 1, and provided in order to make a mark viewable to a user.
[0046]
In the case where, in a state where the above-described first touch operation is detected, a second touch operation following the first touch operation is detected, the operation mode determination part 104 determines an operation mode of a series of touch operations including the first touch operation and the second touch operation. As the operation mode, for example, the click operation, double-click operation, screen enlarging/reducing operation, drag operation, flip operation, scroll operation, swipe operation, screen rotating operation, and the like are available. A process for determining an operation mode will be described later in detail.
[0047]
The input operation performing part 105 is intended to perform an input operation corresponding to the operation mode determined by the operation mode determination part 104. In addition, the operation mode determination part 104 and the input operation performing part 105 are both made to function by the CPU 21.
[Actions of input system 1]
[0048]
In the following, the actions of the input system 1 are described with reference to Figs. 1, 4, 5, 8 and 9. Fig. 8 is a flowchart showing an outline of an example of the overall actions of the input system 1.
[0049]
In Fig. 8, first, the CPU 21 displays, for example, a cursor as a mark in the display area 251 of the display device 25 on the basis of a first touch operation through the operation input unit 3 (Step S101). In the example of Fig. 4, the CPU 21 displays the cursors A1 and A2 at the positions in the display area 251, which correspond to the positions d1 and d2 of the two fingers touched by the first touch operation. In this case, the CPU 21 receives signals from the sensor 31 of the operation input unit 3 through the interface 26, and on the basis of the signals, determines the respective positions (d1 → d3, d2 → d4) of the two fingers (contact points of the two fingers in touch with the operation input unit 3). The signals from the sensor 31 include pieces of positional information respectively indicating the positions of the contact points in the touch operation with reference to the operation surface of the operation input unit 3, and the CPU 21 specifies the positions A1 and A2 in the display area of the display device 25 correspondingly to the movements of the contact points, which are detected on the basis of the signals from the sensor 31.
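The mapping from contact points on the operation surface to mark positions in the display area 251 can be sketched as a proportional scaling. The surface and display dimensions below are assumed values for illustration only; the patent does not specify resolutions:

```python
def pad_to_display(contact, pad_size=(100.0, 60.0), display_size=(1920, 1080)):
    """Map a contact point (x, y) on the operation surface to a mark
    position in the display area by proportional scaling.

    pad_size and display_size are assumed dimensions (e.g. millimetres
    and pixels); only the ratio between them matters.
    """
    x, y = contact
    sx = display_size[0] / pad_size[0]
    sy = display_size[1] / pad_size[1]
    return (round(x * sx), round(y * sy))

# As the fingers move d1 -> d3 and d2 -> d4 on the operation surface,
# the marks A1 and A2 follow the correspondingly scaled positions.
a1 = pad_to_display((25.0, 30.0))
```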
[0050]
Note that in Step S101, the CPU 21 functions as the display control part 102.
[0051]
Subsequently, in the case where, in a state where the first touch operation is detected in Step S101, a second touch operation following the first touch operation is detected on the basis of a signal from the sensor 31 (Yes in Step S102), the CPU 21 determines an operation mode of a series of touch operations including the first touch operation and the second touch operation (Step S103). In this case, at the time of detecting the second touch operation, the CPU 21 refers to, for example, the operation mode patterns illustrated in Fig. 5 to determine the operation mode of the series of touch operations including the first touch operation and the second touch operation. In other words, at the time of detecting the second touch operation, the CPU 21 fixes the operation mode.
[0052]
For example, in the example of Fig. 4, the CPU 21 specifies the first touch operation (the pinch out operation after the touch with the two fingers) and the second touch operation (single touch) on the basis of the signals from the sensor 31. Then, the CPU 21 refers to the operation mode patterns (Fig. 5) to determine the operation mode as the "screen enlarging operation" from the correspondence relationship between the two touch operations.
[0053]
In Steps S102 and S103, the CPU 21 functions as the operation mode determination part 104.
[0054]
In addition, a time point of fixing the operation mode may be after a predetermined time has elapsed since the second touch operation was detected.
[0055]
In Fig. 8, in the case where the second touch operation is not detected (No in Step S102), the CPU 21 performs a cursor display process (Step S101) based on the first touch operation.
[0056]
Further, the CPU 21 performs an input operation corresponding to the operation mode (Step S104). In the example of Fig. 4, the CPU 21 determines the operation mode as the screen enlarging operation, and therefore performs a screen enlarging operation according to the pinch out operation.
[0057]
Note that in Step S104, the CPU 21 functions as the input operation performing part 105.
[0058]
Next, with reference to Fig. 9, an input process performed in the input system 1 is more specifically described. Fig. 9 is a flowchart for explaining the input process of the present embodiment in detail.
[0059]
In addition, in Fig. 9, Steps S201, S202, S204 and S206 to S208 correspond to the operation mode determination part 104; Steps S203 and S205 correspond to the display control part 102; and Step S209 corresponds to the input operation performing part 105.
[0060]
Fig. 9 illustrates an example in which Steps S201 to S203 and Steps S204 to S209 are described as two separate group processes, STEP1 and STEP2, respectively.
[0061]
In Fig. 9, first, on the basis of a signal from the sensor 31, the CPU 21 senses a finger first touched (Step S201), and classifies the sensed finger as STEP1 (first touch operation group) (Step S202). In this case, if a plurality of fingers is detected, touch operations performed with the plurality of fingers are all classified as the first touch operation group.
[0062]
Subsequently, on the basis of the touch operation performed with the finger classified as STEP1, the CPU 21 displays a mark on the display device 25 (Step S203). An example of the display is the display of the marks A1 and A2 illustrated in Fig. 4D.
[0063]
After the above-described display of the mark, in the case where, on the basis of a signal from the sensor 31, the CPU 21 detects a finger touched, the CPU 21 classifies the detected finger as STEP2 (second touch operation group) (Step S204).

That is, on the basis of a time difference between when the touch operations were detected, the CPU 21 classifies each touch operation into either a first touch operation group or a second touch operation group, and makes the determination accordingly. The CPU 21 can determine the time difference between the touch operations from the reception times of the sensing signals from the sensor 31. In this case, the time difference used for the classification into the first touch operation group and the second touch operation group is predetermined.
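The grouping into STEP1 and STEP2 by time difference might be implemented as follows. The 0.3-second threshold is an assumed value, since the patent only states that the time difference used for the classification is predetermined:

```python
STEP_THRESHOLD = 0.3  # assumed predetermined time difference, in seconds

def classify_touches(events):
    """Split touch events into the first (STEP1) and second (STEP2)
    touch operation groups by the time difference from the first touch.

    events: list of (timestamp, finger_id) tuples, sorted by timestamp.
    Fingers arriving within the threshold of the first touch join STEP1
    (e.g. both fingers of a pinch); later fingers form STEP2.
    """
    if not events:
        return [], []
    t0 = events[0][0]
    step1 = [f for t, f in events if t - t0 <= STEP_THRESHOLD]
    step2 = [f for t, f in events if t - t0 > STEP_THRESHOLD]
    return step1, step2
```

With this sketch, two right-hand fingers landing 0.05 s apart form the first touch operation group, and a left-hand thumb arriving 0.8 s later forms the second touch operation group.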
[0064]
If, in Step S204, the touch operation by the sensed finger is an operation for which a mark is to be displayed, the CPU 21 displays a mark on the display device 25 on the basis of the touch operation classified as STEP2 (Step S205). On the other hand, in the case where it is not necessary to display a mark for STEP2, the processing step in S205 is not necessary.
[0065]
The CPU 21 determines that, as a result of detecting the touch of the finger in Step S204, the user intends to perform a "decision" motion to fix the content of the touch operation (Step S206), and determines the type of the "decision" motion from the fingers respectively classified as STEP1 and STEP2 (Step S207). When performing Step S207, the corresponding operation mode is extracted from the operation mode patterns exemplified in Fig. 5.
[0066]

After that, the CPU 21 assigns the above "decision" motion to the STEP1 mark (Step S208) and performs a variation process on the STEP1 mark (Step S209). Note that in Step S208, for example, when a "decision" motion indicating a screen enlarging motion is assigned, the CPU 21 moves the STEP1 marks according to the movements, performed for the enlarging motion, of the fingers classified as STEP1 (see the marks A1 and A2 in Fig. 4E).
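The operation-mode lookup of Step S207 might be summarized as in the following sketch. The pattern table here is a hypothetical stand-in for the operation mode patterns of Fig. 5, which are not reproduced in this text; the keying on finger counts is likewise an assumption for illustration.

```python
# Hypothetical stand-in for the operation mode patterns of Fig. 5.
# The actual correspondence between finger combinations and modes is
# defined by the embodiment and is not reproduced here.
OPERATION_MODE_PATTERNS = {
    (1, 1): "click",        # one STEP1 finger, one STEP2 finger
    (2, 1): "zoom in/out",  # two STEP1 fingers, one STEP2 finger
    (1, 2): "rotate",       # one STEP1 finger, two STEP2 fingers
}


def determine_operation_mode(step1_fingers, step2_fingers):
    """Step S207 (sketch): look up the type of the "decision" motion from
    the combination of fingers classified as STEP1 and STEP2."""
    key = (len(step1_fingers), len(step2_fingers))
    return OPERATION_MODE_PATTERNS.get(key)  # None if no pattern matches
```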
[0067]
As described above, according to the input system 1 of the present embodiment, when a user performs a first touch operation and a second touch operation, an operation mode is determined that corresponds to the series of touch operations, which are distinguished as different operations on the basis of the time difference between the touches, and an input operation corresponding to that operation mode is performed. Note that in order for the operation mode to be determined, the user must perform both the first touch operation and the second touch operation. That is, the user must perform the second touch operation with a finger different from that used for the first touch operation.
This is equivalent to the user issuing an instruction to perform an input operation every time a second touch operation is performed, and as a result, a correct input operation is performed.
[0068]
Also, the operation input unit 3 is configured separately and independently from the display device 25. That is, according to the input system 1, a touch sensor without a display function can be used to perform a multi-touch operation.
[0069]
Note that the operation input unit 3 does not have any operation part such as an operation key or an operation button.
This optional feature reduces the manufacturing cost of the input system 1. Also, a user can perform an operation while freely moving the fingers, without using an operation key, an operation button, or the like of the operation input unit 3, so the degree of freedom of the finger motions of one or both hands is increased. That is, the input system 1 also has an aspect that makes operation more intuitive.
[0070]
Next, variations of the input system 1 of the present embodiment are described.
(Variation 1)
[0071]
The above description, with reference to Fig. 4, covers the case where the input operation is performed in correspondence with touch operations by the fingers of both hands.
However, the present invention may instead be adapted so that the touch operations are performed with one hand rather than both hands.
[0072]
Fig. 10 is a diagram showing a usage mode in which a user performs touch operations with one hand. In the example illustrated in Fig. 10, a first touch operation is performed with the forefinger of the right hand, and a second touch operation is performed with the thumb of the right hand. This can be achieved by the processing in Steps S101 to S104 of the flowchart of Fig. 8 and the processing in Steps S201 to S209 of the flowchart of Fig. 9.
[0073]
By performing such touch operations, a series of touch operations can be performed with one hand.
(Variation 2)
[0074]
In the above, a usage mode with two operation input units is not referred to; however, the present invention may be adapted to use two operation input units. In the following, with reference to Figs. 11 and 12, the functions of operation input units 3A and 3B are described. Fig. 11 is a diagram for explaining the correspondence relationship between an operation area 311 of the operation input unit 3A and display areas 241 and 242 of a display device 25. Fig. 12 is a diagram for explaining the correspondence relationship between an operation area 321 of the operation input unit 3B and display areas 243 and 244 of the display device 25.
[0075]
In the example of Fig. 11, the operation area (operation surface) 311 of the operation input unit 3A is related to the first display area 241 and the second display area 242 (also collectively referred to as a "right side display area") of the entire display area of the display device 25. The relation is established according to, for example, a mapping table that converts coordinate data between the respective areas.
Note that the first display area 241 is shared by part of the below-described second display area 244.
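The mapping-table relation between an operation surface and its display areas could be sketched as a simple linear coordinate conversion. All dimensions below are assumptions introduced for illustration, not values from the embodiment.

```python
# Illustrative linear mapping from a point on the operation surface 311
# to a point in the related right-side display area (241 + 242).
# All dimensions are assumed for illustration only.

OP_W, OP_H = 100.0, 60.0         # assumed operation-surface size
DISP_X, DISP_Y = 960.0, 0.0      # assumed origin of the right-side area
DISP_W, DISP_H = 1000.0, 1080.0  # assumed size of the right-side area


def op_to_display(x, y):
    """Convert operation-surface coordinates (x, y) to display coordinates
    by scaling each axis into the related display area."""
    return (DISP_X + x / OP_W * DISP_W,
            DISP_Y + y / OP_H * DISP_H)
```

A real mapping table could precompute such conversions per area; the point is only that each point on the operation surface has a corresponding point in its related display area.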
[0076]
In this input system 1, the total area of the first display area 241 and the second display area 242 is set larger than the area obtained by evenly halving the entire display area along its center line (indicated in the diagram by the boundary line between the first display area 241 and the second display area 242).
[0077]
In the example of Fig. 12, the operation area (operation surface) 321 of the operation input unit 3B is related to the first display area 243 and the second display area 244 (also collectively referred to as a "left side display area") of the entire display area of the display device 25. The relation is established according to, for example, a mapping table that converts coordinate data between the respective areas.
Note that the first display area 243 is shared by part of the above-described second display area 242.
[0078]
In this input system 1, the total area of the first display area 243 and the second display area 244 is set larger than the area obtained by evenly halving the entire display area along its center line (indicated in the diagram by the boundary line between the first display area 243 and the second display area 244).
[0079]
According to the input system of the present variation, the two operation input units 3A and 3B are provided for performing operations on their respective predetermined display areas, and therefore the operation input unit corresponding to the display area on which a touch operation is to be performed is used. In this case, among the display areas 241 to 244 of the display device 25, each of which is preliminarily related to one of the operation input units 3A and 3B, a selection screen area for performing a mark selection operation is set in the order in which a touch first occurs. In this regard, a user can perform the selection operation with either the left or the right hand.
For example, the input system can also accommodate an ambidextrous user.
[0080]
For example, when a first touch is received on the operation input unit 3A, the CPU 21 sets the display areas 241 and 242 as the selection screen area.
(Variation 3)
[0081]
In an input system of a variation 3, the two operation input units 3A and 3B of the variation 2 may be used in combination.
[0082]
Fig. 13 is a diagram showing an example where the two operation input units 3A and 3B are used in combination. In the example of Fig. 13, the operation input units 3A and 3B are combined with each other, and the operation areas of the respective operation input units 3A and 3B are related to a display area 245 of a display device 25.
[0083]
Note that in the input system of this variation, the respective operation areas of the operation input units 3A and 3B are related to the display area so as to have a one-to-one correspondence relationship.
(Variation 4)
[0084]
An input system of a variation 4 is characterized by setting keyboard screens in respective predetermined areas of a display screen 300 and, in order to make it possible to perform a touch operation on each of the keyboard screens, relating the areas of the respective keyboard screens to the operation area of an operation input unit 3 (or the operation areas of respective operation input units 3A and 3B).
[0085]
For example, Fig. 14 exemplifies usage modes of the input system that is adapted to be able to perform a touch operation on the keyboard screens displayed on a display device. In Fig. 14, the input system is configured to arrange the two keyboard screens on the display screen 300 and to allow data within each of the keyboard screens to be selected by, for example, touching a finger to the operation input unit 3.
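Selecting data on a keyboard screen from a touch position could be sketched as below. The key grid and its layout are assumptions for illustration; the embodiment does not specify the keyboard arrangement.

```python
# Illustrative sketch: pick a key on a keyboard screen from a touch on
# the related operation area. The 10x4 alphabetic key grid is an
# assumption for illustration.
KEYS = [[chr(ord('a') + r * 10 + c) if r * 10 + c < 26 else ' '
         for c in range(10)] for r in range(4)]


def key_at(x_ratio, y_ratio):
    """Map a normalized touch position (0..1 on each axis) within the
    operation area to the key at the corresponding grid cell."""
    col = min(int(x_ratio * 10), 9)
    row = min(int(y_ratio * 4), 3)
    return KEYS[row][col]
```

With each operation input unit related to its own keyboard screen, the same conversion can be applied per unit so that a touch on either unit selects data within the corresponding screen.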
[0086]

Also, in the example using the operation input units 3A and 3B, the input system is configured so that a user can touch a finger to each of the operation input units 3A and 3B to select data within the corresponding keyboard screen.
This enables the user to perform a touch operation through each of the operation input units.
[0087]
The input system may be used in various situations. For example, Fig. 15 exemplifies a mode where a user, sitting on a sofa, uses the operation input unit 3. Also, for example, Fig. 16 exemplifies another usage mode of the operation input unit 3.
(Other variations)
[0088]
In the above, as an example, the case of displaying the two marks A1 and A2 corresponding to the touch operations with two fingers is described with reference to Fig. 4; however, the present invention is not limited to this. Even in the case of an input system using a single mark corresponding to a touch operation with one finger, or three or more marks corresponding to touch operations with three or more fingers, the same workings and effects can be produced.
[0089]
While the present invention has been described with reference to exemplary embodiments, it is to be understood that the invention is not limited to the disclosed exemplary embodiments. The scope of the following claims is to be accorded the broadest interpretation so as to encompass all such modifications and equivalent structures and functions.

Claims (6)

WHAT IS CLAIMED IS:
1. A touch input system comprising:
a display part;
at least one operation input unit for receiving a touch operation, the at least one operation input unit having a sensor;
a display control part for displaying a mark as a selection object on the display part based on a first touch operation, the first touch operation being first detected based on a signal from the sensor;
an operation mode determination part for, when a second touch operation following the first touch operation is detected in a state where the first touch operation is detected, determining an operation mode of a series of touch operations including the first touch operation and the second touch operation; and an input operation performing part for performing an input process corresponding to the operation mode, wherein the operation mode determination part classifies the touch operations into either a group of first touch operations or a group of second touch operations based on a time difference between the detected touch operations, and determines the touch operations.
2. The touch input system according to claim 1, wherein the operation mode determination part determines the operation mode based on a correspondence relationship between the first touch operation and the second touch operation.
3. The touch input system according to claim 1 or 2, wherein the operation mode includes at least one of a click operation, a double-click operation, a zoom in/out operation, a drag operation, a flip operation, a scroll operation, a swipe operation, and a rotating operation.
4. The touch input system according to any one of claims 1 to 3, wherein, when the operation input unit comprises two operation input units, display areas on a right side and a left side of the display part are preliminarily related to the respective operation input units, and the display areas on the right side and the left side are set as a selection screen area in the order in which a touch first occurs.
5. An input control method performed by a computer, comprising:
displaying a mark as a selection object on a display part based on a first touch operation that is first detected on a basis of a signal from a sensor provided in an operation input unit;
when, in a state where the first touch operation is detected, a second touch operation following the first touch operation is detected, determining an operation mode of a series of touch operations including the first touch operation and the second touch operation; and performing an input process corresponding to the operation mode, wherein the determining includes classifying the touch operations into either a group of first touch operations or a group of second touch operations based on a time difference between the detected touch operations, and determining the touch operations.
6. A storage medium having a program that, when executed, causes a computer to perform the following steps: displaying a mark as a selection object on a display part based on a first touch operation that is first detected on a basis of a signal from a sensor provided in an operation input unit;
when, in a state where the first touch operation is detected, a second touch operation following the first touch operation is detected, determining an operation mode of a series of touch operations including the first touch operation and the second touch operation; and performing an input process corresponding to the operation mode, wherein the determining includes classifying the touch operations into either a group of first touch operations or a group of second touch operations based on a time difference between the detected touch operations, and determining the touch operations.
CA2855064A 2014-06-25 2014-06-25 Touch input system and input control method Abandoned CA2855064A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CA2855064A CA2855064A1 (en) 2014-06-25 2014-06-25 Touch input system and input control method

Publications (1)

Publication Number Publication Date
CA2855064A1 true CA2855064A1 (en) 2015-12-25

Family

ID=54851648

Family Applications (1)

Application Number Title Priority Date Filing Date
CA2855064A Abandoned CA2855064A1 (en) 2014-06-25 2014-06-25 Touch input system and input control method

Country Status (1)

Country Link
CA (1) CA2855064A1 (en)


Legal Events

Date Code Title Description
EEER Examination request

Effective date: 20190607

FZDE Dead

Effective date: 20200831