WO2010122825A1 - Mobile terminal - Google Patents

Mobile terminal

Info

Publication number
WO2010122825A1
Authority
WO
WIPO (PCT)
Prior art keywords
contact points
movement
unit
control unit
image
Prior art date
Application number
PCT/JP2010/050540
Other languages
French (fr)
Japanese (ja)
Inventor
英樹 根本
託也 千葉
Original Assignee
Toshiba Corporation (株式会社東芝)
Priority date
Filing date
Publication date
Application filed by Toshiba Corporation (株式会社東芝)
Publication of WO2010122825A1

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03: Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/041: Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F 3/0416: Control or interface arrangements specially adapted for digitisers
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device using a touch-screen or digitiser, e.g. input of commands through traced gestures

Definitions

  • The present invention relates to a portable terminal equipped with a pointing device such as a touch pad or a touch panel.
  • A mobile terminal such as a mobile phone is equipped with various input devices for receiving operation instructions from the user.
  • Among these input devices, devices that accept operation instructions through intuitive gestures, such as touch pads and touch panels, are known (see, for example, Japanese Patent Application Laid-Open No. 2008-258805).
  • Such a touch pad or touch panel accepts an operation instruction based on input position information obtained by a touch sensor that senses the change in capacitance or contact pressure caused by contact with the operation surface.
  • When inputting an operation instruction on a touch panel or touch pad, a stylus pen or a finger is generally used.
  • A stylus pen has a very small contact area with the operation surface, and the contact point is unlikely to shift with how the user holds the pen. The touch sensor can therefore detect an input operation that matches the operation instruction the user intended.
  • A finger, by contrast, has a large contact area with the operation surface. Under the influence of subtle variations in finger pressure, the touch sensor may therefore detect input position information dispersed across the finger's contact range.
  • The mobile terminal executes various processes, such as display processing, based on the input position information detected by the touch sensor. However, when that input position information shifts under subtle finger pressure as described above, the terminal may execute processing different from the operation instruction the user intended. The user therefore could not obtain sufficient operability when performing input operations with a finger.
  • The present invention has been made in view of such circumstances, and its object is to provide a portable terminal that can stabilize the movement of an image such as a cursor and perform display processing with higher accuracy.
  • To solve the above problem, a mobile terminal according to the present invention includes: a display unit that displays a predetermined image; an operation unit that receives an instruction to move the image; a detection unit that detects information on contact points on the operation unit that received the instruction; a speed calculation unit that obtains the movement speed between contact points from the information on the plurality of contact points detected consecutively by the detection unit; and a control unit that, when the movement speed calculated by the speed calculation unit is greater than a threshold, performs display control of the image based on contact-point information in which the movement between those contact points is invalidated.
  • A portable terminal according to the present invention also includes: a display unit that displays a predetermined image; an operation unit that receives an instruction to move the image; a detection unit that detects information on contact points on the operation unit that received the instruction; a direction calculation unit that, when the detection unit finishes detecting contact-point information, obtains a plurality of movement directions between the contact points from the information on the plurality of contact points detected consecutively immediately before detection ended; and a control unit that, when the plurality of movement directions calculated by the direction calculation unit do not match, performs display control of the image based on contact-point information in which at least the movement between the contact points whose movement direction changed is invalidated.
  • The mobile terminal according to the present invention can thus stabilize the movement of an image such as a cursor and perform display processing with higher accuracy.
  • Embodiments of the portable terminal according to the present invention will be described with reference to the accompanying drawings. As an example of a portable terminal to which the present invention is applied, a card-shaped terminal on which the user inputs operation instructions by touching the display with a finger is described.
  • FIG. 1 is an external perspective view showing an embodiment of a portable terminal according to the present invention.
  • The mobile terminal 1 includes a rectangular plate-shaped casing 11.
  • A touch panel 14 occupies most of one surface of the casing 11.
  • The touch panel 14 has both the functions of a display unit and a detection unit.
  • The touch panel 14 as a display unit is a display (display 35 in FIG. 2) provided with an area for displaying a display screen composed of characters, images, and the like.
  • This display may be, for example, an LCD (Liquid Crystal Display), an organic EL (electroluminescence) display, or an inorganic EL display.
  • The touch panel 14 as a detection unit is a touch sensor (touch sensor 33 in FIG. 2) that detects contact on the operation surface as input position information.
  • The touch sensor consists of a plurality of elements, arranged on the upper surface of the display, for detecting contact, and a transparent operation surface laminated on top of them.
  • To detect contact on the touch panel 14, a pressure-sensitive method that senses changes in pressure, a capacitive (electrostatic) method that senses electric signals caused by static electricity, or other methods can be applied.
  • The input position information is coordinate information indicating the position where the contact occurred.
  • It is expressed as coordinate values on two axes, for example with the short-side direction of the touch panel 14 as the X axis and the long-side direction as the Y axis.
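To make this coordinate representation concrete, the sketch below models one touch sample as an (x, y) pair on the panel's two axes. This is a minimal Python illustration; the patent prescribes no particular data structure, and all names here are hypothetical. The later sketches in this section reuse this `ContactPoint` type.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class ContactPoint:
    """One touch sample: x along the panel's short side, y along its long side."""
    x: float
    y: float

# A drag gesture is then simply the sequence of samples the sensor reports:
stroke = [ContactPoint(10.0, 40.0), ContactPoint(11.0, 43.0), ContactPoint(11.5, 45.0)]
```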
  • A receiver 15 for outputting sound and a microphone 16 for inputting sound are disposed on the casing 11, facing each other across the touch panel 14 in the longitudinal direction.
  • FIG. 2 is a schematic functional block diagram showing the main functional configuration of the mobile terminal 1 in the present embodiment.
  • The mobile terminal 1 is configured by connecting a main control unit 30, a power supply circuit unit 31, an input control unit 32, a display control unit 34, an audio control unit 36, a communication control unit 37, and a storage unit 39 so that they can communicate with each other via a bus.
  • The main control unit 30 includes a CPU (Central Processing Unit).
  • The main control unit 30 operates based on various programs stored in the storage unit 39 and performs overall control of the mobile terminal 1.
  • The power supply circuit unit 31 includes a power supply source (not shown).
  • The power supply circuit unit 31 switches the power of the mobile terminal 1 ON and OFF in response to a power-on operation.
  • While the power is ON, the power supply circuit unit 31 supplies power from the power supply source to each unit, enabling the mobile terminal 1 to operate.
  • The input control unit 32 includes an input interface for the touch sensor 33.
  • At predetermined intervals (for example, every 7 ms), the input control unit 32 receives the detection signal from the touch sensor 33 as input position information indicating the coordinates of the input position, generates a signal indicating that input, and transmits it to the main control unit 30.
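A minimal sketch of this polling path, assuming a hypothetical `read_sensor()` driver call that returns the current contact sample or None while nothing touches the panel; only the 7 ms period and the hand-off to the main control unit come from the text.

```python
import queue
import time

def input_control_loop(read_sensor, to_main_control: queue.Queue, period_s: float = 0.007):
    """Poll the touch sensor every 7 ms and forward each detected sample
    to the main control unit as input position information."""
    while True:
        sample = read_sensor()           # hypothetical driver call; None while untouched
        if sample is not None:
            to_main_control.put(sample)  # the signal indicating the input, sent over the bus
        time.sleep(period_s)
```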
  • The display control unit 34 includes a display interface for the display 35.
  • Under the control of the main control unit 30, the display control unit 34 causes the display 35 to display images based on document data and image signals.
  • Under the control of the main control unit 30, the audio control unit 36 generates an analog audio signal from the sound collected by the microphone 16 and converts it into a digital audio signal. Conversely, when it acquires a digital audio signal, the audio control unit 36 converts it into an analog audio signal under the control of the main control unit 30 and outputs it as sound from the receiver 15.
  • Under the control of the main control unit 30, the communication control unit 37 restores data by despreading the spectrum of the signal received from the base station via the antenna 38.
  • Depending on the instruction from the main control unit 30, this data is transmitted to the audio control unit 36 and output from the receiver 15, transmitted to the display control unit 34 and displayed on the display 35, or recorded in the storage unit 39.
  • When the communication control unit 37 acquires audio data collected by the microphone 16, data input via the touch panel 14, or data stored in the storage unit 39, it performs spectrum spreading on the data under the control of the main control unit 30 and transmits it to the base station via the antenna 38.
  • The storage unit 39 is composed of ROM (Read Only Memory), a hard disk, non-volatile memory, or a database that stores the processing programs executed by the main control unit 30 and the data those processes require, and RAM (Random Access Memory) that temporarily stores data used while the main control unit 30 performs processing.
  • As described above, the mobile terminal 1 in the present embodiment includes the touch panel 14.
  • The touch panel 14 receives, via its operation surface, instructions directed at images and other items displayed on the display 35.
  • Input operations are performed on the touch panel 14 with a user's finger or a stylus pen.
  • For example, the user can hold the casing 11 with one hand and operate the touch panel 14 with a finger of the other hand, or with a finger of the hand holding the casing 11.
  • FIG. 3 is a diagram showing an example of giving an operation instruction to the cursor displayed on the touch panel 14.
  • On the touch panel 14, a cursor 41, an image used to point at the target of an operation, is displayed.
  • At a predetermined position on the touch panel 14, an area assigned as an operation pad 14a, which exclusively receives operation instructions for the cursor 41, is displayed.
  • The input control unit 32 detects input position information at predetermined intervals (for example, every 7 ms) from the user's input operation of moving the finger F while it touches the pad.
  • The main control unit 30 performs display processing that moves the cursor 41 based on the detected input position information.
  • For example, when the mobile terminal 1 detects the finger F moving in the direction of arrow A on the operation pad 14a, the cursor 41 moves correspondingly in the direction of arrow B.
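As an illustration of this arrow-A-to-arrow-B correspondence, the sketch below translates the finger's displacement on the operation pad into a cursor displacement. The `gain` scaling factor is an assumption; the text states only that the cursor moves correspondingly.

```python
def moved_cursor(cursor_xy, prev_pt, curr_pt, gain=1.0):
    """Translate the finger's displacement on the operation pad (arrow A)
    into a cursor displacement on the display (arrow B)."""
    dx = (curr_pt.x - prev_pt.x) * gain  # gain is an illustrative scaling factor
    dy = (curr_pt.y - prev_pt.y) * gain
    return (cursor_xy[0] + dx, cursor_xy[1] + dy)
```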
  • FIG. 4 is a diagram conceptually showing the input position information the touch sensor detects for an input operation.
  • FIG. 4(A) shows the input position information detected when an input operation is performed with a stylus pen.
  • FIG. 4(B) shows the input position information detected when an input operation is performed with a finger.
  • FIG. 4 plots, as × marks, the detection points of input position information produced by the touch sensor when contact with the touch panel continues at substantially the same position for a certain time.
  • A stylus pen has a very small contact area with the operation surface, and the contact point is unlikely to shift with how the user holds the pen.
  • As shown in FIG. 4(A), the touch sensor can detect the input position information from essentially a single contact point, so the operation instruction it detects matches the one the user intended.
  • When an input operation is performed with a finger, by contrast, the fingertip or the pad of the finger is generally used. Unlike a stylus pen, a finger has a large contact area with the operation surface, so subtle variations in finger pressure can cause the touch sensor to detect input position information scattered across the finger's contact range. As shown in FIG. 4(B), the touch sensor detects a large number of scattered contact points that depend on the finger pressure, producing input position information different from the operation instruction the user expected. As a result, the mobile terminal may execute processing different from the operation instruction the user intended.
  • FIG. 5 is a diagram illustrating an example of a cursor display process based on an operation instruction received through an input operation with a finger.
  • When the finger F is moved on the operation pad 14a in the direction of arrow C to input an operation instruction, display processing moves the cursor 41 correspondingly in the direction of arrow D. As described above, however, subtle variations in finger pressure cause many pieces of input position information to be detected within the contact range, so the cursor 41 moves while being displayed in a scattered state, as shown in FIG. 5.
  • When the user finishes inputting an operation instruction, the user releases the finger F from the operation pad 14a (a release operation).
  • During this release operation, even if the user merely intends to lift the finger F, some part of the finger, with its large contact area, may brush the operation pad 14a with subtle pressure.
  • When that happens, input position information based on an unintended operation is detected, and display processing of the cursor 41 that the user did not intend is performed.
  • The mobile terminal 1 in the present embodiment achieves accurate display processing by suitably invalidating operation instructions the user did not intend, even when instructions are received via a finger with a large contact area.
  • This improves the visibility and operability of the touch panel.
  • The invalidation of operation instructions accompanied by movement and the invalidation of operation instructions accompanied by release are described in detail below.
  • FIG. 6 is a flowchart explaining the invalidation of operation instructions accompanied by movement, executed by the main control unit 30 of the mobile terminal 1 in the present embodiment.
  • In step S1, the main control unit 30 detects an instruction to move the cursor given on the operation pad 14a, the operation unit displayed on the touch panel 14.
  • The main control unit 30 acquires the distance between two consecutively detected contact points from the input position information, i.e., the information about contact points on the operation pad 14a.
  • Because the touch panel 14 acquires input position information at fixed intervals (for example, every 7 ms), the main control unit 30, acting as the speed calculation unit, can judge whether the movement between contact points is fast or slow simply from the distance between them: a large distance means a fast movement, and a small distance a slow one.
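A sketch of this distance-as-speed test, reusing the `ContactPoint` type from the earlier sketch: because the sampling interval is fixed, comparing the distance between consecutive contact points against a threshold is equivalent to comparing the movement speed.

```python
import math

SAMPLE_PERIOD_MS = 7  # fixed sampling interval, so distance is a direct proxy for speed

def inter_point_distance(p, q):
    """Distance between two consecutively detected contact points:
    a large value means a fast movement, a small value a slow one."""
    return math.hypot(q.x - p.x, q.y - p.y)
```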
  • In step S2, the main control unit 30 determines whether the acquired distance between the two contact points is less than or equal to a threshold.
  • The threshold is a predetermined value, set so that a movement exceeding it can be regarded as a fast movement detected unintentionally because of the subtle pressure of the user's finger.
  • If the main control unit 30 determines that the distance between the contact points is greater than the threshold (NO in step S2), then in step S3 it determines whether the distance between at least one pair of contact points acquired before the pair that exceeded the threshold is less than or equal to the threshold. In other words, the main control unit 30 determines whether the fast movement that exceeded the threshold came after a slow movement; only a fast movement following a slow one can be regarded as a sudden fast movement the user did not intend.
  • The number of earlier movements considered in operation determination step S3 is not particularly limited. For example, only the speed of the movement immediately before the fast movement may be considered, or the speeds of movements two or more steps back may also be considered.
  • Operation determination step S3 may also be omitted from the invalidation process.
  • In step S4, the main control unit 30 invalidates the movement between the two contact points targeted in distance acquisition step S1. Because that movement is a sudden, fast one, it is regarded as input position information detected from an input operation the user did not intend; it is therefore invalidated and not referenced in the cursor display process.
  • In step S5, the main control unit 30 performs the cursor display process on the touch panel 14 based on the input position information.
  • If the distance between the contact points is less than or equal to the threshold in distance determination step S2 (YES in step S2), or if operation determination step S3 finds that the fast movement did not follow a slow movement (NO in step S3), the main control unit 30 performs cursor display control corresponding to the detected input position information as it is. In these cases the movement between the contact points is slow, or at least is not a sudden fast movement, so it can be regarded as input position information detected from an input operation the user intended.
  • If the distance between the contact points is greater than the threshold (NO in step S2) and operation determination step S3 determines that the fast movement followed a slow movement (YES in step S3), the main control unit 30 performs display control of the cursor 41 using input position information in which the movement between those contact points has been invalidated.
  • This display control process is executed at fixed intervals (for example, every 14 ms); in other words, the cursor is moved and redrawn at each interval.
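Putting steps S1 to S5 together, the sketch below filters a sequence of contact points, dropping a movement only when it exceeds the threshold and follows a slow movement. The one-segment look-back for step S3 is one of the options the text allows (step S3 may also be omitted), and `inter_point_distance` is reused from the earlier sketch.

```python
def filter_movements(points, threshold):
    """Steps S1-S5: drop movements that are plausibly unintended.

    points: contact points detected consecutively, oldest first.
    Returns the (start, end) segments that remain valid for display.
    """
    valid = []
    prev_dist = None
    for p, q in zip(points, points[1:]):
        dist = inter_point_distance(p, q)   # S1: distance between consecutive contact points
        is_fast = dist > threshold          # S2: compare against the threshold
        after_slow = prev_dist is not None and prev_dist <= threshold  # S3: one-segment look-back
        if not (is_fast and after_slow):    # S4: a sudden fast movement is invalidated
            valid.append((p, q))
        prev_dist = dist
    return valid                            # S5: the display process references only these
```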
  • FIG. 7 is a table showing an example of the movement distances referenced in two consecutive display control processes (shown as the nth and (n+1)th in FIG. 7) among the display control processes executed at each interval.
  • FIG. 8 is a diagram showing the trajectory of the input operation that the main control unit 30 obtained from the input position information referenced in the nth display control process.
  • FIG. 9 is a diagram showing the movement locus of the cursor 41 based on the input operation of FIG. 8.
  • FIG. 10 is a diagram showing the trajectory of the input operation that the main control unit 30 obtained from the input position information referenced in the (n+1)th display control process.
  • FIG. 11 is a diagram showing the movement locus of the cursor based on the input operation of FIG. 10.
  • That is, FIGS. 8 and 10 show the trajectories of input operations that the main control unit 30 obtained from the input position information of operations performed on the operation pad 14a, while FIGS. 9 and 11 show the corresponding cursor loci displayed on the display 35 of the touch panel 14.
  • The display control process of FIG. 7 is executed, for example, every 14 ms, using the input position information detected during that interval. When input position information is detected every 7 ms, the main control unit 30 takes the average of the two samples as the newly detected input position information and uses it in the display control process.
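A minimal sketch of this two-sample averaging, reusing the `ContactPoint` type from earlier; pairing exactly two 7 ms samples per 14 ms display cycle follows the example given in the text.

```python
def average_pairs(samples):
    """Collapse two consecutive 7 ms samples into one position per 14 ms display cycle."""
    return [ContactPoint((a.x + b.x) / 2, (a.y + b.y) / 2)
            for a, b in zip(samples[0::2], samples[1::2])]
```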
  • In FIG. 7, the movement distance obtained from the newly acquired input position information is shown on the right side of each cycle.
  • The movement distance on the left is the one obtained in the previous display control process.
  • In FIGS. 8 to 11, the distance "CurrentDist" is the movement distance obtained from the input position information newly acquired in each display control process and the input position information detected in the previous display control process.
  • The distance "Dist2" is the movement distance that served as "CurrentDist" in the previous display control process.
  • The distance "Dist1" is the movement distance that served as "CurrentDist" in the display control process before that.
  • In each display control process, the main control unit 30 determines whether the movement distance obtained from the newly acquired input position information is less than or equal to the threshold, here set to 5 (distance determination step S2 in FIG. 6). If the movement distance is at most 5, the main control unit 30 regards the user's input operation as a slow, valid movement and performs the cursor display process according to the obtained input position information. If the movement distance is greater than 5 and the movement is a fast movement following a slow one, the main control unit 30 regards it as an invalid operation and performs the display process with the cursor movement based on that input position information invalidated.
  • In the nth display control process, the main control unit 30 invalidates the movement of distance CurrentDist (step S4) and performs the cursor display process with the movements of distance Dist1 and distance Dist2 treated as valid (FIG. 9, step S5).
  • The dotted line in FIG. 9 shows the locus of the invalidated cursor movement.
  • The main control unit 30 then acquires the movement distance CurrentDist obtained from the input position information newly acquired in the (n+1)th display control process. In the (n+1)th display control process shown in FIGS. 7 and 10, the distance CurrentDist is 5, so the main control unit 30 determines that the user's input operation is a slow movement (YES in step S2).
  • The main control unit 30 obtains the angle θ formed by the straight line derived from the locus of the last validly moved cursor in the nth display control process (the nth Dist2 locus) and the straight line derived from the cursor locus treated as valid in the (n+1)th display control process (the (n+1)th CurrentDist).
  • The main control unit 30 then performs the cursor display process so that the last cursor locus displayed in the nth display control process (the nth Dist2 locus) and the cursor locus treated as valid in the (n+1)th display control process (the (n+1)th CurrentDist) form this angle θ. That is, the main control unit 30 controls the display so that the starting point of the first valid cursor locus after the invalidation coincides with the end point of the last valid cursor position.
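One way to realize this reconnection is sketched below: compute the angle θ between the two valid segments and restart the drawn trajectory from the end point of the last valid cursor position. Representing the segments as displacement vectors is an assumption of the sketch.

```python
import math

def reconnect_after_invalidation(last_valid_end, dist2_vec, current_vec):
    """Resume the cursor from the end point of the last valid movement.

    dist2_vec:   (dx, dy) of the nth cycle's last valid segment (Dist2).
    current_vec: (dx, dy) of the (n+1)th cycle's valid segment (CurrentDist).
    Returns the new cursor position and the angle theta between the segments.
    """
    theta = math.atan2(current_vec[1], current_vec[0]) - math.atan2(dist2_vec[1], dist2_vec[0])
    new_end = (last_valid_end[0] + current_vec[0], last_valid_end[1] + current_vec[1])
    return new_end, theta
```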
  • In the example described here, each display control process uses as its input position information the average of the input position information detected every 7 ms.
  • However, the input position information detected from the contact points acquired every 7 ms may also be used as it is in the display control process.
  • By executing this invalidation of operation instructions accompanied by movement, the mobile terminal 1 can ignore the fast finger movements described above and suppress the scattering of the cursor 41 shown in FIG. 5.
  • FIG. 12 is a diagram showing the trajectory of an input operation given with the pad of the thumb and the cursor movement locus based on that input operation.
  • The touch panel 14 shown in FIG. 12 measures, for example, 88 mm long by 52 mm wide. In FIG. 12, the finger performing the input operation and the cursor itself are omitted.
  • FIG. 13 is a flowchart explaining the invalidation of operation instructions accompanied by a release, executed by the main control unit 30 of the mobile terminal 1 in the present embodiment.
  • In step S11, the main control unit 30 determines whether the detection of input position information has ended. While it determines that detection is still continuing, the main control unit 30 waits for it to end.
  • When detection ends, in step S12 the main control unit 30, acting as the direction calculation unit, acquires a plurality of movement directions between contact points from the input position information of a predetermined number of contact points (for example, three) detected consecutively immediately before detection ended.
  • The number of contact points from which the movement directions are acquired may also be more than three.
  • The main control unit 30 determines the movement direction of the contact points using, for example, the X axis, which matches the short-side direction of the touch panel 14, or the Y axis, which matches the long-side direction. Specifically, from the input position information of the predetermined number of consecutively detected contact points, the main control unit 30 refers to the X-axis or Y-axis coordinate value of each point and obtains the direction of each movement between contact points with respect to that axis.
  • In step S13, the main control unit 30 determines whether the plurality of movement directions between the predetermined number of contact points match.
  • The movement directions match when the X coordinate values or Y coordinate values change uniformly, all increasing or all decreasing, as the contact points move. They do not match when those coordinate values do not change uniformly with the movement, that is, when they move in both the increasing and the decreasing direction.
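This uniformity test can be stated compactly: the movement directions match exactly when the chosen coordinate values of the consecutive contact points are strictly monotonic. A minimal sketch follows; treating equal consecutive coordinates as a mismatch is an assumption the text does not settle.

```python
def directions_match(coords):
    """True if the given X (or Y) coordinate values change uniformly,
    i.e. strictly increase or strictly decrease between consecutive points."""
    deltas = [b - a for a, b in zip(coords, coords[1:])]
    return all(d > 0 for d in deltas) or all(d < 0 for d in deltas)
```

For the last three contact points, the check along the X axis would be `directions_match([p.x for p in points])`.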
  • If the movement directions acquired in movement direction acquisition step S12 do not match (NO in step S13), in step S14 the main control unit 30 invalidates the movement between the acquired predetermined number of contact points. In this case the movement direction changed abruptly at the moment of release, so the movement can be regarded as a movement instruction based on an unintended operation accompanying the release.
  • If the main control unit 30 determines in step S13 that the plurality of movement directions between the predetermined number of contact points match (YES in step S13), in step S15 it invalidates the movement between the last two contact points detected in movement direction acquisition step S12. Even when the contact points move in the same direction, there is a high possibility that a movement the user did not intend is detected, under the influence of subtle finger pressure, at the moment the release is performed.
  • In step S16, the main control unit 30 controls the display of the cursor based on the input position information, taking into account the movements invalidated in invalidation steps S14 and S15.
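Combining steps S13 to S15, the sketch below decides which of the last three contact points remain valid on release, reusing `directions_match` from the previous sketch. Checking along the X axis mirrors the example in the following figures; the choice of axis is an assumption.

```python
def surviving_points_on_release(last_points):
    """Steps S13-S15 applied to the contact points detected just before release.

    last_points: the predetermined number of points, here three (Prev2, Prev1, Curr).
    Returns the points whose movements remain valid for the display process (S16).
    """
    prev2, prev1, curr = last_points[-3:]
    xs = [prev2.x, prev1.x, curr.x]   # S13 along the X axis; the Y axis works the same way
    if directions_match(xs):
        return last_points[:-1]       # S15: invalidate only the last movement (Prev1 -> Curr)
    return last_points[:-2]           # S14: invalidate the movements between all three points
```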
  • FIG. 14 is a diagram showing the trajectory of an input operation that the main control unit 30 obtained from the input position information when the movement directions between the three contact points match.
  • FIG. 15 is a diagram showing the cursor movement locus based on the input operation of FIG. 14.
  • FIG. 16 is a diagram showing the trajectory of an input operation that the main control unit 30 obtained from the input position information when the movement directions between the three contact points do not match.
  • FIG. 17 is a diagram showing the cursor movement locus based on the input operation of FIG. 16. That is, FIGS. 14 and 16 show the trajectories of input operations that the main control unit 30 obtained from the input position information of operations performed on the operation pad 14a, while FIGS. 15 and 17 show the corresponding cursor loci displayed on the display 35 of the touch panel 14.
  • As in FIG. 7 described above, the input position information used to acquire the movement directions in FIGS. 14 to 17 is, for example, the average of two samples of input position information acquired every 7 ms.
  • In the case shown in FIG. 14, because the movement directions between the contact points match, the main control unit 30 invalidates the movement between the last detected point Prev1 and the point Curr (step S15).
  • Taking this invalidated movement into account, the main control unit 30 moves the cursor on the touch panel 14 (display 35) as shown in FIG. 15 (step S16).
  • In FIG. 15, the dotted line shows the locus of the invalidated cursor movement.
  • In the case shown in FIG. 16, the main control unit 30 determines that the movement directions between the contact points do not match along the X axis (NO in step S13).
  • The main control unit 30 therefore invalidates the movements between the three points (point Prev2, point Prev1, and point Curr) (step S14).
  • Taking these invalidated movements into account, the main control unit 30 moves the cursor on the touch panel 14 as shown in FIG. 17 (step S16).
  • In this way, the mobile terminal 1 can invalidate movements caused by subtle finger pressure during an operation accompanied by a release, further improving the detection accuracy of operation instructions.
  • When the movement directions match, the mobile terminal 1 invalidates the movement between the last detected contact points, which yields cursor display processing close to the user's feel of the operation.
  • When the movement directions do not match, the mobile terminal 1 invalidates not only the movement between the last detected contact points but also the movement between the contact points detected before them, invalidating more movement than in the matching case; this likewise yields cursor display processing close to the user's feel of the operation.
  • The input position information used in the invalidation of operation instructions accompanied by a release is not limited to three contact points; input position information from more than three points may be used.
  • In the example above, when the movement directions of the contact points acquired immediately before the release match, the movement between the last two detected contact points is invalidated, and when they do not match, the movements between all of the acquired contact points are invalidated. However, the movements to be invalidated are not limited to this example; movements between more or fewer contact points may be invalidated.
  • As described above, the movement of the cursor can be stabilized, and the display processing that moves the cursor can be performed with higher accuracy.
  • Although the mobile terminal 1 in this embodiment has been described with a configuration including the touch panel 14, in which the display 35 and the touch sensor 33 are integrated, the touch sensor 33 may instead be a pointing device, such as a touch pad, configured separately from the display 35.
  • The mobile terminal according to the present invention can be applied to a mobile phone, a PDA (Personal Digital Assistant), a personal computer, a portable game machine, a portable music player, a portable video player, and other portable terminals.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Position Input By Displaying (AREA)
  • User Interface Of Digital Computer (AREA)
  • Telephone Function (AREA)

Abstract

A mobile terminal comprises: a display unit for displaying a predetermined image; an operation unit for receiving an instruction to move the image; a detection unit for detecting information on contact points on the operation unit that received the instruction; a speed calculation unit for obtaining, from the information on the multiple contact points detected consecutively by the detection unit, the movement speed between the contact points; and a control unit for, when the movement speed calculated by the speed calculation unit is greater than a threshold value, controlling the display of the image on the basis of contact-point information in which movements between contact points whose movement speed exceeds the threshold value are invalidated.

Description

Mobile terminal
 The present invention relates to a portable terminal equipped with a pointing device such as a touch pad or a touch panel.
 A mobile terminal such as a mobile phone is equipped with various input devices for receiving operation instructions from the user. Among these input devices, devices that accept operation instructions through intuitive gestures, such as touch pads and touch panels, are known (see, for example, Japanese Patent Application Laid-Open No. 2008-258805). Such a touch pad or touch panel accepts an operation instruction based on input position information obtained by a touch sensor that senses the change in capacitance or contact pressure caused by contact with the operation surface.
 When inputting an operation instruction on a touch panel or touch pad, a stylus pen or a finger is generally used.
 A stylus pen has a very small contact area with the operation surface, and the contact point is unlikely to shift with how the user holds the pen. The touch sensor can therefore detect an input operation that matches the operation instruction the user intended.
 A finger, by contrast, has a large contact area with the operation surface. Under the influence of subtle variations in finger pressure, the touch sensor may therefore detect input position information dispersed across the finger's contact range.
 The mobile terminal executes various processes, such as display processing, based on the input position information detected by the touch sensor. However, when that input position information shifts under subtle finger pressure as described above, the terminal may execute processing different from the operation instruction the user intended. The user therefore could not obtain sufficient operability when performing input operations with a finger.
DISCLOSURE OF THE INVENTION
 The present invention has been made in view of such circumstances, and its object is to provide a portable terminal that can stabilize the movement of an image such as a cursor and perform display processing with higher accuracy.
 To solve the above problem, a mobile terminal according to the present invention includes: a display unit that displays a predetermined image; an operation unit that receives an instruction to move the image; a detection unit that detects information on contact points on the operation unit that received the instruction; a speed calculation unit that obtains the movement speed between contact points from the information on the plurality of contact points detected consecutively by the detection unit; and a control unit that, when the movement speed calculated by the speed calculation unit is greater than a threshold, performs display control of the image based on contact-point information in which the movement between those contact points is invalidated.
 A portable terminal according to the present invention also includes: a display unit that displays a predetermined image; an operation unit that receives an instruction to move the image; a detection unit that detects information on contact points on the operation unit that received the instruction; a direction calculation unit that, when the detection unit finishes detecting contact-point information, obtains a plurality of movement directions between the contact points from the information on the plurality of contact points detected consecutively immediately before detection ended; and a control unit that, when the plurality of movement directions calculated by the direction calculation unit do not match, performs display control of the image based on contact-point information in which at least the movement between the contact points whose movement direction changed is invalidated.
 The mobile terminal according to the present invention can thus stabilize the movement of an image such as a cursor and perform display processing with higher accuracy.
BRIEF DESCRIPTION OF THE DRAWINGS
 FIG. 1 is an external perspective view showing an embodiment of the portable terminal according to the present invention.
 FIG. 2 is a schematic functional block diagram showing the main functional configuration of the mobile terminal in this embodiment.
 FIG. 3 is a diagram showing an example of giving an operation instruction to the cursor displayed on the touch panel.
 FIG. 4 is a diagram conceptually showing the input position information the touch sensor detects for an input operation.
 FIG. 5 is a diagram showing an example of cursor display processing based on an operation instruction received through an input operation with a finger.
 FIG. 6 is a flowchart explaining the invalidation of operation instructions accompanied by movement, executed by the main control unit of the mobile terminal in this embodiment.
 FIG. 7 is a table showing an example of the movement distances obtained from the input position information referenced in two consecutive display control processes among the display control processes executed at each predetermined interval.
 FIG. 8 is a diagram showing the trajectory of the input operation that the main control unit obtained from the input position information referenced in the nth display control process.
 FIG. 9 is a diagram showing the cursor movement locus based on the input operation of FIG. 8.
 FIG. 10 is a diagram showing the trajectory of the input operation that the main control unit obtained from the input position information referenced in the (n+1)th display control process.
 FIG. 11 is a diagram showing the cursor movement locus based on the input operation of FIG. 10.
 FIG. 12 is a diagram showing the input position information based on an operation instruction detected from an input operation with the pad of the thumb, and the cursor movement locus accompanying that input operation.
 FIG. 13 is a flowchart explaining the invalidation of operation instructions accompanied by a release, executed by the main control unit of the mobile terminal in this embodiment.
 FIG. 14 is a diagram showing the trajectory of an input operation that the main control unit obtained from the input position information when the movement directions between the three contact points match.
 FIG. 15 is a diagram showing the cursor movement locus based on the input operation of FIG. 14.
 FIG. 16 is a diagram showing the trajectory of an input operation that the main control unit obtained from the input position information when the movement directions between the three contact points do not match.
 FIG. 17 is a diagram showing the cursor movement locus based on the input operation of FIG. 16.
 Embodiments of the portable terminal according to the present invention will be described with reference to the accompanying drawings. As an example of a portable terminal to which the present invention is applied, a card-shaped terminal on which the user inputs operation instructions by touching the display with a finger is described.
 FIG. 1 is an external perspective view showing an embodiment of the portable terminal according to the present invention.
 The mobile terminal 1 includes a rectangular plate-shaped casing 11. A touch panel 14 occupies most of one surface of the casing 11.
 The touch panel 14 has both the functions of a display unit and a detection unit.
 The touch panel 14 as a display unit is a display (display 35 in FIG. 2) provided with an area for displaying a display screen composed of characters, images, and the like. This display may be, for example, an LCD (Liquid Crystal Display), an organic EL (electroluminescence) display, or an inorganic EL display.
 The touch panel 14 as a detection unit is a touch sensor (touch sensor 33 in FIG. 2) that detects contact on the operation surface as input position information. The touch sensor consists of a plurality of elements, arranged on the upper surface of the display, for detecting contact, and a transparent operation surface laminated on top of them. To detect contact on the touch panel 14, a pressure-sensitive method that senses changes in pressure, a capacitive (electrostatic) method that senses electric signals caused by static electricity, or other methods can be applied.
 The input position information is coordinate information indicating the position where the contact occurred. It is expressed as coordinate values on two axes, for example with the short-side direction of the touch panel 14 as the X axis and the long-side direction as the Y axis.
 A receiver 15 for outputting sound and a microphone 16 for inputting sound are disposed on the casing 11, facing each other across the touch panel 14 in the longitudinal direction.
 FIG. 2 is a schematic functional block diagram showing the main functional configuration of the mobile terminal 1 in this embodiment. The mobile terminal 1 is configured by connecting a main control unit 30, a power supply circuit unit 31, an input control unit 32, a display control unit 34, an audio control unit 36, a communication control unit 37, and a storage unit 39 so that they can communicate with each other via a bus.
 The main control unit 30 includes a CPU (Central Processing Unit). It operates based on various programs stored in the storage unit 39 and performs overall control of the mobile terminal 1.
 The power supply circuit unit 31 includes a power supply source (not shown). It switches the power of the mobile terminal 1 ON and OFF in response to a power-on operation, and while the power is ON it supplies power from the power supply source to each unit, enabling the mobile terminal 1 to operate.
 The input control unit 32 includes an input interface for the touch sensor 33. At predetermined intervals (for example, every 7 ms), it receives the detection signal from the touch sensor 33 as input position information indicating the coordinates of the input position, generates a signal indicating that input, and transmits it to the main control unit 30.
 The display control unit 34 includes a display interface for the display 35. Under the control of the main control unit 30, it causes the display 35 to display images based on document data and image signals.
 Under the control of the main control unit 30, the audio control unit 36 generates an analog audio signal from the sound collected by the microphone 16 and converts it into a digital audio signal. Conversely, when it acquires a digital audio signal, the audio control unit 36 converts it into an analog audio signal under the control of the main control unit 30 and outputs it as sound from the receiver 15.
 Under the control of the main control unit 30, the communication control unit 37 restores data by despreading the spectrum of the signal received from the base station via the antenna 38. Depending on the instruction from the main control unit 30, this data is transmitted to the audio control unit 36 and output from the receiver 15, transmitted to the display control unit 34 and displayed on the display 35, or recorded in the storage unit 39. When the communication control unit 37 acquires audio data collected by the microphone 16, data input via the touch panel 14, or data stored in the storage unit 39, it performs spectrum spreading on the data under the control of the main control unit 30 and transmits it to the base station via the antenna 38.
 The storage unit 39 is composed of ROM (Read Only Memory), a hard disk, non-volatile memory, or a database that stores the processing programs executed by the main control unit 30 and the data those processes require, and RAM (Random Access Memory) that temporarily stores data used while the main control unit 30 performs processing.
 As described above, the mobile terminal 1 in this embodiment includes the touch panel 14. The touch panel 14 receives, via its operation surface, instructions directed at images and other items displayed on the display 35. Input operations are performed on the touch panel 14 with a user's finger or a stylus pen. For example, the user can hold the casing 11 with one hand and operate the touch panel 14 with a finger of the other hand, or with a finger of the hand holding the casing 11.
 FIG. 3 is a diagram showing an example of giving an operation instruction to the cursor displayed on the touch panel 14.
 On the touch panel 14, a cursor 41, an image used to point at the target of an operation, is displayed. At a predetermined position on the touch panel 14, an area assigned as an operation pad 14a, which exclusively receives operation instructions for the cursor 41, is displayed. The input control unit 32 detects input position information at predetermined intervals (for example, every 7 ms) from the user's input operation of moving the finger F while it touches the pad. The main control unit 30 performs display processing that moves the cursor 41 based on the detected input position information.
 For example, when the mobile terminal 1 detects the finger F moving in the direction of arrow A on the operation pad 14a, the cursor 41 moves correspondingly in the direction of arrow B.
 Here, when input is performed on the touch panel 14 (operation pad 14a) with a finger, the finger's large contact area with the operation pad 14a poses a problem for operability.
 FIG. 4 is a diagram conceptually showing the input position information the touch sensor detects for an input operation. FIG. 4(A) shows the input position information detected when an input operation is performed with a stylus pen. FIG. 4(B) shows the input position information detected when an input operation is performed with a finger. FIG. 4 plots, as × marks, the detection points of input position information produced by the touch sensor when contact with the touch panel continues at substantially the same position for a certain time.
 A stylus pen has a very small contact area with the operation surface, and the contact point is unlikely to shift with how the user holds the pen. As shown in FIG. 4(A), the touch sensor can detect the input position information from essentially a single contact point, so the operation instruction it detects matches the one the user intended.
 When an input operation is performed with a finger, by contrast, the fingertip or the pad of the finger is generally used. Unlike a stylus pen, a finger has a large contact area with the operation surface, so subtle variations in finger pressure can cause the touch sensor to detect input position information scattered across the finger's contact range. As shown in FIG. 4(B), the touch sensor detects a large number of scattered contact points that depend on the finger pressure, producing input position information different from the operation instruction the user expected. As a result, the mobile terminal may execute processing different from the operation instruction the user intended.
 図5は、指による入力動作を介して受け付けた操作指示に基づくカーソルの表示処理の一例を示す図である。 FIG. 5 is a diagram illustrating an example of a cursor display process based on an operation instruction received through an input operation with a finger.
 指Fを操作パッド14a上で図示矢印Cの方向に動かして操作指示を入力すると、これに対応してカーソル41が図示矢印Dの方向に移動する表示処理が行われる。しかし、上述したように、指の微妙な力加減により、接触範囲内で入力位置情報が多数検出されてしまう。このため、図5に示すように、四散して検出された入力位置情報に応じて、カーソル41も四散した状態で表示されながら移動することになる。 When the finger F is moved on the operation pad 14a in the direction of the illustrated arrow C and an operation instruction is input, a display process is performed in which the cursor 41 moves in the direction of the illustrated arrow D correspondingly. However, as described above, a large amount of input position information is detected within the contact range due to the delicate force applied by the finger. For this reason, as shown in FIG. 5, according to the input position information detected in a scattered manner, the cursor 41 also moves while being displayed in a scattered state.
 また、ユーザは操作指示の入力を終了する場合、操作パッド14aより指Fを離す動作(リリース動作)を行う。このリリース動作において、ユーザは単に指Fを離したつもりであっても、大きい接触面積を有する指Fのいずれかの箇所が、微妙な力加減で操作パッド14aに触れてしまう場合がある。このため、操作パッド14a上でリリース動作が行われた場合には、意図しない動作に基づく入力位置情報を検出してしまい、ユーザにとって意図しないカーソル41の表示処理が行われてしまう。 In addition, when the user finishes inputting the operation instruction, the user performs an operation (release operation) of releasing the finger F from the operation pad 14a. In this release operation, even if the user simply intends to release the finger F, any part of the finger F having a large contact area may touch the operation pad 14a with delicate force. For this reason, when a release operation is performed on the operation pad 14a, input position information based on an unintended operation is detected, and display processing of the cursor 41 unintended for the user is performed.
 本実施形態における携帯端末1は、指のように接触面積が大きいものを介して受け付けた操作指示であってもユーザの意図しない操作指示を好適に無効とすることで、精度よく表示処理を実現し、タッチパネルの視認性、操作性を向上させることができるようになっている。以下、移動を伴う操作指示の無効処理およびリリースを伴う操作指示の無効処理について具体的に説明する。 The mobile terminal 1 according to the present embodiment realizes display processing with high accuracy by suitably invalidating an operation instruction not intended by the user even if the operation instruction is received via a finger having a large contact area. In addition, the visibility and operability of the touch panel can be improved. The operation instruction invalidation process with movement and the operation instruction invalidation process with release will be specifically described below.
FIG. 6 is a flowchart explaining the processing, executed by the main control unit 30 of the mobile terminal 1 in the present embodiment, that invalidates operation instructions involving movement.
In step S1, the main control unit 30 detects an instruction to move the cursor from the operation pad 14a, which serves as the operation unit displayed on the touch panel 14. From the input position information, that is, the information on contact points on the operation pad 14a, the main control unit 30 obtains the distance between two contact points detected in succession.
The touch panel 14 acquires input position information at predetermined intervals (for example, every 7 ms). Because the sampling interval is fixed, the main control unit 30, acting as the speed calculation unit, can tell from the distance between contact points whether the movement between them was fast or slow: a large distance between contact points means fast movement, and a small distance means slow movement.
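As a minimal sketch of this idea (the names Point, distance, and is_fast are illustrative and not taken from the patent), the distance between two samples taken at a fixed interval serves directly as a speed measure:

    import math
    from typing import NamedTuple

    SAMPLE_INTERVAL_MS = 7  # sensor sampling period assumed in the text

    class Point(NamedTuple):
        x: float
        y: float

    def distance(p1: Point, p2: Point) -> float:
        """Euclidean distance between two consecutive contact points."""
        return math.hypot(p2.x - p1.x, p2.y - p1.y)

    def is_fast(p1: Point, p2: Point, threshold: float = 5.0) -> bool:
        # Because samples arrive at a fixed interval, comparing distance
        # against a threshold is equivalent to comparing speed against one.
        return distance(p1, p2) > threshold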
In step S2, the main control unit 30 determines whether the distance between the two acquired contact points is at or below a threshold. The threshold used here is a predetermined value, set so that movement exceeding it can be regarded as a fast motion detected unintentionally because of subtle variations in the user's finger pressure.
If the main control unit 30 determines that the distance between the contact points is greater than the threshold (NO in step S2), then in step S3 it determines whether the distance between at least one pair of contact points acquired before the pair judged to exceed the threshold was at or below the threshold. In other words, the main control unit 30 determines whether the fast motion came after a slow motion, because only a fast motion following a slow one can be regarded as an abrupt fast motion the user did not intend.
The number of motions preceding the fast motion that are considered in the determination of step S3 is not particularly limited. For example, only the speed of the motion immediately before the fast motion may be considered, or the speeds of the motions two or more steps before it may also be considered.
Note that this motion determination step S3 may be omitted from the operation invalidation processing.
If the main control unit 30 determines that the motion is a fast motion following a slow one (YES in step S3), then in step S4 it invalidates the movement between the two contact points examined in the distance acquisition step S1. Because the movement between those contact points stems from an abrupt fast motion, it is regarded as input position information produced by an input operation the user did not intend. This fast motion is therefore invalidated and is not referenced when the cursor display processing is performed.
In step S5, the main control unit 30 performs the cursor display processing on the touch panel 14 based on the input position information. If the distance between the contact points was at or below the threshold in the distance determination step S2 (YES in step S2), or if the motion was not a fast motion following a slow one in the motion determination step S3 (NO in step S3), the main control unit 30 controls the cursor display according to the detected input position information. In these cases the movement between the contact points is slow, or does not stem from an abrupt fast motion, so it can be regarded as input position information produced by an input operation the user intended.
By contrast, when the distance between the contact points exceeds the threshold in the distance determination step S2 (NO in step S2), the motion is judged in the motion determination step S3 to be a fast motion following a slow one (YES in step S3), and the movement between the contact points is invalidated in the invalidation step S4, the main control unit 30 controls the display of the cursor 41 based on input position information with that movement between contact points invalidated.
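Putting steps S1 to S5 together, one possible sketch of the loop (building on the Point and distance helpers above; the function name filter_moves is illustrative) is:

    def filter_moves(points, threshold=5.0):
        """Yield cursor deltas, dropping any fast move (distance > threshold)
        that immediately follows a slow move (steps S1-S4 of FIG. 6)."""
        prev_dist = None  # distance of the previous move; None before the first move
        for p1, p2 in zip(points, points[1:]):
            d = distance(p1, p2)                 # step S1: distance between contacts
            fast = d > threshold                 # step S2: threshold comparison
            # step S3: a fast move counts as abrupt only after a slow move
            abrupt = fast and prev_dist is not None and prev_dist <= threshold
            prev_dist = d
            if abrupt:
                continue                         # step S4: invalidate this move
            yield (p2.x - p1.x, p2.y - p1.y)     # step S5: move used for display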
This concludes the description of the invalidation processing, shown in FIG. 6, for operation instructions involving movement.
Next, the display control processing of display step S5 in the invalidation processing of FIG. 6 is described. This display control processing is executed at predetermined intervals (for example, every 14 ms); that is, the cursor is moved and redrawn at each such interval.
FIG. 7 is a table showing an example of the movement distances referenced in two consecutive executions (labeled the n-th and (n+1)-th in FIG. 7) of the display control processing performed at each interval. FIG. 8 shows the trajectory of the input operation obtained by the main control unit 30 from the input position information referenced in the n-th execution, and FIG. 9 shows the resulting movement trajectory of the cursor 41. FIG. 10 shows the trajectory obtained from the input position information referenced in the (n+1)-th execution, and FIG. 11 shows the resulting movement trajectory of the cursor 41. That is, FIGS. 8 and 10 show the input-operation trajectories the main control unit 30 derives from input position information produced on the operation pad 14a, while FIGS. 9 and 11 show the cursor trajectories displayed on the display 35 of the touch panel 14.
In the display control processing described below, each execution considers the distances among a total of four pieces of input position information: the latest piece and the three pieces acquired before it. The display control processing of FIG. 7 runs every 14 ms, taking in the input position information detected since the previous execution. When input position information is detected every 7 ms, the main control unit 30 takes the average of two detections as the newly detected input position information used in the display control processing.
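A minimal sketch of this down-sampling step, assuming 7 ms sensor samples are paired into one 14 ms display sample (the name pair_average is illustrative):

    def pair_average(samples):
        """Average consecutive pairs of 7 ms sensor samples into single
        14 ms display samples, as described for FIG. 7. An unpaired
        trailing sample is dropped (assumption; the text does not say)."""
        return [Point((a.x + b.x) / 2, (a.y + b.y) / 2)
                for a, b in zip(samples[0::2], samples[1::2])]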
In FIG. 7, the movement distance obtained from the newly acquired input position information is shown on the right side of each row, and the distances obtained in earlier executions of the display control processing are shown on the left. Among the movement distances shown in FIGS. 8 to 11, the distance CurrentDist is obtained from the input position information newly acquired in the current execution and the input position information detected in the immediately preceding execution. The distance Dist2 is the movement distance that served as CurrentDist in the previous execution, and the distance Dist1 is the movement distance that served as CurrentDist in the execution before that.
The main control unit 30 determines whether the movement distance obtained from the newly acquired input position information in each execution is at or below the threshold, which is set to 5 (distance determination step S2 in FIG. 6). If the distance is at or below the threshold of 5, the main control unit 30 regards the user's input as a slow, valid motion and performs the cursor display processing according to the obtained input position information. If the distance exceeds the threshold of 5, meaning the user's input was a fast motion, and the movement follows a slow motion, the main control unit 30 regards it as an invalid motion and performs the display processing with the cursor movement based on that input position information invalidated.
In the n-th execution of the display control processing shown in FIGS. 7 and 8, the main control unit 30 judged the user's input to be a slow motion because the movement distance Dist1 obtained in the (n-2)-th execution was 3 (Dist1 = 3), and likewise because the distance Dist2 obtained in the (n-1)-th execution was 4 (Dist2 = 4). It judged the user's input to be a fast motion because the latest (newly obtained) movement distance CurrentDist was 6 (CurrentDist = 6) (NO in step S2), and it judged that this latest movement followed the slow motion of Dist2 (YES in step S3).
Based on these determinations, the main control unit 30 invalidated the movement of distance CurrentDist (step S4) and performed the cursor display processing with the movements of distances Dist1 and Dist2 treated as valid (FIG. 9, step S5). The dotted portion in FIG. 9 shows the invalidated cursor trajectory.
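Feeding the filter_moves() sketch above points spaced like the FIG. 7 example (moves of length 3, 4, 6, then 5 along one axis; the coordinates themselves are made up for illustration) reproduces this behavior:

    pts = [Point(0, 0), Point(3, 0), Point(7, 0), Point(13, 0), Point(18, 0)]
    kept = list(filter_moves(pts, threshold=5.0))
    # kept == [(3, 0), (4, 0), (5, 0)]: the 6-unit move (the n-th CurrentDist)
    # is invalidated, matching the dotted segment in FIG. 9.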
In the (n+1)-th execution of the display control processing, the main control unit 30 obtained the movement distance CurrentDist from the newly acquired input position information. In the (n+1)-th execution shown in FIGS. 7 and 10, the main control unit 30 judged the user's input to be a slow motion because CurrentDist was 5 (YES in step S2).
In the (n+1)-th execution, the main control unit 30 performs the cursor display processing with reference to input position information that includes the positions whose movement was invalidated, namely the points A and B (see FIG. 10) from which the invalidated CurrentDist = 6 of the n-th execution was obtained.
As shown in FIG. 10, the main control unit 30 obtains the angle α formed between the straight line derived from the last validly moved cursor trajectory of the n-th execution (the n-th Dist2 trajectory) and the straight line derived from the cursor trajectory treated as valid in the (n+1)-th execution (the (n+1)-th CurrentDist). As shown in FIG. 11, the main control unit 30 then performs the cursor display processing so that the last cursor trajectory displayed in the n-th execution (the n-th Dist2 trajectory) and the trajectory treated as valid in the (n+1)-th execution (the (n+1)-th CurrentDist) form this angle α. In other words, the main control unit 30 controls the cursor display so that the starting point of the first valid cursor trajectory after an invalidation coincides with the end point of the last valid trajectory.
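One way to realize this re-anchoring, sketched under the assumption that the display keeps a running cursor position and applies only the deltas that survive filtering (so an invalidated segment contributes no displacement, while the next valid delta still carries the direction computed from the actual contact points, preserving the angle α):

    class Cursor:
        """Tracks the displayed position; skipped (invalidated) moves add no
        displacement, so each valid segment starts where the last one ended,
        preserving the angle between consecutive valid segments."""
        def __init__(self, x: float = 0.0, y: float = 0.0):
            self.x, self.y = x, y

        def apply(self, deltas):
            for dx, dy in deltas:  # deltas as produced by filter_moves()
                self.x += dx
                self.y += dy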
In the display control processing described with FIGS. 7 to 11, each execution runs every 14 ms using, as its input position information, the average of two detections taken every 7 ms; alternatively, the input position information detected from the contact points acquired every 7 ms may be used in the display control processing as-is.
By executing this invalidation processing for operation instructions involving movement, the mobile terminal 1 can ignore the fast finger movements described above and suppress the scattering of the cursor 41 shown in FIG. 5.
FIG. 12 shows the trajectory of an operation instruction input with the pad of a thumb and the resulting movement trajectory of the cursor. The touch panel 14 shown in FIG. 12 is, as an example, one measuring 88 mm long by 52 mm wide. In FIG. 12, the finger performing the input operation and the cursor itself are omitted.
When an operation instruction to move the cursor is input with the pad of the thumb in the direction of arrow E on the operation pad 14a, the thumb's large contact area causes many pieces of input position information to be detected within the contact area under subtle pressure variations. By executing the invalidation processing for movement-related operation instructions of the present embodiment, however, the fast motions among the user's inputs are suitably ignored, and the cursor can be moved along a nearly straight trajectory in the direction of arrow F that the user intends.
Next, the invalidation processing for operation instructions involving a release is described.
FIG. 13 is a flowchart explaining the processing, executed by the main control unit 30 of the mobile terminal 1 in the present embodiment, that invalidates operation instructions involving a release.
In step S11, the main control unit 30 determines whether the detection of input position information has ended. If it determines that detection is still continuing, it waits until detection ends.
If it is determined that the detection of input position information has ended, then in step S12 the main control unit 30, acting as the direction calculation unit, obtains the movement directions between contact points from the input position information of a predetermined number of contact points (for example, three) detected in succession immediately before detection ended. The number of contact points from which movement directions are obtained may be any number of three or more.
The main control unit 30 determines the movement direction of the contact points using, for example, the X axis, which coincides with the short-side direction of the touch panel 14, or the Y axis, which coincides with its long-side direction. Specifically, the main control unit 30 refers to the X or Y coordinate value of each point in the input position information of the predetermined number of successively detected contact points, and determines the direction of movement between the contact points along the X or Y axis.
Note that when obtaining the movement direction of the contact points, only one of the two axes, X or Y, is used.
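A compact sketch of this single-axis direction test (the name directions_match is illustrative; how zero-length moves should be treated is not specified in the text, so this sketch treats them as breaking the match):

    def directions_match(points, axis: str = "x") -> bool:
        """True if movement along the chosen axis has the same sign for
        every consecutive pair of contact points (step S13 of FIG. 13)."""
        coords = [getattr(p, axis) for p in points]
        deltas = [b - a for a, b in zip(coords, coords[1:])]
        return all(d > 0 for d in deltas) or all(d < 0 for d in deltas)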
In step S13, the main control unit 30 determines whether the movement directions between the predetermined number of contact points all match. The directions match when, as the contact points move, the X (or Y) coordinate values move uniformly in the increasing direction or uniformly in the decreasing direction. The directions do not match when the coordinate values do not move uniformly, moving in both the increasing and decreasing directions.
If the main control unit 30 determines in step S13 that the movement directions between the predetermined number of contact points do not match (NO in step S13), then in step S14 it invalidates the movement between all of the contact points whose movement directions were obtained in the movement direction acquisition step S12. Because the movement direction of the contact points changed abruptly at release, the movement can be regarded as a movement instruction arising from an unintended motion accompanying the release operation.
The movement invalidated between contact points is not limited to this example; only the movement between the last two detected points may be invalidated instead.
If, on the other hand, the main control unit 30 determines in step S13 that the movement directions between the predetermined number of contact points match (YES in step S13), then in step S15 it invalidates only the movement between the last two detected points among the movements whose directions were obtained in step S12. Even when the movement directions of the contact points match, the subtle finger pressure at the moment of release is likely to produce a motion the user did not intend.
In step S16, the main control unit 30 controls the cursor display based on the input position information, taking into account the movements invalidated in steps S14 and S15.
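Combining steps S12 to S15, a hedged sketch of the release-time filtering (building on directions_match() above; the window of three points follows the example in the text, and the function name is illustrative):

    def moves_to_keep_on_release(points, axis: str = "x", window: int = 3):
        """Return the contact-point moves still used for display after a
        release. Directions disagreeing over the last `window` points drop
        every move in the window (step S14); agreeing directions drop only
        the final move (step S15)."""
        if len(points) < window:
            return []  # too few samples to judge (assumption; text is silent)
        tail = points[-window:]
        moves = list(zip(points, points[1:]))
        if directions_match(tail, axis):
            return moves[:-1]               # S15: drop only Prev1 -> Curr
        return moves[:-(len(tail) - 1)]     # S14: drop all moves in the window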
This concludes the description of the invalidation processing, shown in FIG. 13, for operation instructions involving a release.
Next, the display control processing of display step S16 in the release invalidation processing of FIG. 13 is described.
FIG. 14 shows the trajectory of an input operation, derived by the main control unit 30 from the input position information, for the case where the movement directions between three contact points match, and FIG. 15 shows the cursor movement trajectory based on that input. FIG. 16 shows the trajectory for the case where the movement directions between three contact points do not match, and FIG. 17 shows the corresponding cursor movement trajectory. That is, FIGS. 14 and 16 show the input-operation trajectories the main control unit 30 derives from input position information produced on the operation pad 14a, while FIGS. 15 and 17 show the cursor trajectories displayed on the display 35 of the touch panel 14.
As in FIG. 7 above, the input position information used to obtain the movement directions in FIGS. 14 to 17 is the average of two detections taken, for example, every 7 ms.
The display control processing of FIGS. 14 to 17 is described for the case where three pieces of input position information are used to obtain the movement directions. Among the points shown in FIGS. 14 to 17, the point Curr is the input position information detected last, immediately before the release operation. The point Prev1 is the input position information detected one step before Curr, and the point Prev2 is the input position information detected two steps before Curr. In FIGS. 14 to 17, the X coordinate value of each piece of input position information is used to judge the movement direction.
As shown in FIG. 14, the X coordinate of point Prev2 was 5 (x = 5), that of point Prev1 was 8 (x = 8), and that of point Curr was 9 (x = 9). Because the X coordinate value increases at every point, the main control unit 30 determines that the movement directions between the contact points all match along the X axis (YES in step S13).
The main control unit 30 therefore invalidates the movement between the last detected pair of contact points, point Prev1 and point Curr (step S15). Taking the invalidated movement into account, the main control unit 30 moves the cursor on the touch panel 14 (display 35) as shown in FIG. 15 (step S16). The dotted portion shows the invalidated cursor trajectory.
By contrast, as shown in FIG. 16, the X coordinate of point Prev2 was 6 (x = 6), that of point Prev1 was 7 (x = 7), and that of point Curr was 5 (x = 5). Because the X coordinate values move in both the increasing and decreasing directions, the main control unit 30 determines that the movement directions between the contact points do not match along the X axis (NO in step S13).
The main control unit 30 therefore invalidates the movement between all three points (point Prev2, point Prev1, and point Curr) (step S14). Taking the invalidated movements into account, the main control unit 30 moves the cursor on the touch panel 14 as shown in FIG. 17 (step S16).
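Checking the FIG. 14 and FIG. 16 coordinates against the directions_match() sketch above (the X values come from the text; the Y values are placeholders):

    fig14 = [Point(5, 0), Point(8, 0), Point(9, 0)]   # Prev2, Prev1, Curr
    fig16 = [Point(6, 0), Point(7, 0), Point(5, 0)]
    assert directions_match(fig14, axis="x")      # match: drop only Prev1 -> Curr
    assert not directions_match(fig16, axis="x")  # mismatch: drop all three points' moves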
By executing the processing described above, the mobile terminal 1 can invalidate motions caused by the subtle finger pressure that accompanies a release, further improving the detection accuracy of operation instructions.
When the movement directions of the contact points all match along one axis, the user's finger is considered to have left the operation pad 14a while drifting (excessively) in the direction the input operation was heading. By invalidating the movement between the last detected contact points, the mobile terminal 1 can therefore display the cursor in a way that is closer to the user's sense of the operation.
When the movement directions of the contact points do not match along one axis, the user's finger is considered to have left the operation pad 14a vertically; that is, a force directed opposite to the movement direction is considered to have acted (excessively) on the finger. The mobile terminal 1 therefore invalidates more movement than in the matching case, invalidating not only the movement between the last detected contact points but also the movement between the contact points detected before them, again displaying the cursor in a way that is closer to the user's sense of the operation.
The input position information of contact points used in the release invalidation processing is not limited to three points; input position information of three or more points may be used. Moreover, although the example above invalidates the movement between the last detected contact points when the movement directions acquired just before the release match, and invalidates more movement when the directions do not match, the movement to be invalidated is not limited to this example; the movement between more or fewer contact points may be invalidated.
Because the mobile terminal 1 executes the invalidation processing for both the movement-related and release-related operation instructions described above, it can stabilize the movement of the cursor and perform the display processing that moves the cursor with higher accuracy.
Although the present embodiment describes inputting operation instructions to the operation pad 14a with the finger F, the embodiment may also be applied when operation instructions are input through something other than the finger F that has a large contact area.
Although the mobile terminal 1 of the present embodiment has been described with a configuration including the touch panel 14, in which the display 35 and the touch sensor 33 are integrated, the touch sensor 33 may instead be a pointing device such as a touch pad configured separately from the display 35.
Furthermore, the mobile terminal according to the present invention can also be applied to mobile phones, PDAs (Personal Digital Assistants), personal computers, portable game machines, portable music players, portable video players, and other portable terminals.

Claims (8)

1. A mobile terminal comprising:
     a display unit configured to display a predetermined image;
     an operation unit configured to receive an instruction for moving the image;
     a detection unit configured to detect information on contact points on the operation unit at which the instruction has been received;
     a speed calculation unit configured to obtain a movement speed between the contact points from information on a plurality of contact points detected in succession by the detection unit; and
     a control unit configured to perform, when the movement speed calculated by the speed calculation unit is greater than a threshold, display control of the image based on the contact point information in which the movement between the contact points whose movement speed is greater than the threshold has been invalidated.
2. The mobile terminal according to claim 1, wherein the control unit performs display control of the image based on the contact point information in which the movement between the latest contact points has been invalidated, when the movement speed between the latest contact points is greater than the threshold and the movement speed between at least one pair of contact points acquired before the latest contact points is at or below the threshold.
3. The mobile terminal according to claim 1, wherein the control unit performs display control of the image with reference to contact point information that includes the information on the contact points whose movement has been invalidated.
4. The mobile terminal according to claim 1, wherein the control unit performs display control of the image such that the starting point of a valid movement of the image following an invalidated movement coincides with the end point of the last valid movement of the image.
5. A mobile terminal comprising:
     a display unit configured to display a predetermined image;
     an operation unit configured to receive an instruction for moving the image;
     a detection unit configured to detect information on contact points on the operation unit at which the instruction has been received;
     a direction calculation unit configured to obtain, when the detection unit finishes detecting the contact point information, a plurality of movement directions between the contact points from information on a plurality of contact points detected in succession immediately before the detection ends; and
     a control unit configured to perform, when the plurality of movement directions between the contact points calculated by the direction calculation unit do not match, display control of the image based on the contact point information in which at least the movement between the contact points at which the movement direction changed has been invalidated.
6. The mobile terminal according to claim 5, wherein when the plurality of movement directions between the contact points calculated by the direction calculation unit match, the control unit performs display control of the image based on the contact point information in which at least the movement between the last detected contact points has been invalidated.
7. The mobile terminal according to claim 5, wherein the control unit invalidates the movement between the last detected contact points when the plurality of movement directions between the contact points calculated by the direction calculation unit match, and invalidates the movement between a predetermined number of contact points, greater than in the case where the movement directions match, when the plurality of movement directions do not match.
8. The mobile terminal according to claim 1 or 5, wherein the predetermined image is a cursor that moves over at least a part of a screen displayed on the display unit, and
     the instruction received by the operation unit is an instruction to move the cursor on the screen.
PCT/JP2010/050540 2009-04-22 2010-01-19 Mobile terminal WO2010122825A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2009-104344 2009-04-22
JP2009104344A JP2012141650A (en) 2009-04-22 2009-04-22 Mobile terminal

Publications (1)

Publication Number Publication Date
WO2010122825A1 true WO2010122825A1 (en) 2010-10-28

Family

ID=43010949

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2010/050540 WO2010122825A1 (en) 2009-04-22 2010-01-19 Mobile terminal

Country Status (2)

Country Link
JP (1) JP2012141650A (en)
WO (1) WO2010122825A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2016115303A (en) * 2014-12-18 2016-06-23 株式会社東海理化電機製作所 Operation detection device

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH11134107A (en) * 1997-10-31 1999-05-21 Nec Home Electron Ltd Coordinate correcting method
JP2001084106A (en) * 1999-09-10 2001-03-30 Ricoh Co Ltd Coordinate inputting/detecting device and information storing medium
JP2005346235A (en) * 2004-06-01 2005-12-15 Sony Corp Coordinate detection unit, control method therefor, control program for coordinate detection unit, and recording medium recording program for control method of coordinate detection unit
JP2007122522A (en) * 2005-10-28 2007-05-17 Digital Electronics Corp Input device, touch panel input accepting method and operation indicator

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2014102751A (en) * 2012-11-21 2014-06-05 Yahoo Japan Corp Terminal, wrong operation determination device, operation method and program
GB2531370A (en) * 2014-06-20 2016-04-20 Panasonic Ip Man Co Ltd Electronic apparatus, control method, program, and server
US9542904B2 (en) 2014-06-20 2017-01-10 Panasonic Intellectual Property Management Co., Ltd. Electronic apparatus
US9880679B2 (en) 2014-06-20 2018-01-30 Panasonic Intellectual Property Management Co., Ltd. Electronic apparatus which effects touch coordinate based on proximity and strain
US10001880B2 (en) 2014-06-20 2018-06-19 Panasonic Intellectual Property Management Co., Ltd. Electronic apparatus which determines effectiveness of a touch coordinate based on an amount of bend
US10283075B2 (en) 2014-06-20 2019-05-07 Panasonic Intellectual Property Management Co., Ltd. Electronic apparatus which effects touch coordinate based on proximity and strain
GB2531370B (en) * 2014-06-20 2021-06-09 Panasonic Ip Man Co Ltd Electronic apparatus, control method, program, and server

Also Published As

Publication number Publication date
JP2012141650A (en) 2012-07-26

Similar Documents

Publication Publication Date Title
JP5610644B2 (en) Input device, input support method, and program
EP2332023B1 (en) Two-thumb qwerty keyboard
JP5759660B2 (en) Portable information terminal having touch screen and input method
US9459704B2 (en) Method and apparatus for providing one-handed user interface in mobile device having touch screen
JP5780438B2 (en) Electronic device, position designation method and program
US20100201644A1 (en) Input processing device
JP5222967B2 (en) Mobile device
US10514796B2 (en) Electronic apparatus
JP2009123004A (en) Input device for portable electronic device, and portable electronic device
JPWO2012070682A1 (en) Input device and control method of input device
JP2010224764A (en) Portable game machine with touch panel display
JP2006340370A (en) Input device by fingertip-mounting sensor
US20120268359A1 (en) Control of electronic device using nerve analysis
WO2014147717A1 (en) Electronic device, display control method, and program
WO2010122825A1 (en) Mobile terminal
JP5461735B2 (en) Input device, input support method, and program
WO2012111227A1 (en) Touch input device, electronic apparatus, and input method
JP2014215750A (en) Electronic device, calibration method, and program
GB2517284A (en) Operation input device and input operation processing method
KR20140140407A (en) Terminal and method for controlling multi-touch operation in the same
JP2015088147A (en) Touch panel input device and input processing program
JP5995171B2 (en) Electronic device, information processing method, and information processing program
JP2015169948A (en) Information processing device, information processing method, and information processing program
JP2012150849A (en) Input device of portable electronic device, and portable electronic device
US20120133603A1 (en) Finger recognition methods and systems

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 10766885

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 10766885

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP