WO2010119713A1 - Portable terminal - Google Patents

Portable terminal

Info

Publication number
WO2010119713A1
WO2010119713A1 (PCT/JP2010/050506)
Authority
WO
WIPO (PCT)
Prior art keywords
detection
position information
input
display
touch panel
Prior art date
Application number
PCT/JP2010/050506
Other languages
French (fr)
Japanese (ja)
Inventor
英樹 根本
託也 千葉
Original Assignee
株式会社東芝
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社東芝
Publication of WO2010119713A1 publication Critical patent/WO2010119713A1/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures

Definitions

  • the present invention relates to a portable terminal equipped with a pointing device such as a touch pad or a touch panel.
  • a mobile terminal such as a mobile phone is equipped with various input devices for receiving operation instructions from a user.
  • Among these input devices, one that can accept an operation instruction through an intuitive action, such as a touch pad or a touch panel, is known (see, for example, JP-A-2008-258805).
  • the touchpad and the touch panel are configured to accept an operation instruction based on input position information obtained by sensing a change in capacitance or contact pressure accompanying contact with the operation surface.
  • One of the input devices is an operation key that receives an operation instruction when pressed.
  • When the user inputs an operation instruction with the operation key, the user can determine whether or not the input was performed properly from the click feeling obtained when the key is pressed.
  • In the case of a touch panel or a touch pad, the user inputs an operation instruction by applying contact pressure to the operation surface, moving or touching it with a finger or the like. If the contact pressure on the operation surface is insufficient, the touch panel or the like cannot sense the contact.
  • Furthermore, unlike the operation keys described above, a touch panel or the like provides little pressing sensation, so it is difficult for the user to judge whether an operation instruction has been input correctly.
  • the process executed on the mobile terminal in response to the user's input operation may be different from the process expected by the user, or the process expected by the user may not be executed.
  • Such discrepancies between the input operation and the executed processing make the user perceive the terminal as having poor operability.
  • the present invention has been made in view of such circumstances, and an object of the present invention is to provide a portable terminal that can suitably perform a complementing process on a user's operation instruction and improve operability.
  • A mobile terminal according to the present invention includes: a display unit that displays a predetermined image; a detection unit that receives an instruction for the image via an operation surface and detects the instruction as input position information; a time acquisition unit that acquires the time from when the detection unit ends detection of the input position information until it next starts detection of the input position information; a complement processing unit that, when the time is less than a predetermined time, obtains complementary position information complementing the input position information from the end of the detection to the start of the detection, based on the input position information at the end of the detection and the input position information at the start of the detection; and a control unit that controls display of the image based on the detected input position information and the complementary position information.
  • the mobile terminal according to the present invention can suitably perform a complementing process on the user's operation instruction to improve operability.
  • As a portable terminal to which the present invention is applied, a portable terminal that is formed in a card shape and allows the user to input operation instructions by touching the display with a finger will be described as an example.
  • FIG. 1 is an external perspective view showing an embodiment of a portable terminal according to the present invention.
  • the mobile terminal 1 includes a rectangular plate-shaped casing 11.
  • a touch panel 14 occupies most of one surface of the casing 11.
  • the touch panel 14 has functions of both a display unit and a detection unit.
  • the touch panel 14 as a display unit is a display (display 35 in FIG. 2) provided with an area for displaying a display screen composed of characters, images, and the like.
  • This display is composed of, for example, an LCD (Liquid Crystal Display).
  • the touch panel 14 as a detection unit is a touch sensor (touch sensor 33 in FIG. 2) that detects a contact operation on the operation surface as input position information.
  • the touch sensor includes a plurality of elements for detecting a contact operation arranged on the upper surface of the display, and a transparent operation surface laminated thereon.
  • As the method of sensing contact on the touch panel 14, a pressure-sensitive method that senses a change in pressure, a capacitive method that senses an electrical signal caused by static electricity, or other methods can be applied.
  • a receiver 15 for outputting sound and a microphone 16 for inputting sound are disposed on the housing 11 at positions opposed to each other in the longitudinal direction via the touch panel 14.
  • FIG. 2 is a schematic functional block diagram showing the main functional configuration of the mobile terminal 1 in the present embodiment.
  • The mobile terminal 1 is configured by connecting a main control unit 30, a power supply circuit unit 31, an input control unit 32, a display control unit 34, a voice control unit 36, a communication control unit 37, and a storage unit 39 so that they can communicate with each other via a bus.
  • the main control unit 30 includes a CPU (Central Processing Unit).
  • the main control unit 30 operates based on various programs stored in the storage unit 39 and performs overall control of the mobile terminal 1.
  • the power supply circuit unit 31 includes a power supply source (not shown).
  • the power supply circuit unit 31 switches the power supply ON / OFF state of the mobile terminal 1 based on an operation of turning on the power supply.
  • the power supply circuit unit 31 supplies power from the power supply source to each unit when the power supply is in an ON state, thereby enabling the mobile terminal 1 to operate.
  • the input control unit 32 includes an input interface for the touch sensor 33.
  • The input control unit 32 receives the detection signal from the touch sensor 33 at predetermined intervals (for example, every 10 ms) as input position information indicating the coordinates of the input position, generates a signal indicating the input, and transmits it to the main control unit 30.
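As an illustration only (not part of the patent disclosure; the function names and callback shape are assumptions), the 10 ms polling behavior of the input control unit described above might be sketched as:

```python
import time

SAMPLE_PERIOD_S = 0.010  # the 10 ms sampling interval mentioned in the text


def poll_touch_sensor(read_raw_sample, handle_position, running):
    """Hypothetical sketch of the input control unit: every 10 ms it reads
    a raw detection sample and, when contact was sensed, forwards the
    (x, y) coordinates as input position information."""
    while running():
        sample = read_raw_sample()   # None when no contact is sensed
        if sample is not None:
            handle_position(sample)  # forward coordinates to the main control
        time.sleep(SAMPLE_PERIOD_S)
```

Here `read_raw_sample`, `handle_position`, and `running` stand in for the sensor interface, the signal sent to the main control unit, and the terminal's power state, none of which are specified at this level of detail in the source.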
  • the display control unit 34 includes a display interface for the display 35.
  • the display control unit 34 causes the display 35 to display an image based on the document data and the image signal based on the control of the main control unit 30.
  • the sound control unit 36 generates an analog sound signal from the sound collected by the microphone 16 based on the control of the main control unit 30, and converts the analog sound signal into a digital sound signal. Further, when acquiring the digital audio signal, the audio control unit 36 converts the digital audio signal into an analog audio signal based on the control of the main control unit 30 and outputs the analog audio signal as audio from the receiver 15.
  • the communication control unit 37 restores data by performing a spectrum despreading process on the received signal received from the base station via the antenna 38 based on the control of the main control unit 30.
  • This data is transmitted to the voice control unit 36 and output from the receiver 15 according to an instruction from the main control unit 30, transmitted to the display control unit 34 and displayed on the display 35, or recorded in the storage unit 39.
  • Under the control of the main control unit 30, the communication control unit 37 acquires voice data collected by the microphone 16, data input via the touch panel 14, or data stored in the storage unit 39, performs spectrum spreading on the data, and transmits it to the base station via the antenna 38.
  • The storage unit 39 includes ROM (Read Only Memory), a hard disk, non-volatile memory, and databases storing the processing programs executed by the main control unit 30 and the data required for that processing, as well as RAM (Random Access Memory) used when the main control unit 30 performs processing.
  • the mobile terminal 1 in the present embodiment includes the touch panel 14 as described above.
  • the touch panel 14 receives an instruction for an image or the like displayed on the display 35 of the touch panel 14 via the operation surface.
  • The user performs input on the touch panel 14 with a finger, a stylus pen, or the like.
  • In the following, an example in which input is performed on the touch panel 14 mainly with a finger will be described.
  • However, the present invention can also be applied to input by means other than a finger.
  • FIG. 3 is a diagram conceptually illustrating a case where a movement instruction for an image displayed on the touch panel 14 is input. The example illustrated in FIG. 3 uses a cursor 41, which points at an operation target on the touch panel 14, as an example of an image displayed on the touch panel 14.
  • a cursor 41 used to indicate an operation target is displayed on the touch panel 14.
  • an area assigned as an operation pad 40 that exclusively receives an operation instruction for the cursor 41 is displayed at a predetermined position on the touch panel 14.
  • the operation pad 40 detects input position information at predetermined time intervals (for example, 10 ms) based on the input operation of the user who moves the finger F in contact with the operation pad 40.
  • the main control unit 30 performs display processing for moving the cursor 41 based on the detected input position information.
  • The cursor 41 is moved in the direction of the illustrated arrow B by a direction and movement amount corresponding to the input operation.
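The relative-movement behavior described above can be sketched as follows. This is an illustrative sketch, not the patent's implementation; the gain factor and all names are assumptions:

```python
def cursor_path(samples, start=(0, 0), gain=1.0):
    """Turn successive input positions sampled from the operation pad into
    cursor positions by applying each sampled movement vector (direction
    and amount of the finger movement), scaled by a gain, to the cursor."""
    x, y = start
    path = [start]
    for (x0, y0), (x1, y1) in zip(samples, samples[1:]):
        x += gain * (x1 - x0)  # horizontal component of the finger movement
        y += gain * (y1 - y0)  # vertical component of the finger movement
        path.append((x, y))
    return path
```

A gain above 1.0 would let a small finger movement on the operation pad 40 move the cursor 41 a larger distance on the display; the source does not specify whether such scaling is used.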
  • FIG. 4 is a diagram conceptually illustrating an operation for inputting to the operation pad 40 and a display example of the cursor 41 based on the input.
  • FIG. 5 is a diagram for conceptually explaining another operation for inputting to the operation pad 40 and a display example of the cursor 41 based on the input.
  • FIG. 4A and FIG. 5A show the loci of input operations performed on the operation pad 40 displayed on the touch panel 14.
  • FIGS. 4B and 5B show display examples of the cursor 41 displayed on the touch panel 14 based on the input operation.
  • In FIGS. 4 and 5, the operation of the finger F performed on the operation pad 40 (FIGS. 4A and 5A) and the corresponding display processing of the cursor 41 (FIGS. 4B and 5B) are shown separately, but in practice the operation of the finger F and the display processing of the cursor 41 take place simultaneously on the touch panel 14, as shown in FIG. 3.
  • FIG. 4A shows a case where the finger F is operated so as to draw a waveform.
  • In this case, as shown in FIG. 4B, the mobile terminal 1 moves the cursor 41 while drawing a locus of substantially the same shape as the waveform drawn by the finger F.
  • the trajectory drawn in FIG. 4B is a trajectory having the cursor 41a position as the start point and the cursor 41b position as the end point, and shows a state in which the cursor 41 continuously moves on the display.
  • In the case of FIG. 5A, by contrast, part of the finger F's operation is not detected, and the mobile terminal 1 moves the cursor 41 along a locus having the cursor 41a position shown in FIG. 5B as the start point and the cursor 41b position as the end point. That is, the mobile terminal 1 moves the cursor 41 based on a locus detected from the operation pad 40 that the user did not intend.
  • As a result, the user experiences the poor operability of the operation pad 40: the operation performed with the finger F is not reflected as intended.
  • To address this, the portable terminal 1 performs complement processing on the input position information when the contact pressure is insufficient and the touch panel 14 (touch sensor 33) cannot detect the input operation accompanying the contact.
  • As a result, display processing that follows the operation intended by the user can be performed.
  • FIG. 6 is a flowchart for explaining the input position information complementing process executed by the main control unit 30 of the mobile terminal 1 in the present embodiment.
  • In step S1, the main control unit 30 determines whether a touch event has occurred after a release event has occurred.
  • A release event is an event that occurs when the input position information, which the touch panel 14 as the detection unit detects at the predetermined intervals, can no longer be acquired (when detection ends).
  • A touch event is an event that occurs when the touch panel 14 next detects input position information after the occurrence of a release event (when detection starts).
  • The case where the touch panel 14 cannot acquire the input position information means, when the touch sensor 33 is pressure-sensitive, a case where the detected contact pressure is less than a predetermined pressure (for example, 100 g/cm² ≈ 9806 Pa), and, when it is capacitive, a case where the detected capacitance is equal to or less than a predetermined capacitance.
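The example threshold can be checked numerically: 100 g/cm² corresponds to about 9806 Pa under standard gravity.

```python
# Express the example threshold 100 g/cm^2 in pascals (N/m^2).
G = 9.80665      # standard gravity, m/s^2
mass_kg = 0.100  # 100 g
area_m2 = 1e-4   # 1 cm^2

# pressure = force / area = (mass * g) / area
pressure_pa = mass_kg * G / area_m2  # ≈ 9806.65 Pa, matching the text
```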
  • If these events have not occurred (NO in step S1), the main control unit 30 waits.
  • When it is determined that a release event followed by a touch event has occurred (YES in step S1), in step S2 the main control unit 30, acting as the time acquisition unit, acquires the time from the occurrence of the release event to the occurrence of the touch event.
  • the mobile terminal 1 is provided with a dedicated timer for measuring the time between events, and the main control unit 30 acquires the time between events from this timer.
  • In step S3, the main control unit 30 determines whether or not the time between events acquired in the time acquisition step S2 is less than a predetermined time (for example, 60 ms).
  • the predetermined time is determined by a value that allows the release event to be regarded as an event that has occurred due to an operation not intended by the user.
  • The time used for the determination in step S3 may have a range. For example, when the time between events is in the range of 20 ms to 60 ms, the release event may be determined to be unintended by the user. When the time between events is 60 ms or more, the gap is long enough that the user can be considered to have generated the release event intentionally. When the time between events is 20 ms or less, the break in the operation is so short that the user cannot perceive it. Alternatively, all cases in which the time between events is less than 60 ms may be subjected to complement processing.
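The simpler variant of this decision, in which every gap shorter than the threshold is bridged, can be sketched as follows (an illustrative sketch; the 60 ms value is only the example given in the text):

```python
def should_complement(gap_ms, threshold_ms=60):
    """Step-S3 decision sketch: a release event followed by a touch event
    within threshold_ms is treated as an unintended break in contact and
    bridged by complement processing; a longer gap is treated as an
    intentional release of the finger."""
    return gap_ms < threshold_ms
```

A stricter variant could also reject gaps of 20 ms or less, on the ground that the user cannot perceive them; the text leaves both options open.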
  • If the main control unit 30 determines that the time between events is not less than the predetermined time (NO in step S3), it ends the process, since complement processing is unnecessary.
  • When it is determined that the time between events is less than the predetermined time (YES in step S3), in step S4 the main control unit 30, acting as the complement processing unit, ignores the release event and performs complement processing to obtain complementary position information.
  • Specifically, the main control unit 30 obtains, as complementary position information, input position information that fills the interval from the occurrence of the release event to the occurrence of the touch event, based on the input position information at the time of the release event and the input position information at the time of the touch event.
  • In this way, the release operation not intended by the user is ignored, an input operation in which contact with the operation surface of the touch panel 14 continues is reconstructed, and the complementary position information is obtained so that the cursor is displayed on the touch panel 14 based on that input operation.
  • In the present embodiment, complementary position information is obtained that linearly connects the input position information at the occurrence of the release event and the input position information at the occurrence of the touch event.
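A minimal sketch of this linear complement, assuming the 10 ms sampling period mentioned earlier (the function name and signature are illustrative, not from the patent):

```python
def complement_positions(release_pos, touch_pos, gap_ms, period_ms=10):
    """Step-S4 sketch: linearly interpolate the positions that went
    undetected between the release event and the next touch event,
    one point per sampling period."""
    (x0, y0), (x1, y1) = release_pos, touch_pos
    n = max(int(gap_ms // period_ms), 1)  # number of sampling intervals missed
    return [
        (x0 + (x1 - x0) * k / n, y0 + (y1 - y0) * k / n)
        for k in range(1, n)  # endpoints were already detected, so skip them
    ]
```

For a 40 ms gap between a release at (0, 0) and a touch at (4, 0), this yields the three intermediate points the sensor would have reported had contact been maintained.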
  • the interpolation method for the discontinuous period of the input operation is not limited to this, and other methods may be used.
  • In step S5, the main control unit 30 as the control unit displays the cursor on the touch panel 14 based on the input position information detected from the touch panel 14 and the complementary position information obtained in the complement processing step S4.
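Putting steps S1 through S5 together, a hypothetical end-to-end sketch follows. All names are assumptions, and `None` stands for a sampling interval in which no contact was detected (i.e. the condition that raises a release event):

```python
def process_samples(samples, threshold_ms=60, period_ms=10):
    """Walk a stream of 10 ms samples, bridge detection gaps shorter than
    threshold_ms with linearly interpolated points (steps S1-S4), and
    return the position list used to draw the cursor (step S5)."""
    out = []
    pending_gap = 0  # ms elapsed since the release event
    last = None      # input position at the release event
    for s in samples:
        if s is None:               # contact not detected this period
            pending_gap += period_ms
            continue
        if last is not None and 0 < pending_gap < threshold_ms:
            # Touch event after a short gap: ignore the release event and
            # insert complementary position information.
            n = pending_gap // period_ms + 1
            x0, y0 = last
            x1, y1 = s
            for k in range(1, n):
                out.append((x0 + (x1 - x0) * k / n,
                            y0 + (y1 - y0) * k / n))
        out.append(s)
        pending_gap = 0
        last = s
    return out
```

With two missed samples (a 20 ms gap) between (1, 0) and (4, 0), the gap is bridged; with a 60 ms gap, the release is treated as intentional and no points are inserted.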
  • FIG. 7 is a diagram conceptually illustrating a display example of an image (cursor 41) when the complementing process is performed on the input operation.
  • FIG. 7A shows a trajectory of the input operation of the finger F performed on the touch panel 14.
  • FIG. 7B shows a display example of the cursor 41 displayed based on the input operation of FIG.
  • A solid line in FIG. 7A indicates the locus along which the user performed an input operation on the touch panel 14 with the finger F, and a broken line indicates the portion of the locus that the user intended but that was not detected.
  • In the mobile terminal 1, when the touch panel 14 cannot detect the input position information corresponding to the operation intended by the user and a release event occurs, the complement processing of FIG. 6 is performed. Specifically, complement processing is performed to obtain complementary position information that continuously connects the undetected portion (the broken-line portion shown in FIG. 7A), based on the input position information at the occurrence of the release event and the input position information at the occurrence of the touch event.
  • The mobile terminal 1 then displays the cursor 41 on the touch panel 14 based on the input position information obtained from the touch panel 14 and the complementary position information. As shown in FIG. 7B, the mobile terminal 1 can thus continuously display a movement of the cursor 41 close to the input operation intended by the user. For this reason, the mobile terminal 1 can perform display processing that reflects the user's intention even when the user's contact pressure on the touch panel 14 is insufficient and the touch panel 14 cannot detect the intended input operation.
  • In the portable terminal 1, even when a release event occurs because the input operation intended by the user could not be detected, whether or not the release event was intentional is determined appropriately, and complement processing for the display processing can be performed. For this reason, the portable terminal 1 can improve operability.
  • The present invention is not limited to this example; it may also be applied to complement processing for an input operation accompanying an operation instruction for an image other than the cursor.
  • the present invention can be applied to a case where the mobile terminal 1 performs a display process in which an image displayed on the display 35 is continuously changed with a continuous input operation on the touch panel 14.
  • the present invention can be applied to a case where display processing is performed in which the image change amount or the display time is changed depending on the contact time with the operation surface of the touch panel 14.
  • Although the portable terminal 1 in this embodiment has been described with a configuration in which the display 35 and the touch sensor 33 are integrated as the touch panel 14, the touch sensor 33 may instead be a pointing device, such as a touch pad, configured separately from the display 35.
  • the mobile terminal according to the present invention can be applied to a mobile phone, a PDA (Personal Digital Assistant), a personal computer, a portable game machine, a portable music player, a portable video player, and other portable terminals.

Abstract

A portable terminal is provided with a display section for displaying a predetermined image; a detection section for receiving an instruction to the image via an operation surface and detecting the instruction as input positional information; a time acquisition section for acquiring the time from when the detection section has completed detection of the input positional information until the detection section next starts detection of the input positional information; a complementary processing section for obtaining, if the time acquired by the time acquisition section is less than a predetermined time, complementary positional information that complements the input positional information from the completion of the detection to the start of the detection, on the basis of the input positional information at the completion of the detection and the input positional information at the start of the detection; and a control unit for controlling display of the image on the basis of the input positional information detected by the detection section and the complementary positional information.

Description

Portable terminal
 The present invention relates to a portable terminal equipped with a pointing device such as a touch pad or a touch panel.
 A mobile terminal such as a mobile phone is equipped with various input devices for receiving operation instructions from a user. Among these input devices, one that can accept an operation instruction through an intuitive action, such as a touch pad or a touch panel, is known (see, for example, JP-A-2008-258805). The touch pad and the touch panel accept an operation instruction based on input position information obtained by sensing a change in capacitance or contact pressure accompanying contact with the operation surface.
 One of the input devices is an operation key that receives an operation instruction when pressed. When the user inputs an operation instruction with the operation key, the user can determine whether or not the input was performed properly from the click feeling obtained when the key is pressed.
 In the case of a touch panel or a touch pad, the user inputs an operation instruction by applying contact pressure to the operation surface, moving or touching it with a finger or the like. If the contact pressure on the operation surface is insufficient, the touch panel or the like cannot sense the contact.
 Furthermore, unlike the operation keys described above, a touch panel or the like provides little pressing sensation, so it is difficult for the user to judge whether an operation instruction has been input correctly.
 For this reason, the processing executed on the mobile terminal in response to the user's input operation may differ from the processing the user expected, or the expected processing may not be executed at all. Such discrepancies between the input operation and the executed processing make the user perceive the terminal as having poor operability.
DISCLOSURE OF THE INVENTION
 The present invention has been made in view of such circumstances, and an object of the present invention is to provide a portable terminal that can suitably perform complement processing on a user's operation instruction and improve operability.
 In order to solve the above-described problem, a mobile terminal according to the present invention includes: a display unit that displays a predetermined image; a detection unit that receives an instruction for the image via an operation surface and detects the instruction as input position information; a time acquisition unit that acquires the time from when the detection unit ends detection of the input position information until it next starts detection of the input position information; a complement processing unit that, when the time is less than a predetermined time, obtains complementary position information complementing the input position information from the end of the detection to the start of the detection, based on the input position information at the end of the detection and the input position information at the start of the detection; and a control unit that controls display of the image based on the input position information detected by the detection unit and the complementary position information.
 The mobile terminal according to the present invention can suitably perform complement processing on the user's operation instruction and improve operability.
FIG. 1 is an external perspective view showing an embodiment of a portable terminal according to the present invention.
FIG. 2 is a schematic functional block diagram showing the main functional configuration of the portable terminal in the present embodiment.
FIG. 3 is a diagram conceptually illustrating a case where a movement instruction for an image displayed on the touch panel is input.
FIG. 4 is a diagram conceptually illustrating an operation of inputting to the operation pad and a display example of the cursor based on that input.
FIG. 5 is a diagram conceptually illustrating another operation of inputting to the operation pad and a display example of the cursor based on that input.
FIG. 6 is a flowchart explaining the input position information complement processing executed by the main control unit of the portable terminal in the present embodiment.
FIG. 7 is a diagram conceptually illustrating a display example of an image (cursor) when the complement processing of FIG. 6 has been performed on the input operation.
 本発明に係る携帯端末の実施形態を添付図面に基づいて説明する。本発明を適用する携帯端末として、カード型に形成され、ユーザがディスプレイを指で触れることで操作指示を入力することができる携帯端末を例に挙げて説明する。 Embodiments of a portable terminal according to the present invention will be described with reference to the accompanying drawings. As a portable terminal to which the present invention is applied, a portable terminal that is formed in a card shape and that allows a user to input an operation instruction by touching the display with a finger will be described as an example.
 図1は、本発明に係る携帯端末の実施形態を示す外観斜視図である。 FIG. 1 is an external perspective view showing an embodiment of a portable terminal according to the present invention.
 携帯端末1は、矩形の板状の筐体11を備える。この筐体11の一方の表面には、タッチパネル14が大部分を占めて構成される。 The mobile terminal 1 includes a rectangular plate-shaped casing 11. A touch panel 14 occupies most of one surface of the casing 11.
 タッチパネル14は、表示部と検出部との双方の機能を備える。 The touch panel 14 has functions of both a display unit and a detection unit.
 表示部としてのタッチパネル14は、文字や画像などからなる表示画面を表示する領域が設けられたディスプレイ(図2のディスプレイ35)である。このディスプレイは、例えばLCD(Liquid Crystal Display)で構成される。 The touch panel 14 as a display unit is a display (display 35 in FIG. 2) provided with an area for displaying a display screen composed of characters, images, and the like. This display is composed of, for example, an LCD (Liquid Crystal Display).
 検出部としてのタッチパネル14は、操作面に対する接触動作を入力位置情報として検出するタッチセンサ(図2のタッチセンサ33)である。タッチセンサは、ディスプレイの上面に複数配置された接触動作を検出するための素子と、さらにその上に積層された透明な操作面で構成される。なお、タッチパネル14上で接触動作を検知する方法は、圧力の変化を感知する感圧式、静電気による電気信号を感知する静電式、その他の方法を適用することができる。 The touch panel 14 as a detection unit is a touch sensor (touch sensor 33 in FIG. 2) that detects a contact operation on the operation surface as input position information. The touch sensor includes a plurality of elements for detecting a contact operation arranged on the upper surface of the display, and a transparent operation surface laminated thereon. As a method for detecting the contact operation on the touch panel 14, a pressure-sensitive method for detecting a change in pressure, an electrostatic method for detecting an electric signal due to static electricity, or other methods can be applied.
 また、筐体11上であって、タッチパネル14を介した長手方向対向位置には、音声を出力するためのレシーバ15と、音声を入力するためのマイクロフォン16とがそれぞれ配置される。 Further, a receiver 15 for outputting sound and a microphone 16 for inputting sound are disposed on the housing 11 at positions opposed to each other in the longitudinal direction via the touch panel 14.
 FIG. 2 is a schematic functional block diagram showing the main functional configuration of the mobile terminal 1 in the present embodiment. The mobile terminal 1 comprises a main control unit 30, a power supply circuit unit 31, an input control unit 32, a display control unit 34, a voice control unit 36, a communication control unit 37, and a storage unit 39, interconnected by a bus so that they can communicate with one another.
 The main control unit 30 includes a CPU (Central Processing Unit). It operates according to the various programs stored in the storage unit 39 and performs overall control of the mobile terminal 1.
 The power supply circuit unit 31 includes a power supply source (not shown). It switches the power of the mobile terminal 1 on and off in response to a power-on operation and, while the power is on, supplies power from the power supply source to each unit so that the mobile terminal 1 can operate.
 The input control unit 32 provides an input interface to the touch sensor 33. At predetermined intervals (for example, every 10 ms), it receives the detection signal from the touch sensor 33 as input position information indicating the coordinates of the input position, generates a signal representing that input, and transmits it to the main control unit 30.
 The display control unit 34 provides a display interface to the display 35. Under control of the main control unit 30, it causes the display 35 to render images based on document data and image signals.
 The voice control unit 36, under control of the main control unit 30, generates an analog audio signal from the sound picked up by the microphone 16 and converts this analog signal into a digital audio signal. Conversely, when it receives a digital audio signal, the voice control unit 36 converts it into an analog audio signal under control of the main control unit 30 and outputs the result from the receiver 15 as sound.
 The communication control unit 37, under control of the main control unit 30, applies spectrum despreading to the signal received from the base station via the antenna 38 to recover the transmitted data. Depending on instructions from the main control unit 30, this data is passed to the voice control unit 36 and output from the receiver 15, passed to the display control unit 34 and shown on the display 35, or recorded in the storage unit 39. Conversely, when the communication control unit 37 obtains voice data collected by the microphone 16, data entered via the touch panel 14, or data stored in the storage unit 39, it applies spectrum spreading to that data and transmits it to the base station via the antenna 38.
 The storage unit 39 comprises a ROM (Read-Only Memory), hard disk, non-volatile memory, or database that stores the programs executed by the main control unit 30 and the data those programs require, together with a RAM (Random Access Memory) that temporarily holds data used while the main control unit 30 performs processing.
 As described above, the mobile terminal 1 of the present embodiment includes the touch panel 14. The touch panel 14 accepts, via its operation surface, instructions directed at images and other content shown on its display 35. The user provides input on the touch panel 14 with a finger, a stylus pen, or the like. The present embodiment is described mainly for finger input, but input means other than a finger can be used in the same way.
 FIG. 3 conceptually illustrates the input of a movement instruction for an image displayed on the touch panel 14. In the example of FIG. 3, the displayed image is a cursor 41, which is used to point at the target of an operation on the touch panel 14.
 The touch panel 14 displays the cursor 41, used to indicate the target of an operation. A region of the touch panel 14 is also displayed at a predetermined position as an operation pad 40, an area dedicated to receiving operation instructions for the cursor 41. Based on the user's input action of touching and moving the finger F, the operation pad 40 detects input position information at predetermined intervals (for example, every 10 ms). The main control unit 30 then performs display processing that moves the cursor 41 according to the detected input position information.
 For example, when the finger F is moved on the operation pad 40 in the direction of arrow A, the cursor 41 moves in the direction of arrow B according to the direction and distance of that input movement.
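The pad-to-cursor mapping described above can be sketched as follows. This is an illustrative reconstruction, not the patent's implementation; the gain constant and the names (`PAD_GAIN`, `Cursor`) are assumptions introduced only for the example.

```python
# Hypothetical sketch of mapping operation-pad samples to cursor movement.
PAD_GAIN = 2.0  # assumed scale factor between pad movement and cursor movement

class Cursor:
    def __init__(self, x, y):
        self.x, self.y = x, y

    def apply_pad_delta(self, prev, curr):
        """Move the cursor by the scaled delta between two pad samples (x, y)."""
        dx, dy = curr[0] - prev[0], curr[1] - prev[1]
        self.x += PAD_GAIN * dx
        self.y += PAD_GAIN * dy
```

With each 10 ms pad sample, the delta from the previous sample is scaled and applied, so the cursor trajectory in FIG. 4(B) mirrors the finger trajectory in FIG. 4(A).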
 FIG. 4 conceptually illustrates an input action on the operation pad 40 and the resulting display of the cursor 41, and FIG. 5 conceptually illustrates another such input action and the resulting display. FIGS. 4(A) and 5(A) show the trajectory of an input action performed on the operation pad 40 displayed on the touch panel 14, while FIGS. 4(B) and 5(B) show the corresponding display of the cursor 41 on the touch panel 14. Although FIGS. 4 and 5 depict the movement of the finger F on the operation pad 40 (FIGS. 4(A), 5(A)) separately from the corresponding display processing of the cursor 41 (FIGS. 4(B), 5(B)), in practice the finger movement and the cursor display occur simultaneously on the touch panel 14, as shown in FIG. 3.
 To enter an operation instruction that moves the cursor 41 (FIG. 4(B)) displayed on the touch panel 14, the user presses the finger F against the operation pad 40 and then moves it by the desired amount. FIG. 4(A) shows a case in which the finger F traces a wave-shaped path.
 When the user applies sufficient contact pressure from the finger F to the operation pad 40, so that the operation pad 40 detects the contact continuously, the mobile terminal 1 displays the cursor 41 moving along a trajectory of substantially the same shape as the wave traced by the finger F, as shown in FIG. 4(B). The trajectory drawn in FIG. 4(B) starts at the position of cursor 41a and ends at the position of cursor 41b, illustrating the cursor 41 moving continuously across the display.
 On the other hand, even when the user intends to trace the same wave as in FIG. 4(A) while moving the finger F over the operation pad 40, the contact pressure from the finger F may at times be insufficient — for example, when it falls below the minimum detectable pressure. In such a case the operation pad 40 cannot detect the contact of the finger F continuously and instead detects an intermittent trajectory, as shown in FIG. 5(A).
 The mobile terminal 1 then moves the cursor 41 along the trajectory actually detected — from the position of cursor 41a to the position of cursor 41b shown in FIG. 5(B). That is, the mobile terminal 1 performs display processing that moves the cursor 41 according to a trajectory, detected via the operation pad 40, that the user did not intend.
 As a result, the user experiences poor operability: the operation intended with the finger F on the operation pad 40 is not the operation actually carried out.
 In contrast, the mobile terminal 1 of the present embodiment performs a complementing process on the input position information, so that even when contact pressure is insufficient and the touch panel 14 (touch sensor 33) fails to detect part of a contact input, display processing still follows the action the user intended.
 FIG. 6 is a flowchart explaining the input position information complementing process executed by the main control unit 30 of the mobile terminal 1 in the present embodiment.
 In step S1, the main control unit 30 determines whether a touch event has occurred after a release event. A release event occurs when the input position information that the touch panel 14 (the detection unit) samples at predetermined intervals can no longer be acquired (detection ends). A touch event occurs when, after a release event, the touch panel 14 next detects input position information (detection resumes).
 The touch panel 14 fails to acquire input position information when, for a pressure-sensitive touch sensor 33, the detected contact pressure falls to or below a predetermined pressure (for example, 100 g/cm² ≈ 9806 Pa), or, for a capacitive sensor, when the detected capacitance falls to or below a predetermined capacitance.
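As a rough illustration of the detection threshold just described, a pressure-sensitive reading could be classified as follows. The function name is hypothetical; the 9806 Pa figure is the example value from the text.

```python
# Illustrative contact check for a pressure-sensitive sensor.
# Readings at or below the threshold yield no input position information,
# which is what triggers a release event in the embodiment.
PRESSURE_THRESHOLD_PA = 9806  # example value from the text (≈ 100 g/cm²)

def contact_detected(pressure_pa):
    """True when the reading exceeds the minimum detectable pressure."""
    return pressure_pa > PRESSURE_THRESHOLD_PA
```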
 If the main control unit 30 determines that no release event followed by a touch event has occurred (NO in step S1), it waits until they occur.
 If it is determined that a release event followed by a touch event has occurred (YES in step S1), then in step S2 the main control unit 30, acting as a time acquisition unit, acquires the time from the occurrence of the release event to the occurrence of the touch event. The mobile terminal 1 has a dedicated timer for measuring this interval, and the main control unit 30 reads the inter-event time from it.
 In step S3, the main control unit 30 determines whether the inter-event time acquired in step S2 is less than a predetermined time (for example, 60 ms). This predetermined time is chosen so that a release event shorter than it can be regarded as arising from an action the user did not intend.
 The time used for the determination in step S3 may be given as a range. For example, an inter-event time within the range of 20 ms to 60 ms may be judged to indicate an unintended release event. When the inter-event time is 60 ms or more, the interval is long enough that the user can be regarded as having released intentionally. When it is 20 ms or less, the interruption in the action is so slight that the user cannot perceive it, so it is excluded from the complementing process. Alternatively, every inter-event time under 60 ms may be made subject to the complementing process.
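Under the range variant just described, the step-S3 decision reduces to a simple window test. The sketch below uses the example values from the text (20 ms and 60 ms); the function name is an assumption for illustration.

```python
# Sketch of the step-S3 decision under the range variant described above.
MIN_GAP_MS = 20   # gaps at or below this are imperceptible; no complementing
MAX_GAP_MS = 60   # gaps at or above this are treated as intentional releases

def should_complement(gap_ms):
    """Return True when a release-to-touch gap should be interpolated."""
    return MIN_GAP_MS < gap_ms < MAX_GAP_MS
```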
 If the main control unit 30 determines that the inter-event time is not less than the predetermined time (NO in step S3), no complementing is necessary and the process ends.
 If the inter-event time is determined to be less than the predetermined time (YES in step S3), then in step S4 the main control unit 30, acting as a complement processing unit, ignores the release event and performs a complementing process to obtain complementary position information.
 As the complementary position information, the main control unit 30 fills in the input position information from the occurrence of the release event to the occurrence of the touch event, based on the input position information at the time of the release event and the input position information at the time of the touch event. In other words, the release action the user did not intend is ignored, and the complementary position information is computed as if an input action with uninterrupted contact on the operation surface of the touch panel 14 had been obtained, so that the cursor is displayed on the touch panel 14 according to that input action.
 Specifically, to treat the input action on the operation surface as having been detected continuously, complementary position information is computed that connects the input position at the release event and the input position at the touch event along a straight line. The method of interpolating over the discontinuity in the input action is not limited to this; other methods may be used.
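A minimal sketch of the linear complementing computation in step S4, assuming positions are (x, y) coordinate pairs and that one complementary sample is generated per 10 ms sampling interval (the example interval from the embodiment); the function name is illustrative.

```python
# Sketch: generate intermediate positions along the straight line from the
# release-event position to the touch-event position, one per sample period.
SAMPLE_MS = 10  # sampling interval from the embodiment

def complement_positions(release_pos, touch_pos, gap_ms):
    """Return interpolated (x, y) samples for the undetected interval."""
    steps = gap_ms // SAMPLE_MS
    points = []
    for i in range(1, steps):
        t = i / steps  # fraction of the way from release to touch
        x = release_pos[0] + t * (touch_pos[0] - release_pos[0])
        y = release_pos[1] + t * (touch_pos[1] - release_pos[1])
        points.append((x, y))
    return points
```

For a 40 ms gap between a release at (0, 0) and a touch at (40, 0), this yields three intermediate samples, one for each missed 10 ms sampling tick.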
 In step S5, the main control unit 30, acting as the control unit, displays the cursor on the touch panel 14 based on the input position information detected by the touch panel 14 and the complementary position information obtained in step S4.
 FIG. 7 conceptually illustrates the display of an image (cursor 41) when the complementing process is applied to an input action. FIG. 7(A) shows the trajectory of an input action of the finger F on the touch panel 14, and FIG. 7(B) shows the cursor 41 displayed based on the input action of FIG. 7(A). In FIG. 7(A), the solid line is the trajectory of the input the user performed with the finger F on the touch panel 14, and the broken line is the portion that went undetected even though the user intended to input it.
 As shown in FIG. 7(A), when the touch panel 14 fails to detect input position information corresponding to the action the user intended and a release event occurs, the mobile terminal 1 performs the complementing process of FIG. 6. Specifically, it computes complementary position information that continuously bridges the undetected portion — the dotted segment in FIG. 7(A) — based on the input position information at the release event and at the touch event.
 The mobile terminal 1 displays the cursor 41 on the touch panel 14 based on the input position information obtained from the touch panel 14 and the complementary position information. As shown in FIG. 7(B), the cursor 41 can thus be displayed moving continuously, close to the input action the user intended. Consequently, even when the user's contact pressure on the touch panel 14 is insufficient and the touch panel 14 cannot detect the intended input action, the mobile terminal 1 can perform display processing that respects the user's intention.
 This concludes the description of the complementing process.
 With this mobile terminal 1, even when a release event occurs because the user's intended input action could not be detected, the terminal can appropriately judge whether that release event was intentional and perform the complementing process required for appropriate display. The mobile terminal 1 can therefore realize the display processing the user expects, without a discrepancy between the operation instruction the user intended and the display processing actually performed.
 In the present embodiment, the complementing process has been described as applied to input actions carrying operation instructions for moving the cursor 41 displayed on the display 35 of the touch panel 14. The process is not limited to this; it may also be applied to input actions carrying operation instructions for images other than the cursor. Specifically, it can be applied whenever the mobile terminal 1 performs display processing that continuously changes an image shown on the display 35 in response to a continuous input action on the touch panel 14 — for example, display processing in which the amount of image change or the display time varies with the duration of contact with the operation surface of the touch panel 14.
 Although the mobile terminal 1 of the present embodiment has been described as having the touch panel 14 in which the display 35 and the touch sensor 33 are integrated, the touch sensor 33 may instead be a pointing device, such as a touch pad, configured separately from the display 35.
 Furthermore, the portable terminal according to the present invention can also be applied to mobile phones, PDAs (Personal Digital Assistants), personal computers, portable game machines, portable music players, portable video players, and other portable terminals.

Claims (4)

  1. A portable terminal comprising:
     a display unit that displays a predetermined image;
     a detection unit that receives an instruction for the image via an operation surface and detects the instruction as input position information;
     a time acquisition unit that acquires the time from when the detection unit ends detection of the input position information until it next starts detection of the input position information;
     a complement processing unit that, when the time is less than a predetermined time, obtains complementary position information complementing the input position information from the end of the detection to the start of the detection, based on the input position information at the end of the detection and the input position information at the start of the detection; and
     a control unit that performs display control of the image based on the input position information detected by the detection unit and the complementary position information.
  2. The portable terminal according to claim 1, wherein
     the predetermined image is a cursor that moves over at least a part of the screen displayed on the display unit,
     the instruction is an instruction to move the cursor on the screen, and
     the control unit performs display control that moves the cursor based on the complementary position information.
  3. The portable terminal according to claim 2, wherein the complementary position information is information that connects the input position information at the end of the detection and the input position information at the start of the detection substantially linearly.
  4. The portable terminal according to claim 1, wherein the display unit and the detection unit are an integrally formed touch panel.
PCT/JP2010/050506 2009-04-17 2010-01-18 Portable terminal WO2010119713A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2009101288A JP2012137800A (en) 2009-04-17 2009-04-17 Portable terminal
JP2009-101288 2009-04-17

Publications (1)

Publication Number Publication Date
WO2010119713A1 true WO2010119713A1 (en) 2010-10-21

Family

ID=42982384

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2010/050506 WO2010119713A1 (en) 2009-04-17 2010-01-18 Portable terminal

Country Status (2)

Country Link
JP (1) JP2012137800A (en)
WO (1) WO2010119713A1 (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2012113645A (en) * 2010-11-26 2012-06-14 Kyocera Corp Electronic apparatus
JP2013088891A (en) * 2011-10-14 2013-05-13 Konica Minolta Business Technologies Inc Information terminal, drawing control program, and drawing control method
WO2014141763A1 (en) * 2013-03-15 2014-09-18 シャープ株式会社 Touch panel system
JP2014228890A (en) * 2013-05-17 2014-12-08 シャープ株式会社 Touch panel system

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000357046A (en) * 1999-06-15 2000-12-26 Mitsubishi Electric Corp Handwriting input device and computer readable recording medium recording handwriting input program
JP2007334420A (en) * 2006-06-12 2007-12-27 Dainippon Printing Co Ltd Processor and program



Also Published As

Publication number Publication date
JP2012137800A (en) 2012-07-19


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 10764299

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 10764299

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP