WO2010119713A1 - Portable terminal - Google Patents

Portable terminal

Info

Publication number
WO2010119713A1
WO2010119713A1 (PCT/JP2010/050506)
Authority
WO
WIPO (PCT)
Prior art keywords
detection
position information
input
display
touch panel
Prior art date
Application number
PCT/JP2010/050506
Other languages
English (en)
Japanese (ja)
Inventor
英樹 根本
託也 千葉
Original Assignee
株式会社東芝
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社東芝
Publication of WO2010119713A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures

Definitions

  • The present invention relates to a portable terminal equipped with a pointing device such as a touch pad or a touch panel.
  • A mobile terminal such as a mobile phone is equipped with various input devices for receiving operation instructions from a user.
  • Among these input devices, devices that accept operation instructions through intuitive operations, such as touch pads and touch panels, are known (see, for example, JP-A-2008-258805).
  • A touch pad or touch panel accepts an operation instruction based on input position information obtained by sensing the change in capacitance or contact pressure that accompanies contact with the operation surface.
  • Another of these input devices is an operation key that receives an operation instruction when pressed.
  • When the user inputs an operation instruction with an operation key, the user can tell whether the input was registered from the click feeling obtained when the key is pressed.
  • On a touch pad or touch panel, by contrast, the user inputs an operation instruction by touching the operation surface or moving a finger or the like across it. When the contact pressure on the operation surface is insufficient, the touch panel or the like cannot sense the contact.
  • Because a touch panel or the like gives little pressing sensation, it is difficult for the user to determine whether an operation instruction has been correctly input.
  • As a result, the process executed on the mobile terminal in response to the user's input operation may differ from the process the user expected, or the expected process may not be executed at all.
  • Such discrepancies between the input operation and the executed process make the user perceive the terminal as having poor operability.
  • The present invention has been made in view of such circumstances, and an object of the present invention is to provide a portable terminal that can suitably complement a user's operation instruction and thereby improve operability.
  • To this end, a mobile terminal according to the present invention includes: a display unit that displays a predetermined image; a detection unit that receives an instruction for the image via an operation surface and detects the instruction as input position information; a time acquisition unit that acquires the time from when the detection unit ends detection of the input position information until it next starts detection; a complement processing unit that, when that time is less than a predetermined time, obtains complementary position information that complements the input position information from the end of detection to the start of detection, based on the input position information at the end of detection and at the start of detection; and a control unit that controls display of the image based on the detected input position information and the complementary position information.
  • The mobile terminal according to the present invention can suitably complement the user's operation instruction and thereby improve operability.
  • As a portable terminal to which the present invention is applied, a card-shaped portable terminal on which the user inputs operation instructions by touching the display with a finger will be described as an example.
  • FIG. 1 is an external perspective view showing an embodiment of a portable terminal according to the present invention.
  • The mobile terminal 1 includes a rectangular plate-shaped casing 11.
  • A touch panel 14 occupies most of one surface of the casing 11.
  • The touch panel 14 functions as both a display unit and a detection unit.
  • As a display unit, the touch panel 14 is a display (display 35 in FIG. 2) provided with an area for displaying a display screen composed of characters, images, and the like.
  • This display is composed of, for example, an LCD (Liquid Crystal Display).
  • As a detection unit, the touch panel 14 is a touch sensor (touch sensor 33 in FIG. 2) that detects a contact operation on the operation surface as input position information.
  • The touch sensor includes a plurality of elements for detecting a contact operation arranged on the upper surface of the display, and a transparent operation surface laminated on top of them.
  • As the sensing method, a pressure-sensitive method that detects a change in pressure, an electrostatic (capacitive) method that detects an electrical signal caused by static electricity, or another method can be applied.
  • A receiver 15 that outputs sound and a microphone 16 that inputs sound are disposed on the casing 11 at positions facing each other across the touch panel 14 in the longitudinal direction.
  • FIG. 2 is a schematic functional block diagram showing the main functional configuration of the mobile terminal 1 in the present embodiment.
  • The mobile terminal 1 is configured by connecting a main control unit 30, a power supply circuit unit 31, an input control unit 32, a display control unit 34, a voice control unit 36, a communication control unit 37, and a storage unit 39 so that they can communicate with one another via a bus.
  • The main control unit 30 includes a CPU (Central Processing Unit).
  • The main control unit 30 operates based on various programs stored in the storage unit 39 and performs overall control of the mobile terminal 1.
  • The power supply circuit unit 31 includes a power supply source (not shown).
  • The power supply circuit unit 31 switches the power of the mobile terminal 1 ON or OFF based on a power-on operation.
  • When the power is ON, the power supply circuit unit 31 supplies power from the power supply source to each unit, enabling the mobile terminal 1 to operate.
  • The input control unit 32 includes an input interface for the touch sensor 33.
  • The input control unit 32 receives the detection signal from the touch sensor 33 at predetermined intervals (for example, every 10 ms) as input position information indicating the coordinates of the input position, generates a signal indicating the input, and transmits it to the main control unit 30.
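  • As a rough illustration of this sampling scheme (a sketch, not code from the patent), the following Python snippet polls a touch sensor every 10 ms and forwards coordinate events; read_sensor and send_to_main_control are hypothetical stand-ins for the touch sensor 33 driver and the interface to the main control unit 30.

```python
import time

SAMPLE_PERIOD_S = 0.010  # 10 ms detection interval, per the description

def poll_touch_sensor(read_sensor, send_to_main_control):
    """Poll the touch sensor every 10 ms and forward input position info.

    read_sensor() is assumed to return (x, y) while the operation surface
    is touched, or None when no contact is sensed.
    """
    while True:
        position = read_sensor()
        if position is not None:
            # Report the coordinates of the input position to the
            # main control unit as input position information.
            send_to_main_control({"type": "input", "pos": position})
        time.sleep(SAMPLE_PERIOD_S)
```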
  • The display control unit 34 includes a display interface for the display 35.
  • The display control unit 34 causes the display 35 to display images based on document data and image signals under the control of the main control unit 30.
  • The voice control unit 36, under the control of the main control unit 30, generates an analog audio signal from the sound collected by the microphone 16 and converts it into a digital audio signal. Conversely, when it acquires a digital audio signal, the voice control unit 36 converts it into an analog audio signal under the control of the main control unit 30 and outputs it as sound from the receiver 15.
  • The communication control unit 37, under the control of the main control unit 30, restores data by performing spectrum despreading processing on a received signal received from a base station via an antenna 38.
  • This data is transmitted to the voice control unit 36 and output from the receiver 15, transmitted to the display control unit 34 and displayed on the display 35, or recorded in the storage unit 39, according to instructions from the main control unit 30.
  • The communication control unit 37 also acquires voice data collected by the microphone 16, data input via the touch panel 14, or data stored in the storage unit 39 under the control of the main control unit 30, performs spectrum spreading processing on the data, and transmits it to the base station via the antenna 38.
  • The storage unit 39 includes a ROM (Read Only Memory) that stores the processing programs for the main control unit 30 and the data needed for that processing, a hard disk, non-volatile memory, a database, and a RAM (Random Access Memory) used as working memory when the main control unit 30 performs processing.
  • The mobile terminal 1 in the present embodiment includes the touch panel 14 as described above.
  • The touch panel 14 receives instructions for images and the like displayed on the display 35 of the touch panel 14 via the operation surface.
  • The user performs input on the touch panel 14 with a finger or a stylus pen.
  • In the following, an example in which input is performed on the touch panel 14 mainly with a finger will be described.
  • However, the present invention can also be applied to input by means other than a finger.
  • FIG. 3 is a diagram conceptually illustrating the input of a movement instruction for an image displayed on the touch panel 14. The example illustrated in FIG. 3 uses, as the displayed image, a cursor 41 for pointing at an operation target on the touch panel 14.
  • The cursor 41 used to indicate an operation target is displayed on the touch panel 14.
  • In addition, an area assigned as an operation pad 40, which exclusively receives operation instructions for the cursor 41, is displayed at a predetermined position on the touch panel 14.
  • The operation pad 40 detects input position information at predetermined intervals (for example, 10 ms) based on the input operation of the user moving the finger F while it is in contact with the operation pad 40.
  • The main control unit 30 performs display processing that moves the cursor 41 based on the detected input position information.
  • The cursor 41 is thus moved in the direction of the illustrated arrow B, with the direction and amount of movement corresponding to the input operation.
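  • One plausible reading of this operation-pad behavior (a sketch under assumptions, not the patent's specified implementation) maps the deltas between successive pad samples to relative cursor motion, as on a notebook touch pad; the gain factor is purely illustrative.

```python
def move_cursor(samples, cursor_pos, gain=2.0):
    """Move the cursor by the deltas between successive pad samples.

    samples:    list of (x, y) positions detected on the operation pad 40
                at 10 ms intervals.
    cursor_pos: current (x, y) of the cursor 41 on the display.
    gain:       assumed scaling between pad motion and cursor motion.
    Returns the updated cursor position.
    """
    x, y = cursor_pos
    for (x0, y0), (x1, y1) in zip(samples, samples[1:]):
        x += gain * (x1 - x0)
        y += gain * (y1 - y0)
    return (x, y)
```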
  • FIG. 4 is a diagram conceptually illustrating an operation for inputting to the operation pad 40 and a display example of the cursor 41 based on the input.
  • FIG. 5 is a diagram for conceptually explaining another operation for inputting to the operation pad 40 and a display example of the cursor 41 based on the input.
  • FIG. 4A and FIG. 5A show the loci of input operations performed on the operation pad 40 displayed on the touch panel 14.
  • FIGS. 4B and 5B show display examples of the cursor 41 displayed on the touch panel 14 based on those input operations.
  • In FIGS. 4 and 5, the operation of the finger F performed on the operation pad 40 (FIGS. 4A and 5A) and the corresponding display processing of the cursor 41 (FIGS. 4B and 5B) are shown separately, but in practice the operation of the finger F and the display processing of the cursor 41 take place simultaneously on the touch panel 14, as in FIG. 3.
  • FIG. 4A shows a case where the finger F is operated so as to draw a waveform.
  • In this case, as shown in FIG. 4B, the mobile terminal 1 performs a display in which the cursor 41 moves while drawing a locus of substantially the same shape as the waveform drawn by the finger F.
  • The locus drawn in FIG. 4B starts at the cursor 41a position and ends at the cursor 41b position, and shows the cursor 41 moving continuously on the display.
  • When part of the finger F's operation is not detected, by contrast, the mobile terminal 1 displays the cursor 41 moving along a locus that starts at the cursor 41a position and ends at the cursor 41b position shown in FIG. 5B. That is, the mobile terminal 1 performs display processing that moves the cursor 41 along a locus detected from the operation pad 40 that the user did not intend.
  • In this case, the user experiences poor operability: the operation performed with the finger F on the operation pad 40 is not reflected as intended.
  • To address this, the portable terminal 1 in the present embodiment performs input position information complementing processing even when the contact pressure is insufficient and the touch panel 14 (touch sensor 33) cannot detect the input operation accompanying the contact.
  • As a result, display processing that follows the operation intended by the user can be performed.
  • FIG. 6 is a flowchart for explaining the input position information complementing process executed by the main control unit 30 of the mobile terminal 1 in the present embodiment.
  • In step S1, the main control unit 30 determines whether a touch event has occurred following a release event.
  • The release event is an event that occurs when the input position information, which the touch panel 14 as the detection unit detects at predetermined intervals, can no longer be acquired (when detection ends).
  • The touch event is an event that occurs when the touch panel 14 next detects input position information after a release event (when detection starts).
  • The case where the touch panel 14 cannot acquire input position information means, when the touch sensor 33 is pressure-sensitive, the case where the detected contact pressure falls below a predetermined pressure (for example, 100 g/cm² ≈ 9806 Pa), and, when the touch sensor 33 is electrostatic, the case where the detected capacitance falls to or below a predetermined capacitance.
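  • For a pressure-sensitive sensor, this event logic could be sketched as follows (hypothetical code, not from the patent; the 9806 Pa figure is simply the example threshold of 100 g/cm² converted at 1 g/cm² ≈ 98.07 Pa).

```python
PRESSURE_THRESHOLD_PA = 9806  # example threshold: about 100 g/cm2

class EventDetector:
    """Emit 'touch'/'release' events from periodic pressure samples."""

    def __init__(self):
        self.touching = False

    def feed(self, pressure_pa, t_ms):
        """Return ('touch', t_ms), ('release', t_ms), or None."""
        sensed = pressure_pa >= PRESSURE_THRESHOLD_PA
        if sensed and not self.touching:
            self.touching = True
            return ("touch", t_ms)    # detection starts
        if not sensed and self.touching:
            self.touching = False
            return ("release", t_ms)  # detection ends
        return None
```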
  • While no such events have occurred (NO in step S1), the main control unit 30 waits at step S1.
  • When it is determined that a release event followed by a touch event has occurred (YES in step S1), in step S2 the main control unit 30, acting as the time acquisition unit, acquires the time from the occurrence of the release event to the occurrence of the touch event.
  • The mobile terminal 1 is provided with a dedicated timer for measuring the time between events, and the main control unit 30 acquires the inter-event time from this timer.
  • In step S3, the main control unit 30 determines whether the inter-event time acquired in step S2 is less than a predetermined time (for example, 60 ms).
  • The predetermined time is set to a value at which the release event can be regarded as having been caused by an operation the user did not intend.
  • The time used for the determination in step S3 may also be given a range. For example, a release event may be judged unintended when the inter-event time is within a range of 20 ms to 60 ms: at 60 ms or more, the interruption is long enough that the user can be considered to have released intentionally, while at 20 ms or less the interruption is so brief that the user cannot even perceive it. Alternatively, every inter-event time of less than 60 ms may simply be subjected to the complementing process.
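  • As a sketch of this decision rule (the 20 ms and 60 ms figures are the examples given above, not fixed values):

```python
def release_was_unintended(gap_ms, upper_ms=60):
    """Decide whether a release-to-touch gap should be complemented.

    Gaps of upper_ms (60 ms in the example) or more are treated as
    intentional releases. The text also allows a bounded variant that
    flags only gaps in the 20-60 ms range as unintended; this sketch
    uses the simpler rule of complementing every gap under upper_ms.
    """
    return gap_ms < upper_ms
```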
  • When the main control unit 30 determines that the inter-event time is not less than the predetermined time (NO in step S3), it ends the process, since no complementing is necessary.
  • When it is determined that the inter-event time is less than the predetermined time (YES in step S3), in step S4 the main control unit 30, acting as the complement processing unit, ignores the release event and performs a complementing process to obtain complementary position information.
  • Specifically, the main control unit 30 obtains, as complementary position information, input position information covering the interval from the occurrence of the release event to the occurrence of the touch event, based on the input position information at the release event and the input position information at the touch event.
  • In this way, the release operation the user did not intend is ignored, the input is treated as an operation in which contact with the operation surface of the touch panel 14 continued, and the complementary position information is obtained so that the cursor can be displayed on the touch panel 14 accordingly.
  • In the present embodiment, the complementary position information linearly connects the input position information at the release event and the input position information at the touch event.
  • However, the interpolation method for the discontinuous period of the input operation is not limited to this, and other methods may be used.
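  • A minimal sketch of the linear variant just described, generating complementary positions at the 10 ms detection interval (the sample period and point count are assumptions); for instance, a 40 ms gap from (10, 10) to (20, 20) yields the three intermediate points (12.5, 12.5), (15.0, 15.0), and (17.5, 17.5).

```python
def complement_positions(release_pos, touch_pos, gap_ms, period_ms=10):
    """Linearly interpolate positions across an ignored release gap.

    release_pos: (x, y) input position when the release event occurred.
    touch_pos:   (x, y) input position when the touch event occurred.
    gap_ms:      time from the release event to the touch event.
    Returns one (x, y) point per detection period strictly inside the gap.
    """
    (x0, y0), (x1, y1) = release_pos, touch_pos
    steps = max(int(gap_ms // period_ms), 1)
    return [
        (x0 + (x1 - x0) * i / steps, y0 + (y1 - y0) * i / steps)
        for i in range(1, steps)
    ]
```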
  • In step S5, the main control unit 30, acting as the control unit, displays the cursor on the touch panel 14 based on the input position information detected by the touch panel 14 and the complementary position information obtained in step S4.
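  • Putting steps S1 through S5 together, the flow of FIG. 6 might look like the following sketch, which builds on the hypothetical helpers above; draw_cursor is an assumed display callback standing in for the display processing of step S5.

```python
def handle_events(events, draw_cursor):
    """Process a stream of (kind, t_ms, pos) events from the touch panel.

    kind is 'touch', 'release', or 'input'. A release followed by a touch
    within the threshold (steps S1-S3) has its gap filled with complementary
    positions (step S4) before normal drawing resumes (step S5).
    """
    last_release = None  # (t_ms, pos) of the most recent release event
    for kind, t_ms, pos in events:
        if kind == "release":
            last_release = (t_ms, pos)
        elif kind == "touch":
            if last_release is not None:
                rel_t, rel_pos = last_release
                if release_was_unintended(t_ms - rel_t):          # S3
                    for p in complement_positions(rel_pos, pos,
                                                  t_ms - rel_t):  # S4
                        draw_cursor(p)                            # S5
                last_release = None
            draw_cursor(pos)
        else:  # 'input': contact continuing, draw as usual
            draw_cursor(pos)
```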
  • FIG. 7 is a diagram conceptually illustrating a display example of an image (cursor 41) when the complementing process is performed on the input operation.
  • FIG. 7A shows a trajectory of the input operation of the finger F performed on the touch panel 14.
  • FIG. 7B shows a display example of the cursor 41 displayed based on the input operation of FIG.
  • The solid line in FIG. 7A indicates the locus along which the user performed the input operation on the touch panel 14 with the finger F, and the broken line indicates the portion of the locus that was not detected even though the user intended the input operation there.
  • When the touch panel 14 cannot detect the input position information corresponding to the operation the user intended and a release event occurs, the mobile terminal 1 performs the complementing process of FIG. 6. Specifically, complementary position information is obtained that continuously connects the undetected portion, shown as the broken line in FIG. 7A, based on the input position information when the release event occurred and the input position information when the touch event occurred.
  • The mobile terminal 1 then displays the cursor 41 on the touch panel 14 based on the input position information obtained from the touch panel 14 and the complementary position information. As shown in FIG. 7B, the mobile terminal 1 can thus display the movement of the cursor 41 continuously and close to the input operation the user intended. For this reason, the portable terminal 1 can perform display processing that reflects the user's intention even when the user's contact pressure on the touch panel 14 is insufficient and the touch panel 14 cannot detect the intended input operation.
  • As described above, in the portable terminal 1, even when a release event occurs because the input operation intended by the user could not be detected, whether or not the release was intentional is determined appropriately and the complementing process for the display processing can be performed. The portable terminal 1 can therefore realize display processing with good operability.
  • Although the present embodiment has been described using the complementing process for an input operation accompanying a movement instruction for the cursor, the present invention is not limited to this and may also be applied to complementing input operations that accompany operation instructions for images other than the cursor.
  • For example, the present invention can be applied to the case where the mobile terminal 1 performs display processing in which an image displayed on the display 35 changes continuously with a continuous input operation on the touch panel 14.
  • The present invention can also be applied to the case where display processing is performed in which the amount of image change or the display time varies with the contact time on the operation surface of the touch panel 14.
  • Furthermore, although the portable terminal 1 in this embodiment has been described with a configuration including the touch panel 14 in which the display 35 and the touch sensor 33 are integrated, the touch sensor 33 may instead be a pointing device such as a touch pad configured separately from the display 35.
  • The mobile terminal according to the present invention can be applied to mobile phones, PDAs (Personal Digital Assistants), personal computers, portable game machines, portable music players, portable video players, and other portable terminals.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Position Input By Displaying (AREA)
  • User Interface Of Digital Computer (AREA)
  • Telephone Function (AREA)

Abstract

A portable terminal comprising: a display section that displays a predetermined image; a detection section that receives an instruction for the image via an operation surface and detects the instruction as input position information; a time acquisition section that acquires the time from when the detection section ends detection of the input position information until it next starts detection of input position information; a complement processing section that, when the time acquired by the time acquisition section is less than a predetermined time, obtains complementary position information that complements the input position information between the end of detection and the start of detection, based on the input position information when detection ended and the input position information when detection started; and a control unit that controls display of the image based on the input position information detected by the detection section and the complementary position information.
PCT/JP2010/050506 2009-04-17 2010-01-18 Portable terminal WO2010119713A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2009101288A JP2012137800A (ja) 2009-04-17 2009-04-17 Portable terminal
JP2009-101288 2009-04-17

Publications (1)

Publication Number Publication Date
WO2010119713A1 true WO2010119713A1 (fr) 2010-10-21

Family

ID=42982384

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2010/050506 WO2010119713A1 (fr) 2009-04-17 2010-01-18 Portable terminal

Country Status (2)

Country Link
JP (1) JP2012137800A (fr)
WO (1) WO2010119713A1 (fr)

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000357046A (ja) * 1999-06-15 2000-12-26 Mitsubishi Electric Corp Handwriting input device and method, and computer-readable recording medium storing a handwriting input program
JP2007334420A (ja) * 2006-06-12 2007-12-27 Dainippon Printing Co Ltd Processing device and program

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2012113645A (ja) * 2010-11-26 2012-06-14 Kyocera Corp Electronic device
JP2013088891A (ja) * 2011-10-14 2013-05-13 Konica Minolta Business Technologies Inc Information terminal, drawing control program, and drawing control method
WO2014141763A1 (fr) * 2013-03-15 2014-09-18 シャープ株式会社 Touch panel system
JPWO2014141763A1 (ja) * 2013-03-15 2017-02-16 シャープ株式会社 Touch panel system
JP2014228890A (ja) * 2013-05-17 2014-12-08 シャープ株式会社 Touch panel system

Also Published As

Publication number Publication date
JP2012137800A (ja) 2012-07-19

Similar Documents

Publication Publication Date Title
AU2018282404B2 (en) Touch-sensitive button
JP7451593B2 (ja) Touch-based input for stylus
CN103294300B (zh) Sensor management device, method, and computer program product
AU2010235941B2 (en) Interpreting touch contacts on a touch surface
US10795492B2 (en) Input device and method for controlling input device
TWI576720B (zh) Tactile presentation device and method for controlling tactile presentation device
US20100265209A1 (en) Power reduction for touch screens
TW200822682A (en) Multi-function key with scrolling
CN101573673A (zh) Backside interface for handheld devices
WO2012127792A1 (fr) Information terminal, and display screen switching method and program
KR100950413B1 (ko) Input unit control method, and input device and electronic apparatus using the same
WO2010119713A1 (fr) Portable terminal
US20150009136A1 (en) Operation input device and input operation processing method
CN101794194A (zh) Method and device for simulating right mouse button input on a touch screen
AU2015271962B2 (en) Interpreting touch contacts on a touch surface
US20110001716A1 (en) Key module and portable electronic device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 10764299

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 10764299

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP