WO2013128703A1 - Information processing device, input method, and program - Google Patents

Information processing device, input method, and program

Info

Publication number
WO2013128703A1
WO2013128703A1 (application PCT/JP2012/076898, JP2012076898W)
Authority
WO
WIPO (PCT)
Prior art keywords
touch
input
input areas
enlarged
touch position
Prior art date
Application number
PCT/JP2012/076898
Other languages
English (en)
Japanese (ja)
Inventor
Kenichi Ebara (研一 荏原)
Original Assignee
NEC Casio Mobile Communications, Ltd. (Necカシオモバイルコミュニケーションズ株式会社)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by NEC Casio Mobile Communications, Ltd.
Publication of WO2013128703A1 publication Critical patent/WO2013128703A1/fr

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04886Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/02Input arrangements using manually operated switches, e.g. using keyboards or dials
    • G06F3/023Arrangements for converting discrete items of information into a coded form, e.g. arrangements for interpreting keyboard generated codes as alphanumeric codes, operand codes or instruction codes
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0416Control or interface arrangements specially adapted for digitisers
    • G06F3/0418Control or interface arrangements specially adapted for digitisers for error correction or compensation, e.g. based on parallax, calibration or alignment
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/048Indexing scheme relating to G06F3/048
    • G06F2203/04805Virtual magnifying lens, i.e. window or frame movable on top of displayed information to enlarge it for better reading or selection

Definitions

  • The present invention relates to an information processing apparatus that receives input by touch operations.
  • Conventionally, an information processing apparatus such as a portable information terminal has been provided with a physical keyboard, such as a ten-key pad or a QWERTY keyboard, as an input device, and a user could use the physical keyboard for SMS (Short Message Service) messaging, password input, and e-mail creation.
  • A touch input device has the advantage that the user can operate it intuitively with a finger or the like, but it also has the problem that erroneous operations occur more easily than with a physical keyboard.
  • The user usually touches the touch input device with the pad of the finger, but when a false nail is attached to the finger (see FIG. 1) or when the fingernail is long, the tip of the nail and the pad of the finger are separated, so an erroneous operation is likely to occur. For example, when a false nail is attached, as shown in FIG. 2, the user may intend to touch the area to which the "ka" row is assigned but accidentally touch the area at its lower right, to which the "ha" row is assigned.
  • the present invention has been made in view of the above-described problems, and an object thereof is to provide an information processing apparatus, an input method, and a program that can reduce the occurrence of erroneous operations.
  • An information processing apparatus according to the present invention includes: a display unit; a touch operation unit that detects a touch on an operation surface and the touched touch position; and a control unit that sets a plurality of input areas on the operation surface and, when a touch area corresponding to the touch position extends over two or more input areas of the plurality of input areas, displays an instruction image for instructing a change of the touch position on the display unit.
  • An input method according to the present invention is performed by an information processing apparatus including a display unit and a touch operation unit that detects a touch on an operation surface and the touched touch position, and includes: a setting step of setting a plurality of input areas on the operation surface; and a control step of displaying, on the display unit, an instruction image for instructing a change of the touch position when a touch area corresponding to the touch position extends over two or more of the plurality of input areas.
  • A program according to the present invention causes a computer connected to a display unit and a touch operation unit that detects a touch on an operation surface and the touched touch position to execute: a setting procedure for setting a plurality of input areas on the operation surface; and a control procedure for displaying, on the display unit, an instruction image for instructing a change of the touch position when a touch area corresponding to the touch position extends over two or more of the plurality of input areas.
  • FIG. 3 is a block diagram showing the configuration of the portable information terminal according to the first embodiment of the present invention.
  • As shown in FIG. 3, the portable information terminal 100 includes a wireless communication unit 101, antennas 102 and 104, a broadcast receiving unit 103, a camera unit 105, a microphone unit 106, a speaker unit 107, an audio processing unit 108, a positioning unit 109, a dictionary unit 110, a storage unit 111, a key operation unit 112, a display operation unit 113, and a control unit 114.
  • the wireless communication unit 101 performs wireless communication via the antenna 102.
  • a communication method of wireless communication performed by the wireless communication unit 101 is not particularly limited.
  • the wireless communication unit 101 may support a plurality of communication methods.
  • the broadcast receiving unit 103 receives a broadcast signal transmitted from a broadcast station (for example, a ground station or a satellite) via the antenna 104.
  • the broadcast signal is a signal that is distributed to a plurality of information processing apparatuses at once, and includes an image signal indicating an image, an audio signal indicating sound, and the like.
  • the camera unit 105 performs shooting to acquire image data, and outputs the image data.
  • the microphone unit 106 acquires sound and outputs a sound signal.
  • the speaker unit 107 outputs sounds such as musical sounds and notification sounds.
  • the audio processing unit 108 is connected to the speaker unit 107 and the microphone unit 106, and performs audio input / output using the speaker unit 107 and the microphone unit 106.
  • the positioning unit 109 measures the position of the portable information terminal 100.
  • the positioning unit 109 measures the position of the portable information terminal 100 using a GPS (Global Positioning System) function or an advanced GPS function.
  • The dictionary unit 110 stores dictionary data for converting input information into text containing a mixture of kanji and kana.
  • the storage unit 111 stores telephone directory information, transmission / reception mail information, incoming / outgoing call information, contents, application programs, image information, various setting information of the portable information terminal 100, and the like. Note that the dictionary unit 110 and the storage unit 111 may be realized by a part or one block of the same storage device.
  • the key operation unit 112 has a key pressed by the user, and outputs a key signal indicating the pressed key.
  • the display operation unit 113 has a display function for displaying an image and a switch function for detecting a touch from the user.
  • FIG. 4 is a block diagram illustrating an example of a specific configuration of the display operation unit 113.
  • the display operation unit 113 illustrated in FIG. 4 includes a display unit 121 and a touch operation unit 122.
  • the display unit 121 displays an image.
  • the touch operation unit 122 is, for example, a touch panel or a touch pad.
  • the touch operation unit 122 includes an operation surface, and detects a touch on the operation surface from the user and a touched touch position. When the touch operation unit 122 detects a touch and a touch position, the touch operation unit 122 outputs a touch signal indicating the touch position. Note that the display unit 121 and the touch operation unit 122 may be integrated.
  • the control unit 114 controls each unit of the portable information terminal 100 to perform a communication function (a call function, an e-mail transmission / reception function, a connection function to the Internet), a shooting function, a television reception function, a content reproduction function, and Various functions such as an information input function are realized.
  • the control unit 114 sets a plurality of input areas on the operation surface of the touch operation unit 122 and displays a plurality of input images corresponding to the input areas on the display unit 121.
  • the control unit 114 displays each input image in an input area corresponding to the input image.
  • FIG. 5 is a diagram illustrating an example of a plurality of input images.
  • The control unit 114 displays a plurality of software keys 302 of a software keyboard 301 as the plurality of input images on the display surface 300 of the display unit 121. More specifically, there are twelve software keys 302, each assigned either one row of the Japanese syllabary from the "a" row to the "wa" row or one of two symbols (* and #).
  • The display unit 121 and the touch operation unit 122 are integrated, and the display surface 300 of the display unit 121 is also used as the operation surface of the touch operation unit 122.
  • The control unit 114 determines the touch area corresponding to the touch position indicated by the touch signal.
  • The touch area corresponds to the area where the user touches the operation surface. For example, the control unit 114 determines, as the touch area, the area enclosed by a circle of a predetermined radius centered on the touch position.
  • control unit 114 determines whether or not the touch area extends over two or more input areas of the plurality of input areas.
  • When the touch area extends over two or more input areas, the control unit 114 displays an instruction image for instructing the user to change the touch position on the display unit 121.
  • The instruction image is, for example, two or more enlarged images obtained by enlarging each of the two or more input images corresponding to the two or more input areas over which the touch area extends.
  • the control unit 114 sets two or more enlarged input areas corresponding to the two or more enlarged images on the operation surface of the touch operation unit 122. For example, when the display unit 121 and the touch operation unit 122 are integrated, the control unit 114 sets each enlarged input area to an area in which enlarged images corresponding to the enlarged input areas are displayed.
  • When the touch position changes, the control unit 114 determines the touch area corresponding to the changed touch position as the post-change area, and determines whether the post-change area fits within any one of the two or more enlarged input areas.
  • When the post-change area fits within one of the two or more enlarged input areas, the control unit 114 performs information processing corresponding to that enlarged input area. More specifically, the control unit 114 performs information processing corresponding to the input area whose input image is the original of the enlarged image corresponding to the enlarged input area in which the post-change area fits.
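The touch-area determination described above — a circular touch area of predetermined radius tested against rectangular input areas — can be sketched as follows. This is a minimal illustration, not the patent's implementation; the `InputArea` type, the key labels, and the radius are all hypothetical.

```python
from dataclasses import dataclass

@dataclass
class InputArea:
    """Axis-aligned rectangular input area on the operation surface (hypothetical layout)."""
    label: str
    x: float
    y: float
    w: float
    h: float

    def overlaps_circle(self, cx: float, cy: float, r: float) -> bool:
        # Clamp the circle centre to the rectangle, then compare the
        # distance to the clamped point against the radius.
        nx = min(max(cx, self.x), self.x + self.w)
        ny = min(max(cy, self.y), self.y + self.h)
        return (cx - nx) ** 2 + (cy - ny) ** 2 <= r * r

def areas_under_touch(areas, cx, cy, r):
    """Return the input areas that the circular touch area extends over."""
    return [a for a in areas if a.overlaps_circle(cx, cy, r)]

# Two adjacent 80x80 keys; a touch of radius 20 near their shared edge
# straddles both, which would trigger display of the instruction image.
keys = [InputArea("ka", 0, 0, 80, 80), InputArea("sa", 80, 0, 80, 80)]
hit = areas_under_touch(keys, 78, 40, 20)   # straddles "ka" and "sa"
```

When `areas_under_touch` returns two or more areas, the control unit would show the enlarged images; when it returns exactly one, the touch maps unambiguously to that key.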
  • FIGS. 6A to 6D are diagrams for explaining a specific example of the information input function performed by the control unit 114.
  • In this example, the display surface 300 of the display unit 121 doubles as the operation surface of the touch operation unit 122. Further, as shown in FIG. 5, the control unit 114 displays 12 software keys 302 on the display surface 300 as the plurality of input images. In this state, assume that the user touches with a finger the area where the four software keys 302 assigned the "ka", "sa", "na", and "ha" rows are displayed, and that the control unit 114 determines that the touch area extends over the four input areas corresponding to those four software keys 302.
  • In this case, as shown in FIG. 6B, the control unit 114 displays on the display unit 121 four enlarged images 311 obtained by enlarging each of the four software keys 302 corresponding to the four input areas over which the touch area extends. The control unit 114 then sets each of the areas on the display surface 300 (operation surface) where the four enlarged images are displayed as an enlarged input area.
  • When the touch position changes, the control unit 114 determines whether the post-change area, which is the touch area corresponding to the changed touch position, fits within the display area of any one of the enlarged images 311.
  • Suppose the user then touches with a finger the enlarged image 311 obtained by enlarging the software key 302 assigned the "ka" row, and the control unit 114 determines that the post-change area fits within the corresponding enlarged input area. In this case, the control unit 114 performs, as the information processing corresponding to that enlarged input area, a character input process for the "ka" row.
  • FIG. 7 is a flowchart for explaining the operation of the portable information terminal 100.
  • control unit 114 sets a plurality of input areas on the operation surface of the touch operation unit 122, and displays a plurality of input images corresponding to the input areas on the display unit 121 (step S501).
  • When the user touches the operation surface of the touch operation unit 122, the touch operation unit 122 detects the touch by the user and the touched touch position, and sends a touch signal indicating the touch position to the control unit 114 (step S502).
  • the control unit 114 receives the touch signal and obtains a touch area corresponding to the touch position indicated by the received touch signal. Then, the control unit 114 determines whether or not the touch area extends over two or more input areas (step S503).
  • If the touch area fits within a single input area, the control unit 114 performs information processing corresponding to that input area (step S504).
  • If the touch area extends over two or more input areas, the control unit 114 displays on the display unit 121, as the instruction image, two or more enlarged images obtained by enlarging each of the two or more input images corresponding to those input areas, and sets two or more enlarged input areas corresponding to the enlarged images on the operation surface of the touch operation unit 122 (step S505).
  • The control unit 114 monitors the touch position indicated by the touch signal, and when the touch position changes, obtains the post-change area, which is the touch area corresponding to the changed touch position, and determines whether the post-change area fits within any of the two or more enlarged input areas (step S506).
  • When the post-change area fits within any of the two or more enlarged input areas, the control unit 114 performs information processing corresponding to that enlarged input area (step S507); if it does not, the process returns to step S506.
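The flow of steps S502 through S507 can be sketched as a short loop. This is an illustrative reconstruction under stated assumptions: the `show_enlarged` and `do_input` callbacks, the rectangle layout, and the touch radius are hypothetical stand-ins for the terminal's actual display and input routines.

```python
def circle_overlaps_rect(cx, cy, r, rect):
    """Touch circle overlaps an axis-aligned rectangle (x, y, w, h)."""
    x, y, w, h = rect
    nx = min(max(cx, x), x + w)   # closest point of the rect to the centre
    ny = min(max(cy, y), y + h)
    return (cx - nx) ** 2 + (cy - ny) ** 2 <= r * r

def circle_in_rect(cx, cy, r, rect):
    """Touch circle fits entirely inside the rectangle (the "post-change area fits")."""
    x, y, w, h = rect
    return x + r <= cx <= x + w - r and y + r <= cy <= y + h - r

def process_touches(input_areas, touch_positions, r, show_enlarged, do_input):
    """Steps S502-S507. input_areas maps a key label to its rectangle;
    show_enlarged(labels) returns the enlarged input areas set in S505;
    do_input(label) performs the information processing of S504/S507."""
    positions = iter(touch_positions)
    cx, cy = next(positions)                              # S502: touch detected
    hits = [lbl for lbl, rect in input_areas.items()
            if circle_overlaps_rect(cx, cy, r, rect)]     # S503: straddle test
    if len(hits) <= 1:
        if hits:
            do_input(hits[0])                             # S504: single area
        return
    enlarged = show_enlarged(hits)                        # S505: enlarged keys
    for cx, cy in positions:                              # S506: monitor moves
        for lbl, rect in enlarged.items():
            if circle_in_rect(cx, cy, r, rect):
                do_input(lbl)                             # S507: input the key
                return
```

For example, a touch near the shared edge of the "ka" and "sa" keys would trigger `show_enlarged(["ka", "sa"])`, and a subsequent touch landing inside the enlarged "ka" area would trigger `do_input("ka")`.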
  • FIG. 8 is an overview diagram showing an example of an overview of the portable information terminal 100.
  • The display surface of the display unit 121 is also used as the operation surface of the touch operation unit 122.
  • On the front surface of the portable information terminal 100, the display surface 300, the microphone unit 106, the speaker unit 107, and a key operation unit 112 are provided.
  • a key operation unit 112 is provided on the side surface of the portable information terminal 100.
  • a camera unit 105 is provided on the back surface of the portable information terminal 100.
  • According to the present embodiment, when the touch area extends over two or more input areas, an instruction image for instructing a change of the touch position is displayed, so the occurrence of erroneous operations can be reduced.
  • As the instruction image, two or more enlarged images obtained by enlarging each of the two or more input images corresponding to the two or more input areas over which the touch area extends are displayed.
  • When the touch position is changed so that the touch area fits within one of the two or more enlarged input areas, information processing corresponding to that enlarged input area is performed. The user can therefore easily perform an accurate operation.
  • the portable information terminal 100 is described as an example of the information processing apparatus.
  • The information processing apparatus is not limited to the portable information terminal 100, and may be any apparatus having a touch operation unit, such as a mobile phone terminal, a smartphone, a PC (personal computer), or a game machine.
  • the PC includes a tablet PC, a notebook PC, and a desktop PC.
  • An information processing apparatus 400 illustrated in FIG. 9 includes a display unit 401, a touch operation unit 402, and a control unit 403.
  • Display unit 401 displays various information.
  • the touch operation unit 402 detects a touch on the operation surface and the touched touch position.
  • The control unit 403 sets a plurality of input areas on the operation surface and, when the touch area corresponding to the touch position detected by the touch operation unit 402 extends over two or more input areas of the plurality of input areas, displays an instruction image for instructing a change of the touch position on the display unit 401.
  • The functions of the portable information terminal 100 and the information processing apparatus 400 may be realized by recording a program for realizing those functions on a computer-readable recording medium and causing a computer to read and execute the program recorded on the recording medium.
  • The touch operation unit 122 or 402 may directly detect the touch area where it is touched by the user.
  • For example, when the touch operation unit 122 or 402 is a capacitive touch panel, it detects the amount of change in capacitance of each of a plurality of electrodes arranged in a matrix, and detects, as the touch area, the region where electrodes whose amount of change is equal to or greater than a predetermined threshold are arranged.
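The thresholding scheme above can be sketched directly. The frame values and threshold below are made-up illustrations of the raw per-electrode capacitance-change matrix a touch controller might report; they are not taken from the patent.

```python
def touch_region(delta_c, threshold):
    """Return the (row, col) positions of electrodes whose capacitance
    change is at or above the threshold -- the detected touch region."""
    return {(i, j)
            for i, row in enumerate(delta_c)
            for j, v in enumerate(row)
            if v >= threshold}

# Hypothetical 4x4 frame of per-electrode capacitance changes; the
# 2x2 block of large values corresponds to the finger contact patch.
frame = [[0, 1, 0, 0],
         [1, 9, 7, 0],
         [0, 8, 6, 0],
         [0, 0, 1, 0]]
region = touch_region(frame, threshold=5)
# The centroid of this region could then serve as the reported touch position.
```

With `threshold=5` the region is the four central electrodes; raising the threshold shrinks the region, which is one way such a panel trades sensitivity against noise rejection.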
  • Appendix 1: An information processing apparatus comprising: a display unit; a touch operation unit that detects a touch on an operation surface and the touched touch position; and a control unit that sets a plurality of input areas on the operation surface and displays, on the display unit, an instruction image for instructing a change of the touch position when a touch area corresponding to the touch position extends over two or more input areas of the plurality of input areas.
  • Appendix 2 The information processing apparatus according to appendix 1, wherein the control unit displays a plurality of input images corresponding to each of the plurality of input regions on the display unit.
  • The control unit displays, as the instruction image, two or more enlarged images obtained by enlarging each of the two or more input images corresponding to each of the two or more input areas over which the touch area extends.
  • The control unit sets two or more enlarged input areas corresponding to each of the two or more enlarged images on the operation surface; when the touch position changes, it determines whether the touch area corresponding to the changed touch position fits within any of the two or more enlarged input areas, and when the touch area fits within one of them, performs information processing corresponding to that enlarged input area.
  • Appendix 9: A program causing a computer connected to a display unit and a touch operation unit that detects a touch on an operation surface and the touched touch position to execute: a setting procedure for setting a plurality of input areas on the operation surface; and a control procedure for displaying, on the display unit, an instruction image for instructing a change of the touch position when a touch area corresponding to the touch position extends over two or more input areas of the plurality of input areas.
  • Appendix 10: The program according to appendix 9, further causing the computer to execute a display procedure for displaying a plurality of input images corresponding to each of the plurality of input areas on the display unit.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention concerns an information processing device capable of reducing the occurrence of operation errors. A touch operation section (402) detects a contact on an operation surface and the contact position that was touched. A control unit (403) sets multiple input areas on the operation surface and displays an instruction image ordering a change of the contact position, said instruction image being displayed on a display section (401) if a contact area corresponding to the contact position detected by the touch operation section (402) extends over at least two of the multiple input areas.
PCT/JP2012/076898 2012-03-01 2012-10-18 Information processing device, input method, and program WO2013128703A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2012-045213 2012-03-01
JP2012045213 2012-03-01

Publications (1)

Publication Number Publication Date
WO2013128703A1 true WO2013128703A1 (fr) 2013-09-06

Family

ID=49081930

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2012/076898 WO2013128703A1 (fr) 2012-03-01 2012-10-18 Information processing device, input method, and program

Country Status (1)

Country Link
WO (1) WO2013128703A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109062419A (zh) * 2018-08-09 2018-12-21 Zhengzhou University An optimized laser projection virtual keyboard

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001175375A (ja) * 1999-12-22 2001-06-29 Casio Comput Co Ltd Portable information terminal device and storage medium

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001175375A (ja) * 1999-12-22 2001-06-29 Casio Comput Co Ltd Portable information terminal device and storage medium

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109062419A (zh) * 2018-08-09 2018-12-21 Zhengzhou University An optimized laser projection virtual keyboard

Similar Documents

Publication Publication Date Title
JP6151157B2 (ja) Electronic device, control program, and method of operating electronic device
US8949743B2 (en) Language input interface on a device
US7574672B2 (en) Text entry interface for a portable communication device
CN102119376B (zh) 触敏显示器的多维导航
US20070155434A1 (en) Telephone Interface for a Portable Communication Device
US9703418B2 (en) Mobile terminal and display control method
JP5777645B2 (ja) Character input method for a mobile terminal and mobile terminal supporting the same
US20140287724A1 (en) Mobile terminal and lock control method
JP2015518604A (ja) Text selection and entry
US20090225034A1 (en) Japanese-Language Virtual Keyboard
KR20150109755A (ko) Mobile terminal and control method thereof
KR102639193B1 (ko) Message processing method and electronic device
JP2012008866A (ja) Mobile terminal, key display program, and key display method
US20120302291A1 (en) User character input interface with modifier support
KR20150005354A (ko) Character input method for an electronic device
KR20140106801A (ko) Method and apparatus for supporting voice service of a mobile terminal for visually impaired users
JP2010165289A (ja) Mobile phone
US20210157415A1 (en) Text input method and terminal
EP3211510B1 (fr) Dispositif électronique portatif et procédé pour fournir une rétroaction haptique
KR20110076283A (ko) 사용자 입력 패턴에 따른 피드백 제공 방법 및 장치
JPWO2013047182A1 (ja) Portable electronic device, touch operation processing method, and program
US20150309590A1 (en) Inputting method and associated electronic device
JP2013090242A (ja) Mobile terminal device, program, and execution suppression method
US20130021380A1 (en) Electronic device and method for sensing input gesture and inputting selected symbol
WO2013128703A1 (fr) Information processing device, input method, and program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 12870039

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 12870039

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP