WO2012127733A1 - Information processing device, method for controlling an information processing device, and program - Google Patents
- Publication number
- WO2012127733A1 (PCT/JP2011/076080)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- display unit
- information processing
- contact point
- processing apparatus
- display
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/0485—Scrolling or panning
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04886—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
Definitions
- the present invention relates to an information processing apparatus in which operation input is performed via a touch panel, a control method for the information processing apparatus, and a program.
- Information processing apparatuses such as smartphones and portable game machines have a display unit and a touch panel that is arranged on the display unit and detects contact of an instruction means such as a finger or a stylus pen, and operation input is performed via the touch panel.
- An operation image including operation buttons, each indicating an operation content, is displayed on the display unit, and when the instruction means is brought into contact with an operation button, processing corresponding to the operation content indicated by that button is performed.
- With this arrangement, the user can perform operation input intuitively, which makes the apparatus easy to use, and physical key structures for operation input can be removed from the information processing apparatus; the size of the display unit can therefore be increased by the amount of space freed by removing the key structures.
- On the other hand, the size of the display unit is limited to a size that fits in the user's palm. If the entire operation image is displayed while the size of the display unit is limited in this way, the display of each operation button may become small, and it may be difficult to select an operation button.
- Patent Document 1 Japanese Patent Laid-Open No. 2003-330613
- Patent Document 2 Japanese Patent Laid-Open No. 2010-204781
- An information processing apparatus is disclosed that, when a drag operation sliding the contact point is input after contact is made, scrolls the display on the display unit in the same direction as the movement direction of the contact point by an amount corresponding to the movement amount of the contact point. According to this information processing apparatus, by displaying only a part of the operation image on the display unit, the display of the operation buttons is prevented from becoming too small, and if the target operation button is not displayed on the display unit, the desired operation button can be brought into view by scrolling the display on the display unit.
- the present invention provides an information processing apparatus, a control method for the information processing apparatus, and a program that can solve the above-described problems and can reduce the labor of operation input.
- The information processing apparatus of the present invention includes: a display unit; a touch panel that is arranged over the display unit, detects contact or proximity of an instruction means, and outputs a detection result; and a control unit that displays the whole or a part of an operation image on the display unit and, when a part of the operation image is displayed and a drag operation is performed to slide the contact point, specifies the movement direction and movement amount of the contact point associated with the drag operation and scrolls the display on the display unit in the direction opposite to the specified movement direction by an amount corresponding to the specified movement amount.
- The control method of the information processing apparatus of the present invention displays the whole or a part of an operation image on the display unit of the information processing apparatus; when a part of the operation image is displayed on the display unit and a drag operation is performed to slide a contact point at which the instruction means is in contact with or in proximity to the touch panel, which is arranged over the display unit, detects contact or proximity of the instruction means, and outputs a detection result, the movement direction and movement amount of the contact point associated with the drag operation are specified, and the display on the display unit is scrolled in the direction opposite to the specified movement direction by an amount corresponding to the specified movement amount.
- The program of the present invention causes an information processing apparatus to execute: a process of displaying the whole or a part of an operation image on the display unit of the information processing apparatus; and a process of, when a part of the operation image is displayed on the display unit and a drag operation is performed to slide a contact point at which the instruction means is in contact with or in proximity to the touch panel, which is arranged over the display unit, detects contact or proximity of the instruction means, and outputs a detection result, specifying the movement direction and movement amount of the contact point associated with the drag operation and scrolling the display on the display unit in the direction opposite to the specified movement direction by an amount corresponding to the specified movement amount.
- FIG. 1 is a block diagram showing the configuration of the information processing apparatus of the first embodiment of the present invention, and a flowchart shows its operation.
- FIG. 1 is a block diagram showing the configuration of the information processing apparatus according to the first embodiment of this invention.
- An information processing apparatus 10 shown in FIG. 1 includes a CPU (Central Processing Unit) 11, a memory 12, a display unit 13, a touch panel 14, a key input unit 15, a communication unit 16, a microphone 17, and a speaker 18.
- Specific examples of the information processing apparatus 10 include a smartphone, a portable game machine, a notebook type or a tablet type personal computer.
- the CPU 11 is an example of a control unit, and is connected to each unit described above to control each unit.
- the memory 12 includes a ROM (Read Only Memory) and a RAM (Random Access Memory), stores various control programs executed by the CPU 11 and fixed data in the ROM, and the CPU 11 executes various control programs. Temporarily necessary data is stored in the RAM.
- the display unit 13 displays various images such as operation images under the control of the CPU 11.
- The touch panel 14 is arranged on the display surface of the display unit 13, detects contact of an instruction means such as a finger or a stylus pen, and outputs information such as the coordinates of the contact point to the CPU 11 as a detection result. Note that the touch panel 14 can also detect the proximity of the instruction means and output information such as the coordinates of a contact point, treating the point approached by the instruction means as the contact point.
- When the key input unit 15 is pressed for operation input, it outputs information indicating the pressed state to the CPU 11.
- the communication unit 16 communicates with other information processing apparatuses and the like via a network according to the control of the CPU 11.
- the microphone 17 outputs audio data obtained by collecting ambient sounds to the CPU 11.
- the speaker 18 outputs sound according to the control of the CPU 11.
- When a part of the region of the operation image is displayed and the target operation button lies to the left of the region displayed on the display unit 13, the display on the display unit 13 must be scrolled to the right in order to display the target operation button.
- In the present embodiment, the display of the display unit 13 is scrolled in the direction opposite to the movement direction of the contact point. Therefore, in order to scroll the display on the display unit 13 to the right, a drag operation that slides the contact point to the left is performed. With this drag operation, the contact point moves to the left while the display on the display unit 13 is scrolled to the right, so that the target operation button enters the display unit 13 from the left and the contact point and the operation button approach each other. The area of the operation image covered by a given drag is therefore the sum of the distance the contact point moves and the distance the display scrolls, i.e., twice the movement amount of the contact point, so the movement amount of the contact point can be reduced compared with the case where the display on the display unit is scrolled in the same direction as the movement of the contact point.
- In this way, the contact point can be moved onto the target operation button. If the drag operation is released while the contact point is on the operation button, the operation button is treated as selected. Since scrolling the display on the display unit 13 and selecting an operation button can thus be performed by a single drag operation, the labor of operation input can be reduced.
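The reverse-direction scrolling described above can be sketched as follows. This is an illustrative model only, not the patented implementation; the coordinate values and the target-button position are hypothetical assumptions.

```python
def reverse_scroll(offset, drag_dx):
    """Return the new left edge (in operation-image coordinates) of the
    visible region after the contact point moves by drag_dx pixels.

    The display scrolls opposite to the contact point's movement by an
    equal amount, so the visible region's left edge moves with the drag:
    dragging left (negative dx) reveals content further to the left.
    """
    return offset + drag_dx

# Worked example: a button at image x=0 starts 160 px away from the
# contact point on screen, but is reached after only 80 px of dragging,
# because the button and the contact point approach from both sides.
offset = 100            # visible region currently starts at image x = 100
finger = 60             # contact point, screen coordinates
button_image_x = 0      # hypothetical target button at the image's left edge

gap_before = finger - (button_image_x - offset)   # 160 px apart on screen
finger += -80                                     # drag 80 px to the left
offset = reverse_scroll(offset, -80)              # display scrolls right
gap_after = finger - (button_image_x - offset)    # 0 px: they meet
```

The point of the sketch is the doubled closing speed: each pixel of drag closes two pixels of the on-screen gap, which is why the movement amount is halved relative to same-direction scrolling.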
- the CPU 11 clears the scroll amount of the display unit 13 stored in the memory 12 (step S101).
- the CPU 11 displays a virtual keyboard on the display unit 13.
- FIG. 3 is a diagram showing a state where the virtual keyboard 22 including the virtual input key 21 is displayed on the display unit 13.
- The CPU 11 provides a selection item display range 24 and a selection result display range 25 on the display surface of the display unit 13, inside the detection effective range 23 within which the touch panel 14 detects contact of the instruction means 20.
- a part of the virtual keyboard 22 is displayed in the selection item display range 24, and characters corresponding to the selected virtual input key are displayed in the selection result display range 25.
- In FIG. 3, the area displayed in the selection item display range 24 is indicated by a solid line, and the area outside the selection item display range 24 is indicated by a dotted line; this is merely to ease understanding and does not represent anything actually displayed.
- Suppose that the user wants to select the virtual input key 21 corresponding to the character “w” (hereinafter referred to as the virtual input key 21W), but the virtual input key 21W is not displayed on the display unit 13.
- the user brings the instruction unit 20 into contact with the touch panel 14 in order to scroll the display on the display unit 13 by performing a drag operation.
- As shown in FIG. 4, it is assumed that the instruction means 20 is brought into contact on the virtual input key 21 corresponding to the character “m” (hereinafter referred to as the virtual input key 21M).
- The CPU 11 determines, based on the output from the touch panel 14, whether the instruction means 20 has made contact (step S102).
- the touch panel 14 outputs information such as the coordinates of the contact point to the CPU 11.
- If it is determined based on the output from the touch panel 14 that the instruction means 20 has made contact with the touch panel 14 (step S102: Yes), the CPU 11 sets the coordinates of the contact point as the starting point for specifying the movement direction and movement amount of the contact point associated with the drag operation, and stores them in the memory 12 (step S103).
- Next, the CPU 11 determines whether the contact point is on a virtual input key 21 displayed on the display unit 13, according to whether the coordinates of the contact point are within the area where a virtual input key 21 is displayed (step S104).
- If the contact point is not on a virtual input key 21 (step S104: No), the CPU 11 proceeds to the process of step S106 described later.
- When the contact point is on a virtual input key 21 (step S104: Yes), the CPU 11 highlights that virtual input key 21 on the display unit 13 (step S105).
- Here, since the instruction means 20 is in contact with the virtual input key 21M, the coordinates of the contact point are within the area where the virtual input key 21M is displayed, and the CPU 11 therefore causes the display unit 13 to reverse-display the virtual input key 21M as an input candidate.
- In the present embodiment, an example in which the input-candidate virtual input key 21 is reverse-displayed is described, but the present invention is not limited to this; for example, the input-candidate virtual input key 21 may be enlarged, or the characters on the virtual input key 21 may be displayed in bold.
- the CPU 11 determines whether or not the contact between the instruction unit 20 and the touch panel 14 is continued (step S106).
- If the contact with the touch panel 14 continues (step S106: Yes), the CPU 11 determines whether the contact point has moved from the starting point (step S107).
- If the contact point has not moved (step S107: No), the CPU 11 returns to the process of step S106.
- If the contact point has moved (step S107: Yes), the CPU 11 causes the display unit 13 to cancel the reverse display of the virtual input key 21 (step S108).
- Next, the CPU 11 specifies the movement direction and movement amount of the contact point from the starting point, determines the scroll amount of the display on the display unit 13 according to the specified movement direction and movement amount (step S109), and stores it in the memory 12.
- Next, the CPU 11 stores in the memory 12, as a new starting point, the point moved from the starting point stored in step S103 by the specified movement amount in the specified movement direction, and scrolls the display on the display unit 13 by the determined scroll amount (step S110).
- In the present embodiment, the CPU 11 scrolls the display of the display unit 13 in the direction opposite to the movement direction of the contact point by an amount equal to the movement amount of the contact point. Therefore, when the contact point is moved as indicated by the movement vector 51 in FIG. 5, the image of the virtual keyboard 22 moved by the display scroll vector 52, which has the same magnitude as the movement vector 51 but the opposite direction, is displayed in the selection item display range 24. As a result, compared with the case where the display is scrolled in the same direction as the movement direction of the contact point, a larger area of the operation image is brought into view on the display unit 13 with a smaller movement of the contact point, so the labor of operation input can be reduced.
- Note that the magnitude of the display scroll vector 52 may be larger or smaller than that of the movement vector 51. For example, the magnitude of the display scroll vector 52 may be made equal to that of the movement vector 51 when the moving speed of the contact point is below a predetermined threshold, and larger than that of the movement vector 51 when the moving speed is equal to or above the threshold.
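One way to realize the speed-dependent scroll amount mentioned above can be sketched as follows; the threshold and gain values are illustrative assumptions, not values taken from the embodiment.

```python
def scroll_amount(move_amount, move_speed, speed_threshold=200.0, fast_gain=2.0):
    """Display scroll magnitude for a given contact-point movement.

    Below the speed threshold (px/s) the scroll equals the movement
    amount; at or above it, the scroll is amplified by fast_gain.
    The direction is applied elsewhere (always opposite to the
    contact point's movement direction).
    """
    if move_speed < speed_threshold:
        return move_amount
    return move_amount * fast_gain
```

A gain greater than one for fast drags lets a quick flick traverse a large operation image, while slow drags keep the one-to-one mapping for precise positioning.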
- The CPU 11 returns to the process of step S102 after performing the process of step S110. The CPU 11 repeats the processing from step S102 to step S110 at predetermined time intervals, for example, so that the reverse display of a virtual input key and its cancellation are performed in accordance with the movement of the contact point accompanying the drag operation.
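The cyclic processing of steps S102 to S110 can be modeled as a polling loop such as the following sketch; the helper names and the dictionary-based memory are hypothetical, used only to make the flow concrete.

```python
def drag_cycle(touch, memory, scroll_display, unhighlight):
    """One polling cycle of steps S102-S110 (simplified model).

    `touch()` returns the current contact point (x, y) or None if there
    is no contact; `memory` holds the drag starting point. The display
    is scrolled in the direction opposite to the contact point's motion.
    Returns True while contact continues, False otherwise.
    """
    point = touch()                      # S102: is contact detected?
    if point is None:
        return False
    if 'start' not in memory:
        memory['start'] = point          # S103: store the starting point
        return True
    sx, sy = memory['start']
    dx, dy = point[0] - sx, point[1] - sy
    if (dx, dy) != (0, 0):               # S107: has the contact point moved?
        unhighlight()                    # S108: cancel reverse display
        scroll_display(-dx, -dy)         # S109-S110: scroll opposite direction
        memory['start'] = point          # update the starting point
    return True
```

Calling this at a fixed interval reproduces the repeated S102-S110 loop: no scroll while the point is stationary, and an opposite-direction scroll whenever it moves.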
- Since the contact point is on the virtual input key 21W, the CPU 11 highlights the virtual input key 21W (step S105). When the instruction means 20 is then released from the touch panel 14 (step S106: No), the CPU 11 determines whether the release point, at which the instruction means 20 separated from the touch panel 14, is within the detection effective range 23 (step S111).
- If the release point is within the detection effective range 23 (step S111: Yes), the CPU 11 determines whether the release point is on the highlighted virtual input key 21 (step S112).
- When the release point is on the highlighted virtual input key 21 (step S112: Yes), the CPU 11 regards the virtual input key 21 as selected and displays the character corresponding to the virtual input key 21 in the selection result display range 25 (step S113). As described above, the virtual input key 21W is highlighted and the instruction means 20 is released while the contact point is on the virtual input key 21W, so the character “w” is displayed in the selection result display range 25 as shown in FIG.
- After canceling the reverse display of the virtual input key 21 on the display unit 13 (step S114), the CPU 11 ends the input process for one character.
- If the release point is not on the highlighted virtual input key 21 (step S112: No), there is a high possibility that the drag operation was canceled at a position deviating from the target virtual input key 21, so the CPU 11 proceeds to the process of step S114 without regarding any virtual input key 21 as selected.
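The release-time decision of steps S111 to S114 might be sketched as follows; the `Key` data structure and the rectangle representation of the detection effective range are hypothetical, introduced only for illustration.

```python
from dataclasses import dataclass

@dataclass
class Key:
    """A hypothetical virtual input key: its character and screen rectangle."""
    char: str
    x: int
    y: int
    w: int
    h: int

    def contains(self, px, py):
        return self.x <= px < self.x + self.w and self.y <= py < self.y + self.h

def on_release(release_point, effective_range, highlighted_key):
    """Return the selected character, or None if nothing was selected.

    A key is committed only if the release point lies inside the
    detection effective range AND on the highlighted key; otherwise the
    drag is treated as cancelled with no selection.
    """
    x, y = release_point
    left, top, right, bottom = effective_range
    if not (left <= x < right and top <= y < bottom):
        return None                       # outside the range: S111 No
    if highlighted_key is not None and highlighted_key.contains(x, y):
        return highlighted_key.char       # S112 Yes -> S113: commit the key
    return None                           # S112 No -> S114: no selection
```

The cancel path (releasing off the highlighted key) costs nothing, which matches the text: a drag that drifts off target simply deselects rather than inputting a wrong character.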
- Next, suppose that a virtual input key 21 corresponding to the character “e” (hereinafter referred to as the virtual input key 21E) is displayed in the selection item display range 24, as shown in FIG.
- The CPU 11 performs the processing of steps S101 to S106 and S111 to S114, and the character “e” is displayed in the selection result display range 25 of the display unit 13.
- When the CPU 11 detects the contact of the instruction means 20 with the touch panel 14 (step S102: Yes) and determines that the contact point is on the virtual input key 21E (step S104: Yes), it highlights the virtual input key 21E (step S105).
- When the instruction means 20 is released (step S106: No), the CPU 11 displays the character “e” in the selection result display range 25 of the display unit 13 because the release point is on the virtual input key 21E (step S113), and then cancels the reverse display of the virtual input key 21E (step S114).
- Suppose that the virtual input key 21 corresponding to the character “l” (hereinafter referred to as the virtual input key 21L) is not displayed in the selection item display range 24, as shown in FIG.
- the user performs a drag operation toward the virtual input key 21L.
- Suppose that the virtual input key 21L cannot be displayed in the selection item display range 24 by the display scrolling of the display unit 13 corresponding to the movement amount of the contact point. In this case, the user slides the instruction means 20 beyond the detection effective range 23 while keeping it in contact with the touch panel 14.
- In this case, the CPU 11 determines in step S106 that the contact of the instruction means 20 with the touch panel 14 is not continuing, and proceeds to the process of step S111. In step S111, the CPU 11 regards the point at which the contact point reached the outer edge of the detection effective range 23 as the release point, and determines whether the release point is within the detection effective range 23 of the touch panel 14.
- In this case, the CPU 11 determines that the release point is not within the detection effective range 23 (step S111: No), returns to the process of step S102, and waits for the contact point to be detected again within the detection effective range 23.
- In addition, the CPU 11 reduces the scroll amount stored in the memory 12 (step S115). Further, the CPU 11 scrolls the display on the display unit 13 by the reduced scroll amount, and updates the starting point stored in the memory 12 to the coordinates of the point moved from the starting point by the reduced scroll amount (step S116). Therefore, when the contact point of the instruction means 20 on the touch panel 14 reaches the outer edge of the detection effective range 23 during the drag operation, as shown in FIG. 9, the display on the display unit 13 is scrolled not only by the display scroll vector 52, whose direction is opposite to and whose magnitude is equal to the movement vector 51 of the contact point, but additionally by the vector 53, which has the same direction as the display scroll vector 52. The magnitude of the vector 53 corresponds to the reduced scroll amount stored in the memory 12 in step S115.
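The continued scrolling with a shrinking increment while the contact point stays outside the detection effective range (steps S115 to S116) could be modeled as a decaying auto-scroll; the decay factor and stop condition here are assumptions for illustration, not values from the embodiment.

```python
def auto_scroll_steps(initial_amount, decay=0.5, min_step=1.0):
    """Yield the successive extra scroll amounts (the vector 53) applied
    on each cycle while the contact point remains outside the detection
    effective range.

    Each cycle the stored scroll amount is reduced (step S115) and the
    display is scrolled by the reduced amount (step S116), so the
    auto-scroll gradually slows instead of continuing at full speed.
    """
    amount = initial_amount
    while amount >= min_step:
        yield amount
        amount *= decay   # step S115: reduce the stored scroll amount
```

With these assumed parameters an initial 16-pixel scroll decays to 8, 4, 2, then 1 before stopping, giving the user time to return the contact point once the target key comes into view.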
- the CPU 11 determines whether or not the virtual keyboard 22 exceeds the selection item display range 24 of the display unit 13 (step S117).
- If the virtual keyboard 22 exceeds the selection item display range 24 (step S117: Yes), the CPU 11 returns to the process of step S102.
- If the virtual keyboard 22 does not exceed the selection item display range 24 (step S117: No), the CPU 11 determines that the display scroll limit of the display unit 13 has been reached, and ends the process.
- When the contact of the instruction means 20 is detected again within the detection effective range 23 (step S102: Yes), the CPU 11 returns to the process of step S103.
- The user waits for the virtual input key 21L to be displayed in the selection item display range 24 and then, as shown in FIG. 10, returns the contact point of the instruction means 20 on the touch panel 14 to within the detection effective range 23; when the instruction means 20 is moved away from the touch panel 14 while the contact point is on the virtual input key 21L, the character “l” is input.
- The fourth character “l” is input by bringing the instruction means 20 into contact with the virtual input key 21L and then releasing it from the touch panel 14. Note that the input procedure for the fourth character “l” is the same as the input procedure for the character “e”, and thus its description is omitted.
- the information processing apparatus 10 scrolls the display on the display unit 13 in the direction opposite to the moving direction of the contact point accompanying the drag operation.
- Further, the information processing apparatus 10 regards an operation button as selected when the release point of the drag operation is on the operation button, and performs processing corresponding to the operation content indicated by that operation button.
- the information processing apparatus 10 highlights the operation button when the contact point is on the operation button.
- Further, the information processing apparatus 10 displays the operation image inside the detection effective range 23, and when the contact point reaches the outer edge of the detection effective range 23 during the drag operation, it specifies the movement direction and movement amount of the contact point associated with the drag operation and scrolls the display on the display unit 13 in the direction opposite to the specified movement direction until the contact point is detected again within the detection effective range.
- FIG. 12 is a block diagram showing the configuration of the information processing apparatus according to the second embodiment of this invention.
- The information processing apparatus 60 shown in FIG. 12 includes a display unit 61, a touch panel 62, and a control unit 63.
- Display unit 61 displays various images such as operation images.
- the touch panel 62 is arranged so as to overlap the display unit 61, detects the contact or proximity of the instruction means 20, and outputs a detection result.
- the control unit 63 displays, for example, the entire or part of the operation image on the display unit 61.
- When a part of the operation image is displayed on the display unit 61 and a drag operation sliding the contact point is performed, the control unit 63 specifies the movement direction and movement amount of the contact point associated with the drag operation, and scrolls the display on the display unit 61 in the direction opposite to the specified movement direction by an amount corresponding to the specified movement amount.
- the information processing apparatus 60 scrolls the display of the display unit 61 in the direction opposite to the moving direction of the contact point.
- The method performed by the information processing apparatus of the present invention may be implemented as a program for causing a computer to execute it.
- the program can be stored in a storage medium and can be provided to the outside via a network.
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
Description
An information processing apparatus according to the present invention includes: a display unit; a touch panel that is arranged over the display unit, detects contact or proximity of an instruction means, and outputs a detection result; and a control unit that displays the whole or a part of an operation image on the display unit and that, when a part of the operation image is displayed on the display unit and a drag operation is performed to slide a contact point at which the instruction means and the touch panel are in contact or proximity, specifies a movement direction and a movement amount of the contact point associated with the drag operation and scrolls the display on the display unit in a direction opposite to the specified movement direction by an amount corresponding to the specified movement amount.
A control method for an information processing apparatus according to the present invention displays the whole or a part of an operation image on a display unit of the information processing apparatus and, when a part of the operation image is displayed on the display unit and a drag operation is performed to slide a contact point at which an instruction means is in contact with or in proximity to a touch panel that is arranged over the display unit, detects contact or proximity of the instruction means, and outputs a detection result, specifies a movement direction and a movement amount of the contact point associated with the drag operation and scrolls the display on the display unit in a direction opposite to the specified movement direction by an amount corresponding to the specified movement amount.
A program according to the present invention causes an information processing apparatus to execute: a process of displaying the whole or a part of an operation image on a display unit of the information processing apparatus; and a process of, when a part of the operation image is displayed on the display unit and a drag operation is performed to slide a contact point at which an instruction means is in contact with or in proximity to a touch panel that is arranged over the display unit, detects contact or proximity of the instruction means, and outputs a detection result, specifying a movement direction and a movement amount of the contact point associated with the drag operation and scrolling the display on the display unit in a direction opposite to the specified movement direction by an amount corresponding to the specified movement amount.
FIG. 1 is a block diagram showing the configuration of an information processing apparatus according to a first embodiment of the present invention.
FIG. 12 is a block diagram showing the configuration of an information processing apparatus according to a second embodiment of the present invention.
Claims (9)
- 1. An information processing apparatus comprising: a display unit; a touch panel that is arranged over the display unit, detects contact or proximity of an instruction means, and outputs a detection result; and a control unit that displays the whole or a part of an operation image on the display unit and that, when a part of the operation image is displayed on the display unit and a drag operation is performed to slide a contact point at which the instruction means and the touch panel are in contact or proximity, specifies a movement direction and a movement amount of the contact point associated with the drag operation and scrolls the display on the display unit in a direction opposite to the specified movement direction by an amount corresponding to the specified movement amount.
- 2. The information processing apparatus according to claim 1, wherein, when the contact point at the end of the drag operation is on an operation button that is included in the operation image and indicates an operation content, the control unit performs processing corresponding to the operation content indicated by the operation button.
- 3. The information processing apparatus according to claim 1 or 2, wherein, when the contact point is on an operation button that is included in the operation image and indicates an operation content, the control unit causes the display unit to highlight the operation button.
- 4. The information processing apparatus according to any one of claims 1 to 3, wherein, when the contact point reaches, in the course of the drag operation, an outer edge of a detection effective range within which the touch panel detects contact or proximity of the instruction means, the control unit specifies the movement direction and movement amount of the contact point associated with the drag operation and scrolls the display on the display unit in the direction opposite to the specified movement direction until the contact point is detected again within the detection effective range.
- 5. A control method for an information processing apparatus, comprising: displaying the whole or a part of an operation image on a display unit of the information processing apparatus; and, when a part of the operation image is displayed on the display unit and a drag operation is performed to slide a contact point at which an instruction means is in contact with or in proximity to a touch panel that is arranged over the display unit, detects contact or proximity of the instruction means, and outputs a detection result, specifying a movement direction and a movement amount of the contact point associated with the drag operation and scrolling the display on the display unit in a direction opposite to the specified movement direction by an amount corresponding to the specified movement amount.
- 6. The control method for an information processing apparatus according to claim 5, wherein, when the contact point at the end of the drag operation is on an operation button that is included in the operation image and indicates an operation content, processing corresponding to the operation content indicated by the operation button is performed.
- 7. The control method for an information processing apparatus according to claim 5 or 6, wherein, when the contact point is on an operation button that is included in the operation image and indicates an operation content, the operation button is highlighted on the display unit.
- 8. The control method for an information processing apparatus according to any one of claims 5 to 7, wherein, when the contact point reaches, in the course of the drag operation, an outer edge of a detection effective range within which the touch panel detects contact or proximity of the instruction means, the movement direction and movement amount of the contact point associated with the drag operation are specified, and the display on the display unit is scrolled in the direction opposite to the specified movement direction until the contact point is detected again within the detection effective range.
- 9. A program causing an information processing apparatus to execute: a process of displaying the whole or a part of an operation image on a display unit of the information processing apparatus; and a process of, when a part of the operation image is displayed on the display unit and a drag operation is performed to slide a contact point at which an instruction means is in contact with or in proximity to a touch panel that is arranged over the display unit, detects contact or proximity of the instruction means, and outputs a detection result, specifying a movement direction and a movement amount of the contact point associated with the drag operation and scrolling the display on the display unit in a direction opposite to the specified movement direction by an amount corresponding to the specified movement amount.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/978,440 US9563337B2 (en) | 2011-03-23 | 2011-11-11 | Information processing device, method for controlling an information processing device, and program |
CN2011800692011A CN103403660A (zh) | 2011-03-23 | 2011-11-11 | 信息处理装置、用于控制信息处理装置的方法和程序 |
JP2013505774A JPWO2012127733A1 (ja) | 2011-03-23 | 2011-11-11 | 情報処理装置、情報処理装置の制御方法、および、プログラム |
EP11861343.9A EP2690536A4 (en) | 2011-03-23 | 2011-11-11 | INFORMATION PROCESSING DEVICE, ITS CONTROL METHOD, AND ASSOCIATED PROGRAM |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2011-064170 | 2011-03-23 | ||
JP2011064170 | 2011-03-23 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2012127733A1 (ja) | 2012-09-27 |
Family
ID=46878921
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2011/076080 WO2012127733A1 (ja) | Information processing device, method for controlling an information processing device, and program | 2011-03-23 | 2011-11-11 |
Country Status (5)
Country | Link |
---|---|
US (1) | US9563337B2 (ja) |
EP (1) | EP2690536A4 (ja) |
JP (1) | JPWO2012127733A1 (ja) |
CN (1) | CN103403660A (ja) |
WO (1) | WO2012127733A1 (ja) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
GB2516029A (en) * | 2013-07-08 | 2015-01-14 | Ibm | Touchscreen keyboard |
Families Citing this family (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2015148991A (ja) * | 2014-02-07 | 2015-08-20 | ソニー株式会社 | 情報処理装置、情報処理方法及びプログラム |
FR3020481B1 (fr) * | 2014-04-25 | 2017-08-11 | Thales Sa | Dispositif de visualisation a surface tactile fonctionnant en environnement degrade |
JP6410537B2 (ja) * | 2014-09-16 | 2018-10-24 | キヤノン株式会社 | 情報処理装置、その制御方法、プログラム、及び記憶媒体 |
US9678656B2 (en) * | 2014-12-19 | 2017-06-13 | International Business Machines Corporation | Preventing accidental selection events on a touch screen |
CN109002247A (zh) * | 2018-06-15 | 2018-12-14 | 维沃移动通信有限公司 | 一种数据清理方法、移动终端 |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2003330613A (ja) | 2002-05-13 | 2003-11-21 | Mobile Computing Technologies:Kk | 携帯型情報端末装置、表示制御情報、及び表示制御方法 |
JP2007026349A (ja) * | 2005-07-21 | 2007-02-01 | Casio Comput Co Ltd | 文字入力装置及び文字入力プログラム |
JP2010204781A (ja) | 2009-03-02 | 2010-09-16 | Alpine Electronics Inc | 入力装置 |
Family Cites Families (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH09319502A (ja) * | 1996-05-28 | 1997-12-12 | Toshiba Corp | 表示一体型座標入力装置を備えた情報機器 |
GB0406056D0 (en) * | 2004-03-18 | 2004-04-21 | Ibm | Method and apparatus for two-dimensional scrolling in a graphical display window |
US9785329B2 (en) * | 2005-05-23 | 2017-10-10 | Nokia Technologies Oy | Pocket computer and associated methods |
US20080046496A1 (en) * | 2006-05-18 | 2008-02-21 | Arthur Kater | Multi-functional keyboard on touch screen |
KR100913962B1 (ko) * | 2007-05-14 | 2009-08-26 | 삼성전자주식회사 | 이동통신 단말기의 문자 입력 방법 및 장치 |
JP2008305294A (ja) * | 2007-06-11 | 2008-12-18 | Sharp Corp | フルキーボードを搭載した携帯型端末装置及びフルキーボード表示方法 |
DE202008018283U1 (de) * | 2007-10-04 | 2012-07-17 | Lg Electronics Inc. | Menüanzeige für ein mobiles Kommunikationsendgerät |
KR101387527B1 (ko) * | 2007-12-06 | 2014-04-23 | 엘지전자 주식회사 | 단말기 및 그 메뉴 아이콘 디스플레이 방법 |
JP4244068B1 (ja) * | 2008-08-21 | 2009-03-25 | 任天堂株式会社 | オブジェクト表示順変更プログラム及び装置 |
GB2462579A (en) * | 2008-06-10 | 2010-02-17 | Sony Service Ct | Touch screen display including proximity sensor |
JP2010108061A (ja) * | 2008-10-28 | 2010-05-13 | Sony Corp | 情報処理装置、情報処理方法および情報処理プログラム |
US8610673B2 (en) * | 2008-12-03 | 2013-12-17 | Microsoft Corporation | Manipulation of list on a multi-touch display |
EP2392999A4 (en) * | 2009-02-02 | 2013-11-27 | Panasonic Corp | INFORMATION DISPLAY DEVICE |
JP2011192179A (ja) * | 2010-03-16 | 2011-09-29 | Kyocera Corp | 文字入力装置、文字入力方法及び文字入力プログラム |
US8327296B2 (en) * | 2010-04-16 | 2012-12-04 | Google Inc. | Extended keyboard user interface |
2011
- 2011-11-11 US US13/978,440 patent/US9563337B2/en not_active Expired - Fee Related
- 2011-11-11 WO PCT/JP2011/076080 patent/WO2012127733A1/ja active Application Filing
- 2011-11-11 EP EP11861343.9A patent/EP2690536A4/en not_active Withdrawn
- 2011-11-11 CN CN2011800692011A patent/CN103403660A/zh active Pending
- 2011-11-11 JP JP2013505774A patent/JPWO2012127733A1/ja active Pending
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2003330613A (ja) | 2002-05-13 | 2003-11-21 | Mobile Computing Technologies:Kk | 携帯型情報端末装置、表示制御情報、及び表示制御方法 |
JP2007026349A (ja) * | 2005-07-21 | 2007-02-01 | Casio Comput Co Ltd | 文字入力装置及び文字入力プログラム |
JP2010204781A (ja) | 2009-03-02 | 2010-09-16 | Alpine Electronics Inc | 入力装置 |
Non-Patent Citations (1)
Title |
---|
See also references of EP2690536A4 |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
GB2516029A (en) * | 2013-07-08 | 2015-01-14 | Ibm | Touchscreen keyboard |
US9959039B2 (en) | 2013-07-08 | 2018-05-01 | International Business Machines Corporation | Touchscreen keyboard |
US10754543B2 (en) | 2013-07-08 | 2020-08-25 | International Business Machines Corporation | Touchscreen keyboard |
Also Published As
Publication number | Publication date |
---|---|
EP2690536A4 (en) | 2014-08-27 |
US20140009423A1 (en) | 2014-01-09 |
US9563337B2 (en) | 2017-02-07 |
JPWO2012127733A1 (ja) | 2014-07-24 |
CN103403660A (zh) | 2013-11-20 |
EP2690536A1 (en) | 2014-01-29 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11816330B2 (en) | Display device, display controlling method, and computer program | |
JP6177669B2 (ja) | 画像表示装置およびプログラム | |
US8854316B2 (en) | Portable electronic device with a touch-sensitive display and navigation device and method | |
US20100245275A1 (en) | User interface apparatus and mobile terminal apparatus | |
WO2012127733A1 (ja) | 情報処理装置、情報処理装置の制御方法、および、プログラム | |
WO2012157562A1 (ja) | 表示装置、ユーザインタフェース方法及びプログラム | |
JP6013714B2 (ja) | 画面スクロール方法及びその装置 | |
JP6319298B2 (ja) | 情報端末、表示制御方法及びそのプログラム | |
JP2011070474A (ja) | 携帯端末装置 | |
TWI659353B (zh) | 電子設備以及電子設備的工作方法 | |
KR20140073245A (ko) | 후면 입력을 가능하게 하기 위한 방법 및 그 방법을 처리하는 전자 장치 | |
US20120218208A1 (en) | Electronic device, operation control method, and storage medium storing operation control program | |
US20130044061A1 (en) | Method and apparatus for providing a no-tap zone for touch screen displays | |
JP5737380B1 (ja) | 情報処理装置、及びプログラム | |
JP2013089037A (ja) | 描画装置、描画制御方法、及び描画制御プログラム | |
US20120218207A1 (en) | Electronic device, operation control method, and storage medium storing operation control program | |
CN103809794A (zh) | 一种信息处理方法以及电子设备 | |
JP5872979B2 (ja) | 携帯情報表示装置および拡大表示方法 | |
JP5288206B2 (ja) | 携帯端末装置、文字入力方法、及び文字入力プログラム | |
EP2407867B1 (en) | Portable electronic device with a touch-sensitive display and navigation device and method | |
JP2013162202A (ja) | 情報処理装置、情報処理方法及びプログラム | |
JP2014095942A (ja) | 情報表示装置および情報表示方法 | |
JP2014160301A (ja) | 情報処理装置、情報処理方法およびプログラム | |
JP2015225417A (ja) | 携帯端末装置及び表示制御方法 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 11861343 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 13978440 Country of ref document: US |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2011861343 Country of ref document: EP |
|
ENP | Entry into the national phase |
Ref document number: 2013505774 Country of ref document: JP Kind code of ref document: A |
|
NENP | Non-entry into the national phase |
Ref country code: DE |