WO2015029239A1 - Information processing device, display control method, and program - Google Patents

Information processing device, display control method, and program

Info

Publication number
WO2015029239A1
WO2015029239A1 · PCT/JP2013/073423
Authority
WO
WIPO (PCT)
Prior art keywords
information processing
processing apparatus
touch operation
icon
display control
Prior art date
Application number
PCT/JP2013/073423
Other languages
English (en)
Japanese (ja)
Inventor
喩暄 黄
Original Assignee
株式会社東芝
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社東芝
Priority to PCT/JP2013/073423 priority Critical patent/WO2015029239A1/fr
Publication of WO2015029239A1 publication Critical patent/WO2015029239A1/fr

Links

Images

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0482Interaction with lists of selectable items, e.g. menus

Definitions

  • Embodiments described herein relate generally to an information processing apparatus, a display control method, and a program.
  • In a conventional mobile terminal, buttons are displayed over a wide range of the display screen, and the user operates the terminal by touching those buttons.
  • However, when buttons are arranged over a wide range of the display screen, for example when the mobile terminal is operated with one hand while the other hand holds a train strap, the finger may not reach a button on the display screen, and the mobile terminal cannot be operated.
  • the information processing apparatus includes a display unit, a detection unit, and a display control unit.
  • the detection unit detects a touch operation on the display screen of the display unit.
  • when a touch operation is detected by the detection unit, the display control unit moves icons that are displayed on the display screen and operable by touch operation to at least the periphery of the detection position, out of the detection position where the touch operation was detected and its periphery.
  • FIG. 1 is a diagram illustrating an appearance of the information processing apparatus according to the present embodiment.
  • FIG. 2 is a block diagram illustrating a configuration of the information processing apparatus according to the present embodiment.
  • FIG. 3 is a flowchart showing a flow of processing for moving the icon displayed on the display screen in the information processing apparatus according to the present embodiment.
  • FIG. 4 is a diagram for explaining processing for moving an icon displayed on the display screen in the information processing apparatus according to the present embodiment.
  • FIG. 5 is a diagram for explaining processing for moving an icon displayed on the display screen in the information processing apparatus according to the present embodiment.
  • FIG. 6 is a diagram for explaining processing for moving an icon displayed on the display screen in the information processing apparatus according to the present embodiment.
  • FIG. 7 is a diagram for explaining processing for moving an icon displayed on the display screen in the information processing apparatus according to the present embodiment.
  • FIG. 8 is a diagram for explaining processing for moving an icon displayed on the display screen in the information processing apparatus according to the present embodiment.
  • FIG. 9 is a diagram for explaining processing for moving an icon displayed on the display screen in the information processing apparatus according to the present embodiment.
  • FIG. 10 is a diagram for explaining processing for moving an icon displayed on the display screen in the information processing apparatus according to the present embodiment.
  • FIG. 1 is a diagram illustrating an appearance of the information processing apparatus according to the present embodiment.
  • the information processing apparatus 1 according to the present embodiment is a tablet terminal, a smartphone, a notebook PC (Personal Computer), or the like, and includes a display D as a display unit having a display screen G on which various types of information are displayed, and a touch panel T as a detection unit for detecting a touch operation.
  • the information processing apparatus 1 updates the display content of the display screen G in response to a touch operation on the display screen G by an instruction unit such as a finger Y of the user of the information processing apparatus 1 or a stylus pen (not shown), and executes a process corresponding to the touch operation.
  • FIG. 2 is a block diagram showing the configuration of the information processing apparatus according to the present embodiment.
  • the information processing apparatus 1 includes a CPU (Central Processing Unit) 10, a memory 11, a communication unit 12, a storage medium 13, a display D, and a touch panel T.
  • the CPU 10 is a control device that controls each unit of the information processing apparatus 1.
  • the CPU 10 controls each unit of the information processing apparatus 1 by executing an OS (Operating System) and applications stored in the storage medium 13.
  • the CPU 10 executes the display control application 110 stored in the storage medium 13, thereby realizing the display control unit 10 a that controls display of various types of information on the display screen G.
  • the memory 11 is a storage medium having a ROM (Read Only Memory) that stores various data such as image data to be displayed on the display screen G, a RAM (Random Access Memory) that functions as a work area for the CPU 10, a flash memory, and the like.
  • the display D has a display screen G composed of an LCD (Liquid Crystal Display) or the like.
  • the touch panel T is stacked on the display screen G of the display D and detects a touch operation on the display screen G. Then, the touch panel T notifies the display control unit 10a of position information indicating a detected position that is a position where the touch operation is detected on the display screen G.
  • the storage medium 13 is, for example, an HDD (Hard Disk Drive); it stores various applications such as the OS and the display control application, and has a larger capacity than the memory 11 (for example, a flash memory).
  • the various applications (programs) executed by the information processing apparatus 1 according to the present embodiment may be stored on a computer connected to a network such as the Internet and provided by being downloaded via the network. The programs executed by the information processing apparatus 1 of the present embodiment may also be provided or distributed via a network such as the Internet.
  • the communication unit 12 is a connection interface for connecting to the Internet or the like.
  • FIG. 3 is a flowchart showing a flow of processing for moving the icon displayed on the display screen in the information processing apparatus according to the present embodiment.
  • 4 to 10 are diagrams for explaining processing for moving the icon displayed on the display screen in the information processing apparatus according to the present embodiment.
  • the display control unit 10a executes the process described below when the information processing apparatus 1 is set to the one-handed operation mode, in which the information processing apparatus 1 is operated with one hand.
  • the display control unit 10a can display on the display screen G a mode setting screen on which the user can set either the one-handed operation mode, in which the information processing apparatus 1 is operated with one hand, or a two-handed operation mode, in which it is operated with both hands. The display control unit 10a then determines whether the one-handed operation mode is set according to the mode set on the mode setting screen.
  • the display control unit 10a displays buttons I1 to I15 in areas A1 and A2 (an example of predetermined positions) of the display screen G (in the following description, the buttons I1 to I15 are simply referred to as buttons I when they need not be distinguished). Then, as shown in FIG. 5, when a touch operation instructing the buttons I to move (for example, a long press or double tap) is detected at an arbitrary detection position Z on the display screen G, the display control unit 10a receives position information indicating the detection position Z from the touch panel T.
  • the display control unit 10a determines a region to which the buttons I (an example of icons) displayed on the display screen G are to be moved (hereinafter referred to as a gathering region), based on the detection position indicated by the received position information and the positions of the sides of the display screen G (step S302). Specifically, the display control unit 10a determines at least the periphery of the detection position, out of the detection position indicated by the received position information and its periphery, as the gathering region. In the present embodiment, as shown in FIG. 6, the display control unit 10a determines the gathering region within the periphery of the detection position Z, using the detection position Z as a reference with respect to the sides H1 to H4 of the display screen G.
  • the gathering region R is a predetermined range from the detection position Z that the finger Y of the user of the information processing apparatus 1 can reach. More specifically, the gathering region R is preferably a fan-shaped region including the detection position Z, as illustrated in the figure.
  • when the detection position Z indicated by the received position information is closest to the side H4 among the sides H1 to H4 of the display screen G, the display control unit 10a determines, using the detection position Z as a reference, the region on the opposite side of the side H4 within the periphery of the detection position Z as the gathering region R. Similarly, as shown in FIG. 7B, when the detection position Z is closest to the side H1, the region on the opposite side of the side H1 within the periphery of the detection position Z is determined as the gathering region R.
  • likewise, when the detection position Z is closest to the side H2, the display control unit 10a determines the region on the opposite side of the side H2 within the periphery of the detection position Z as the gathering region R, and as shown in FIG. 7D, when the detection position Z is closest to the side H3, the region on the opposite side of the side H3 within the periphery of the detection position Z is determined as the gathering region R.
  • in this way, the region on the opposite side of the fulcrum of the finger Y performing the touch operation on the display screen G can be determined as the gathering region R.
  • as a result, the buttons I can be moved to a region, in the periphery of the finger Y performing the touch operation, where the finger Y can easily perform a touch operation.
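As an illustration of the edge-based determination described above, the following is a minimal sketch, not taken from the patent: the top-left coordinate convention, the side names, and the region radius are all assumptions, and the patent's labels H1 to H4 are not mapped to concrete sides in this text.

```python
def closest_side(z, width, height):
    """Return the screen side nearest to the detection position z = (x, y),
    with the origin at the top-left corner (an assumed convention)."""
    x, y = z
    distances = {"left": x, "right": width - x, "top": y, "bottom": height - y}
    return min(distances, key=distances.get)


def gathering_region(z, width, height, radius=300):
    """Return a fan-shaped gathering region R that includes z and opens away
    from the nearest side, i.e. away from the fulcrum of the operating finger.
    The radius is a hypothetical 'reachable range' value."""
    away = {"left": (1, 0), "right": (-1, 0), "top": (0, 1), "bottom": (0, -1)}
    side = closest_side(z, width, height)
    return {"center": z, "radius": radius, "direction": away[side]}
```

For a touch near the bottom edge of a 1080x1920 screen, for example, the region opens upward (direction (0, -1) in top-left coordinates), matching the idea of opening the region away from the finger's fulcrum.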
  • the display control unit 10a determines the gathering region based on the detection position at which the touch panel T detected the touch operation instructing the buttons I to move.
  • alternatively, a pressure sensor may be provided on the side of the housing of the information processing apparatus 1, and based on the position of the user's hand detected by the pressure sensor, at least one of the position of the hand on the display screen G and its periphery may be determined as the gathering region.
  • the display control unit 10a recognizes the buttons I displayed on the display screen G (step S303). Specifically, the display control unit 10a acquires button information related to the buttons I displayed on the display screen G (in this embodiment, an operation history such as the number and frequency of touch operations on each button I).
  • based on the acquired button information, the display control unit 10a determines whether each button I displayed on the display screen G satisfies a predetermined selection condition (a button I that does is hereinafter referred to as a fixed button) (step S304).
  • the predetermined selection condition concerns the number of times the user has performed a touch operation, for example that the button I has been touched a predetermined number of times or more; it is a condition for always displaying a frequently used button I at a position where the touch operation is easy.
  • if a button I is determined to be a fixed button (step S304: Yes), the display control unit 10a determines whether that button I is located in the gathering region determined in step S302 (step S305). If the display control unit 10a determines that the fixed button is located in the gathering region (step S305: Yes), it does not move that button I (step S307).
  • if the display control unit 10a determines that a button I determined to be a fixed button is not located in the gathering region (step S305: No), or that a button I does not satisfy the predetermined selection condition (step S304: No), it moves that button I to the gathering region (step S306).
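Steps S304 to S307 can be sketched as follows. This is a hypothetical illustration: the button data model, the `touch_count` field, and the threshold value are assumptions standing in for the patent's operation-history-based selection condition.

```python
FIXED_TOUCH_THRESHOLD = 10  # assumed value for the "predetermined number of times"


def is_fixed(button):
    """Step S304: a button whose operation history meets the predetermined
    selection condition is treated as a fixed button."""
    return button["touch_count"] >= FIXED_TOUCH_THRESHOLD


def buttons_to_move(buttons, in_gathering_region):
    """Return the buttons that should be moved to the gathering region (S306).
    in_gathering_region(button) -> bool reports whether a button already lies
    inside the region; a fixed button already inside stays put (S305 -> S307)."""
    to_move = []
    for b in buttons:
        if is_fixed(b) and in_gathering_region(b):
            continue  # step S307: do not move this fixed button
        to_move.append(b)  # step S306: move to the gathering region
    return to_move
```

A fixed button outside the region is still moved, which is what lets fixed buttons follow the gathering region from one touch operation to the next.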
  • accordingly, when the information processing apparatus 1 is operated with one hand, for example while holding a train strap with the other hand, the buttons I are arranged in the vicinity of the finger Y of the hand holding the information processing apparatus 1, so the buttons I can be operated easily even in one-handed operation.
  • when a button I is a fixed button, it can be moved so as to follow the gathering region determined according to each touch operation (in other words, a fixed button I can always be displayed in the vicinity of the finger Y).
  • the display control unit 10a keeps the buttons I displayed on the display screen G while moving them to the gathering region R, as illustrated in the figure. This allows the user of the information processing apparatus 1 to easily recognize to which position in the gathering region R each button I arranged in the areas A1 and A2 (predetermined positions) has moved.
  • furthermore, the display control unit 10a moves the buttons I to the gathering region R without changing their display size.
  • this prevents the display size of the buttons I from becoming smaller than when they are arranged in the areas A1 and A2 (predetermined positions), which would make the buttons I difficult to operate.
  • when moving the plurality of buttons I displayed in the areas A1 and A2 (predetermined positions) to the gathering region R, the display control unit 10a arranges the plurality of buttons I in the gathering region R without preserving their original arrangement relationship.
  • in other words, the plurality of buttons I can be moved to positions in the gathering region R where the user can easily perform a touch operation, without being limited by the arrangement relationship they had when displayed in the areas A1 and A2 (predetermined positions).
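One possible way to lay the moved buttons out inside the fan-shaped gathering region is sketched below. The patent only states that the original A1/A2 arrangement need not be preserved; the even angular spacing, the spread angle, and the half-radius placement are assumptions made for illustration.

```python
import math


def place_in_fan(n, center, radius, direction_deg, spread_deg=90):
    """Distribute n button positions evenly over a fan of spread_deg degrees
    centred on direction_deg, at half the region radius from the detection
    position (all concrete values are illustrative)."""
    cx, cy = center
    if n == 1:
        angles = [float(direction_deg)]
    else:
        start = direction_deg - spread_deg / 2
        step = spread_deg / (n - 1)
        angles = [start + i * step for i in range(n)]
    r = radius / 2
    return [(cx + r * math.cos(math.radians(a)),
             cy + r * math.sin(math.radians(a))) for a in angles]
```

Any placement that keeps the buttons within the finger's reachable range would satisfy the description; this even spacing is simply one easy-to-tap choice.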
  • after the touch operation instructing the buttons I to move (in this embodiment, a long press or double tap at the detection position Z) is released, the display control unit 10a determines whether a predetermined condition is satisfied (step S308).
  • the predetermined condition is that the selection of one of the buttons I moved to the gathering region is confirmed by a touch operation (for example, a button I is tapped and thereby selected), that a predetermined time elapses, or that a movement instruction to return the buttons I to the areas A1 and A2 (predetermined positions) is input.
  • after moving the buttons I to the gathering region, when a touch operation such as a long press or double tap is performed in an area of the display screen G other than the gathering region, the display control unit 10a determines that a movement instruction to return the buttons I to the areas A1 and A2 (predetermined positions) has been input.
  • when the predetermined condition is satisfied, the display control unit 10a determines whether a fixed button is included among the buttons I displayed in the gathering region (step S309). In the present embodiment, the display control unit 10a determines whether each button I displayed in the gathering region is a fixed button, as in step S304.
  • if a fixed button is included (step S309: Yes), the display control unit 10a leaves the fixed button in the gathering region (holds it there) and moves the other buttons I to the areas A1 and A2 (predetermined positions) (step S310).
  • if no fixed button is included (step S309: No), the display control unit 10a moves all the buttons I arranged in the gathering region to the areas A1 and A2 (predetermined positions) (step S311).
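Steps S308 to S311 can be sketched as follows; the event names, the timeout value, and the data model are hypothetical, standing in for the predetermined condition and the fixed-button check described above.

```python
def should_return(event, elapsed_seconds, timeout_seconds=5.0):
    """Step S308: the predetermined condition is satisfied when a button's
    selection is confirmed, when an explicit return instruction is input
    (e.g. a long press outside the gathering region), or when a
    predetermined time has elapsed. Event names are assumptions."""
    return event in ("selection_confirmed", "return_instruction") \
        or elapsed_seconds >= timeout_seconds


def return_buttons(buttons, is_fixed):
    """Steps S309-S311: fixed buttons stay in the gathering region; all other
    buttons go back to the predetermined positions A1 and A2.
    Returns (kept_in_region, returned_to_predetermined)."""
    kept = [b for b in buttons if is_fixed(b)]
    returned = [b for b in buttons if not is_fixed(b)]
    return kept, returned
```

Keeping the fixed buttons behind is what makes a frequently used button such as I12 remain near the finger after the others return.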
  • as shown in FIG. 9, when moving the buttons I from the gathering region R to the areas A1 and A2 (predetermined positions), the display control unit 10a keeps the buttons I displayed on the display screen G, just as when moving them from the areas A1 and A2 to the gathering region R.
  • this allows the user of the information processing apparatus 1 to easily recognize to which predetermined position each button I arranged in the gathering region R has moved.
  • when the buttons I1 to I15 displayed in the gathering region R include the fixed button I12, the display control unit 10a leaves the fixed button I12 in the gathering region R (holds it there) and moves the buttons I other than the fixed button I12 to the predetermined positions.
  • thereby, a button I that the user of the information processing apparatus 1 touches frequently can always be displayed at a position where the user can easily touch it.
  • the display control unit 10a executes the process of moving the icon displayed on the display screen G when the one-handed operation mode is set.
  • however, the present invention is not limited to this; the process of moving the icons displayed on the display screen G may always be executed, regardless of whether the information processing apparatus 1 is set to the one-handed operation mode or the two-handed operation mode.
  • as described above, when the information processing apparatus 1 is operated with one hand, the buttons I are arranged in the vicinity of the finger Y of the hand holding the information processing apparatus 1, so the buttons I can be operated easily even when the information processing apparatus 1 is operated with one hand.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

 One embodiment of the present invention relates to an information processing device provided with a display unit, a detection unit, and a display control unit. The detection unit detects a touch operation on a display screen of the display unit. When a touch operation is detected by the detection unit, the display control unit causes an icon, displayed on the display screen and operable by a touch operation, to move to at least the area around the detection position, out of the detection position where the touch operation was detected and the area around it.
PCT/JP2013/073423 2013-08-30 2013-08-30 Information processing device, display control method, and program WO2015029239A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2013/073423 WO2015029239A1 (fr) 2013-08-30 2013-08-30 Information processing device, display control method, and program

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2013/073423 WO2015029239A1 (fr) 2013-08-30 2013-08-30 Information processing device, display control method, and program

Publications (1)

Publication Number Publication Date
WO2015029239A1 true WO2015029239A1 (fr) 2015-03-05

Family

ID=52585846

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2013/073423 WO2015029239A1 (fr) 2013-08-30 2013-08-30 Information processing device, display control method, and program

Country Status (1)

Country Link
WO (1) WO2015029239A1 (fr)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105446641A (zh) * 2015-11-06 2016-03-30 广东欧珀移动通信有限公司 Method, system and mobile terminal for operating touch screen icons with one hand
CN105824508A (zh) * 2016-04-01 2016-08-03 广东欧珀移动通信有限公司 Terminal interface display method and terminal device
JP2017117239A (ja) * 2015-12-24 2017-06-29 ブラザー工業株式会社 Program and information processing device
WO2019119799A1 (fr) * 2017-12-22 2019-06-27 华为技术有限公司 Application icon display method and terminal device

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100306650A1 (en) * 2009-05-26 2010-12-02 Pantech Co., Ltd. User interface apparatus and method for user interface in touch device
JP2010271982A (ja) * 2009-05-22 2010-12-02 Nec Casio Mobile Communications Ltd Mobile terminal device and program
JP2011060209A (ja) * 2009-09-14 2011-03-24 Sony Corp Information processing device, display method and program
US20120154301A1 (en) * 2010-12-16 2012-06-21 Lg Electronics Inc. Mobile terminal and operation control method thereof
JP2013073529A (ja) * 2011-09-28 2013-04-22 Kyocera Corp Device, method, and program
JP2013161249A (ja) * 2012-02-03 2013-08-19 Sharp Corp Input device, input device control method, control program, and recording medium



Similar Documents

Publication Publication Date Title
JP5922480B2 (ja) Portable device with display function, program, and control method for portable device with display function
US9703382B2 (en) Device, method, and storage medium storing program with control for terminating a program
JP5703873B2 (ja) Information processing device, information processing method, and program
US20140380209A1 (en) Method for operating portable devices having a touch screen
KR20190100339A (ko) Application switching method, device, and graphical user interface
JP6157885B2 (ja) Display control method for mobile terminal device
US9733667B2 (en) Information processing device, information processing method and recording medium
JP2009110286A (ja) Information processing device, launcher activation control program, and launcher activation control method
JP2009536385A (ja) Multi-function key with scrolling
JP5846129B2 (ja) Information processing terminal and control method therefor
JP6109788B2 (ja) Electronic device and method for operating electronic device
JP6102474B2 (ja) Display device, input control method, and input control program
EP2829967A2 (fr) Input processing method and electronic device thereof
US20130050120A1 (en) Device, method, and storage medium storing program
JP2015043135A (ja) Information processing device
JP2013145556A (ja) Electronic device and control method therefor
WO2015029239A1 (fr) Information processing device, display control method, and program
JP2014164718A (ja) Information terminal
JP2014106806A (ja) Information processing device
CA2932267C (fr) Linking an apparatus to a computing device
JP6153487B2 (ja) Terminal and control method
US20170075453A1 (en) Terminal and terminal control method
JP6411067B2 (ja) Information processing device and input method
JP2013073365A (ja) Information processing device
KR101251021B1 (ko) Method for adjusting touch screen output screen

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 13892657

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 13892657

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP