WO2012140883A1 - Display processing device - Google Patents

Display processing device

Info

Publication number
WO2012140883A1
Authority
WO
WIPO (PCT)
Prior art keywords
display processing
display
processing device
user
control unit
Prior art date
Application number
PCT/JP2012/002513
Other languages
French (fr)
Japanese (ja)
Inventor
酒井 将史
岩崎 実
真鳥 黒田
Original Assignee
パナソニック株式会社 (Panasonic Corporation)
Priority date
Filing date
Publication date
Application filed by パナソニック株式会社 (Panasonic Corporation)
Publication of WO2012140883A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0482Interaction with lists of selectable items, e.g. menus
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures

Definitions

  • The present invention relates to a display processing device that performs processing to display a plurality of icons on a display.
  • A touch panel formed integrally with a display is used as the input interface in some portable electronic devices currently on the market, such as mobile phones, smartphones, and PDAs.
  • The user of such a device can perform a desired operation by touching the surface of the touch panel with a finger.
  • FIG. 8 is a diagram showing a state in which a user operating a portable electronic device holds the device with the right hand. As shown in FIG. 8, the user holds the entire device in the palm and operates it by touching a desired position on the touch panel with the thumb. At this time, various information such as icons is displayed on the display of the device. For example, a menu screen for selecting one of the functions of the device is displayed. When the menu screen is displayed, the user selects, with the thumb or the like, an icon representing the desired function.
  • FIG. 9 shows an example of the menu screen displayed on the display of the portable electronic device shown in FIG. 8.
  • In the menu screen shown in FIG. 9, five icons, each representing a function, are arranged along a specific arc.
  • When the user traces with a finger the dotted curve in FIG. 9 along which the icons are arranged, further icons are displayed in succession on the curve.
  • The example shown in FIG. 9 shows the change in the menu screen when the curve is traced from the upper right to the lower left. In this way, the user can perform a desired operation mainly by tracing or clicking, with the thumb, the surface of the touch panel formed integrally with the display.
  • However, the size of the hand varies from person to person, and so does the way the device is held. For this reason, if the positions of the icons arranged on the menu screen and the like are fixed, the natural movement of the thumb does not match the curve shown by the dotted line in FIG. 9 for some users. Such a user must change the way the device is held, or operate the device by moving the thumb in a cramped manner.
  • An object of the present invention is to provide a display processing device that performs processing to display a plurality of icons on a display in a form that matches the characteristics of the user's hand or of the way the user operates the device.
  • The present invention provides a display processing device for a device that has an input interface with an operation surface, is held in the user's palm, and performs a predetermined operation when the user touches the operation surface with a finger. The display processing device includes: an analysis unit that analyzes a trajectory based on a slide operation performed on the input interface; a control unit that determines the form of a screen in which a plurality of icons are arranged on or along the trajectory; and a display processing unit that performs processing to display the screen of the form determined by the control unit on a display.
  • In the above display processing device, the analysis unit may analyze, as the trajectory, the portion over which the user moved a finger while touching the operation surface of the input interface.
  • In the above display processing device, the analysis unit may perform motion prediction based on the portion over which the user moved a finger while touching the operation surface, and may analyze, as the trajectory, the portion where the finger actually touched the operation surface combined with the portion obtained by the subsequent motion prediction.
  • In the above display processing device, the screen of the form determined by the control unit may include the plurality of icons and a line indicating the trajectory.
  • In the above display processing device, the screen of the form determined by the control unit may include, in the margin beside each icon, the name of the corresponding function.
  • In the above display processing device, the control unit may determine the number of the plurality of icons based on the length of the trajectory.
  • In the above display processing device, when an operation of selecting one of the plurality of icons is performed, the control unit may change the screen to one in which other icons, related to the function indicated by the selected icon, are arranged on another trajectory similar to the original trajectory.
  • According to the display processing device of the present invention, a plurality of icons can be displayed on the display in a form that matches the characteristics of the user's hand or of the way the user operates the device. As a result, a user holding the device in one hand can operate it with natural finger movement.
  • FIG. 1 is a block diagram showing the internal configuration of the display processing device of an embodiment and the relationship between the display processing device, a touch panel, and a display.
  • FIGS. 2 and 3 are diagrams showing examples of a menu screen in the embodiment.
  • FIG. 4 is a diagram showing an example of the menu screen when there is no margin to the right of the trajectory in which the function names can be displayed; FIG. 5 is a diagram showing an example of a menu screen resulting from an operation by a user with a small hand.
  • FIG. 8 is a diagram showing a state in which a user operating a portable electronic device holds the device with the right hand.
  • A portable electronic device including a display processing device according to the embodiment described below is a device that a user can hold and operate by hand, such as a mobile phone, a smartphone, a PDA, a removable car navigation device, or a portable TV.
  • The device of this embodiment includes, as its input interface, a touch panel bonded to the display surface of a display.
  • The user of the device holds the entire device in the palm of the hand and operates it by touching the touch panel with the tip of the pad of the thumb (hereinafter simply "finger").
  • These operations include a slide operation, in which the user moves the finger while it touches the touch panel, as well as click and flick operations.
  • FIG. 1 is a block diagram illustrating an internal configuration of a display processing device according to an embodiment and a relationship between the display processing device, a touch panel, and a display.
  • The display processing device 100 includes an analysis unit 101, a memory 103, a control unit 105, and a display processing unit 107.
  • The touch panel 51 illustrated in FIG. 1 is bonded to the display surface of the display 53.
  • On the display 53, a menu screen is displayed that shows, as icons, all or part of the functions of the device.
  • An icon is a symbol representing an object that the user can select.
  • The analysis unit 101 analyzes operations performed on the touch panel 51 by the user.
  • The analysis unit 101 first recognizes the portion where the user's finger touches the touch panel 51. If the operation performed on the touch panel 51 is a slide operation, the analysis unit 101 analyzes, as a trajectory, the portion over which the user moved the finger while touching the touch panel 51. Furthermore, the analysis unit 101 may perform finger motion prediction based on that portion, and may analyze, as the trajectory, the portion where the finger actually touched the touch panel 51 combined with the portion obtained by the subsequent motion prediction.
  • The memory 103 stores data on the design of each icon to be displayed on the display 53, various setting information, and the program of each function of the device.
  • The control unit 105 determines the form of a menu screen in which a plurality of icons are arranged along the trajectory analyzed by the analysis unit 101, reads the data on the designs of the icons to be displayed from the memory 103, and sends the data to the display processing unit 107.
  • Among the icons displayed on the display 53, the control unit 105 also executes the function associated with the icon that the user clicks with the finger.
  • The display processing unit 107 performs processing to display each icon on the display 53 according to the form of the menu screen determined by the control unit 105.
  • In this embodiment, a menu screen is displayed when the user performs a slide operation on the touch panel 51 while the device is in a standby state.
  • The analysis unit 101 analyzes the trajectory of the finger on the touch panel 51 from the slide operation, and the control unit 105 instructs the display processing unit 107 to display a plurality of icons in a predetermined order on the trajectory.
  • FIGS. 2 and 3 are diagrams showing examples of the menu screen in the embodiment.
  • Reference numerals 201A in FIG. 2 and 201B in FIG. 3 indicate trajectories along which the users slid a finger on the touch panel 51 before the menu screens were displayed. The user who drew the trajectory 201A of FIG. 2 is different from the user who drew the trajectory 201B of FIG. 3; alternatively, even for the same user, the way of holding the device when the trajectory 201A of FIG. 2 was drawn differs from the way of holding it when the trajectory 201B of FIG. 3 was drawn.
  • The operation performed by the user to display the menu screen is not limited to the slide operations shown in FIGS. 2 and 3; it may be an operation drawing an S shape or a lightning-bolt zigzag.
  • On the menu screen, the trajectory may be displayed in a predetermined form together with the icons.
  • The trajectory may be rendered as a crayon-like line as shown in FIGS. 2 and 3, as a line like the tail of a shooting star, or as a dotted line.
  • The menu screen may also show, in the margin beside each icon, the name of the corresponding function.
  • The control unit 105 analyzes the arrangement of the icons on the menu screen and determines a margin area in which the name of each function can be displayed.
  • FIG. 4 is a diagram showing an example of the menu screen when there is no margin to the right of the trajectory in which the function names can be displayed.
  • In that case, the control unit 105 instructs the display processing unit 107 to display the function names to the left of the trajectory.
  • Whether the trajectory and the function names are displayed on the menu screen is determined by the control unit 105 according to the settings recorded in the memory 103.
  • When the user performs a slide operation to display the menu screen, the analysis unit 101 analyzes the slide operation to determine the trajectory, and the control unit 105 instructs the display processing unit 107 to display the icons in a predetermined order on that trajectory.
  • As a result, the icons constituting the menu screen are displayed on a line that matches the movement of the user's finger, so the user, while holding the device in one hand, can operate the menu screen displayed on the display 53 with natural finger movement.
  • FIG. 5 is a diagram showing an example of a menu screen resulting from an operation by a user with a small hand. As shown in FIG. 5, since the trajectory drawn by a user with a small hand is short, the menu screen is displayed with a reduced number of icons.
  • FIG. 7 is a diagram showing an example of the screen after the user clicks the mail icon when the menu screen shown in FIG. 2 is displayed.
  • When the analysis unit 101 determines that the mail icon has been clicked, the control unit 105 reads the data of the mail-related functions from the memory 103 and instructs the display processing unit 107 to change to a screen in which the icons of those functions are arranged in a predetermined order on a trajectory 201C similar to the trajectory 201A.
  • In the example of FIG. 7, the trajectory 201A is kept at its original size and the trajectory 201C is displayed at a reduced size; alternatively, the trajectory 201A may be reduced and the trajectory 201C displayed at the same size as the original trajectory 201A.
  • The display processing device according to the present invention is useful as a display processing device for a portable electronic device that performs processing to display a plurality of icons on a display.

Abstract

Provided is a display processing device that performs a predetermined action when the user holds a device provided with an input interface having an operating surface and touches the operating surface of the input interface with a finger. The display processing device is equipped with: an analysis unit for analyzing a trace based on a sliding operation performed on the input interface; a control unit for determining a mode of a screen in which a plurality of icons are arranged either on the trace or along the trace; and a display processing unit for processing such that the screen of the mode determined by the control unit is displayed on the display.

Description

Display processing device
The present invention relates to a display processing device that performs processing to display a plurality of icons on a display.
At present, a touch panel formed integrally with a display is used as the input interface in some commercially available portable electronic devices such as mobile phones, smartphones, and PDAs. The user of such a device can perform a desired operation by touching the surface of the touch panel with a finger.
FIG. 8 is a diagram showing a state in which a user operating a portable electronic device holds the device with the right hand. As shown in FIG. 8, the user holds the entire device in the palm and operates it by touching a desired position on the touch panel with the thumb. At this time, various information such as icons is displayed on the display of the device. For example, a menu screen for selecting one of the functions of the device is displayed. When the menu screen is displayed, the user selects, with the thumb or the like, an icon representing the desired function.
International Publication No. WO 2009/090704
FIG. 9 shows an example of the menu screen displayed on the display of the portable electronic device shown in FIG. 8. In the menu screen of FIG. 9, five icons, each representing a function, are arranged along a specific arc. When the user traces with a finger the dotted curve along which the icons are arranged, further icons are displayed in succession on the curve. The example of FIG. 9 shows the change in the menu screen when the curve is traced from the upper right to the lower left. In this way, the user can perform a desired operation mainly by tracing or clicking, with the thumb, the surface of the touch panel formed integrally with the display.
However, the size of the hand varies from person to person, and so does the way the device is held. For this reason, if the positions of the icons arranged on the menu screen and the like are fixed, the natural movement of the thumb does not match the dotted curve of FIG. 9 for some users. Such a user must change the way the device is held, or operate the device by moving the thumb in a cramped manner.
An object of the present invention is to provide a display processing device that performs processing to display a plurality of icons on a display in a form that matches the characteristics of the user's hand or of the way the user operates the device.
The present invention provides a display processing device for a device that has an input interface with an operation surface, is held in the user's palm, and performs a predetermined operation when the user touches the operation surface with a finger. The display processing device includes: an analysis unit that analyzes a trajectory based on a slide operation performed on the input interface; a control unit that determines the form of a screen in which a plurality of icons are arranged on or along the trajectory; and a display processing unit that performs processing to display the screen of the form determined by the control unit on a display.
In the above display processing device, the analysis unit may analyze, as the trajectory, the portion over which the user moved a finger while touching the operation surface of the input interface.
In the above display processing device, the analysis unit may perform motion prediction based on the portion over which the user moved a finger while touching the operation surface, and may analyze, as the trajectory, the portion where the finger actually touched the operation surface combined with the portion obtained by the subsequent motion prediction.
In the above display processing device, the screen of the form determined by the control unit may include the plurality of icons and a line indicating the trajectory.
In the above display processing device, the screen of the form determined by the control unit may include, in the margin beside each icon, the name of the corresponding function.
In the above display processing device, the control unit may determine the number of the plurality of icons based on the length of the trajectory.
In the above display processing device, when an operation of selecting one of the plurality of icons is performed, the control unit may change the screen to one in which other icons, related to the function indicated by the selected icon, are arranged on another trajectory similar to the original trajectory.
According to the display processing device of the present invention, a plurality of icons can be displayed on the display in a form that matches the characteristics of the user's hand or of the way the user operates the device. As a result, a user holding the device in one hand can operate it with natural finger movement.
FIG. 1 is a block diagram showing the internal configuration of a display processing device according to an embodiment and the relationship between the display processing device, a touch panel, and a display.
FIG. 2 is a diagram showing an example of a menu screen in the embodiment.
FIG. 3 is a diagram showing another example of the menu screen in the embodiment.
FIG. 4 is a diagram showing an example of the menu screen when there is no margin to the right of the trajectory in which the function names can be displayed.
FIG. 5 is a diagram showing an example of a menu screen resulting from an operation by a user with a small hand.
FIG. 6 is a diagram showing an example of a menu screen resulting from an operation by a left-handed user.
FIG. 7 is a diagram showing an example of the screen after the user clicks the mail icon while the menu screen of FIG. 2 is displayed.
FIG. 8 is a diagram showing a state in which a user operating a portable electronic device holds the device with the right hand.
FIG. 9 is a diagram showing an example of the menu screen displayed on the display of the portable electronic device of FIG. 8.
Hereinafter, an embodiment of the display processing device according to the present invention will be described with reference to the drawings. A portable electronic device (hereinafter "device") including the display processing device of the embodiment is a device that a user can hold and operate by hand, such as a mobile phone, a smartphone, a PDA, a removable car navigation device, or a portable TV. The device of this embodiment includes, as its input interface, a touch panel bonded to the display surface of a display. The user of the device holds the entire device in the palm of the hand and operates it by touching the touch panel with the tip of the pad of the thumb (hereinafter simply "finger"). These operations include a slide operation, in which the user moves the finger while it touches the touch panel, as well as click and flick operations.
FIG. 1 is a block diagram showing the internal configuration of the display processing device of the embodiment and the relationship between the display processing device, the touch panel, and the display. As shown in FIG. 1, the display processing device 100 includes an analysis unit 101, a memory 103, a control unit 105, and a display processing unit 107. The touch panel 51 shown in FIG. 1 is bonded to the display surface of the display 53. On the display 53, a menu screen is displayed that shows, as icons, all or part of the functions of the device. An icon is a symbol representing an object that the user can select.
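As a rough structural sketch of the four components of FIG. 1 (hypothetical Python, not part of the patent; all class, method, and field names are invented for illustration):

```python
from dataclasses import dataclass, field

@dataclass
class Memory:
    """Stands in for memory 103: icon designs and setting information."""
    icon_designs: dict = field(default_factory=dict)
    settings: dict = field(default_factory=dict)

class AnalysisUnit:
    """Stands in for analysis unit 101: turns touch samples into a trajectory."""
    def analyze(self, touch_points):
        # In the simplest case, the traced points themselves are the trajectory.
        return list(touch_points)

class DisplayProcessingUnit:
    """Stands in for display processing unit 107: renders the screen form."""
    def render(self, screen_form):
        # A real device would draw to display 53; here we just return the form.
        return screen_form

class ControlUnit:
    """Stands in for control unit 105: decides the menu-screen form."""
    def __init__(self, memory, display_processing):
        self.memory = memory
        self.display_processing = display_processing

    def build_menu(self, trajectory, icon_names):
        # Read icon designs from memory and hand the screen form to rendering.
        designs = [self.memory.icon_designs.get(n, n) for n in icon_names]
        screen_form = {"trajectory": trajectory, "icons": designs}
        return self.display_processing.render(screen_form)
```

For example, `ControlUnit(Memory(), DisplayProcessingUnit()).build_menu(points, ["mail", "camera"])` returns a screen form pairing the traced points with the icons to arrange on them.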
Hereinafter, each component of the display processing device 100 will be described.
The analysis unit 101 analyzes operations performed on the touch panel 51 by the user. The analysis unit 101 first recognizes the portion where the user's finger touches the touch panel 51. If the operation performed on the touch panel 51 is a slide operation, the analysis unit 101 analyzes, as a trajectory, the portion over which the user moved the finger while touching the touch panel 51. Furthermore, the analysis unit 101 may perform finger motion prediction based on that portion, and may analyze, as the trajectory, the portion where the finger actually touched the touch panel 51 combined with the portion obtained by the subsequent motion prediction.
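The motion-prediction variant is left unspecified in the patent; one minimal way to sketch it is linear extrapolation of the last sampled velocity (the function name and step count below are invented for illustration):

```python
def predict_extension(points, steps=3):
    """Extend a traced trajectory by simple linear motion prediction.

    `points` is a list of (x, y) samples where the finger actually
    touched the panel. The velocity of the last segment is continued
    for `steps` further samples, a deliberately crude stand-in for
    whatever motion-prediction scheme the device actually uses.
    """
    if len(points) < 2:
        return list(points)
    (x0, y0), (x1, y1) = points[-2], points[-1]
    dx, dy = x1 - x0, y1 - y0
    predicted = [(x1 + dx * i, y1 + dy * i) for i in range(1, steps + 1)]
    # The analyzed trajectory is the touched portion plus the predicted portion.
    return list(points) + predicted
```

A smoother predictor (e.g. fitting an arc to the last few samples) would likely match thumb motion better; the linear version only illustrates the combination of touched and predicted portions.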
The memory 103 stores data on the design of each icon to be displayed on the display 53, various setting information, and the program of each function of the device. The control unit 105 determines the form of a menu screen in which a plurality of icons are arranged along the trajectory analyzed by the analysis unit 101, reads the data on the designs of the icons to be displayed from the memory 103, and sends the data to the display processing unit 107. Among the icons displayed on the display 53, the control unit 105 also executes the function associated with the icon that the user clicks with the finger. When the menu screen is displayed on the display 53, the display processing unit 107 performs processing to display each icon on the display 53 according to the form of the menu screen determined by the control unit 105.
In this embodiment, when the device is in a standby state, a menu screen is displayed when the user performs a slide operation on the touch panel 51. At this time, the analysis unit 101 analyzes the trajectory of the finger on the touch panel 51 from the slide operation, and the control unit 105 instructs the display processing unit 107 to display a plurality of icons in a predetermined order on the trajectory. FIGS. 2 and 3 are diagrams showing examples of the menu screen in this embodiment. Reference numerals 201A in FIG. 2 and 201B in FIG. 3 indicate the trajectories along which the users slid a finger on the touch panel 51 before the menu screens were displayed. The user who drew the trajectory 201A of FIG. 2 is different from the user who drew the trajectory 201B of FIG. 3; alternatively, even for the same user, the way of holding the device when the trajectory 201A of FIG. 2 was drawn differs from the way of holding it when the trajectory 201B of FIG. 3 was drawn.
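Arranging icons "in a predetermined order on the trajectory" implies distributing them along the traced polyline; one plausible sketch is equal arc-length spacing (the patent does not specify the spacing rule, so this is an assumption):

```python
import math

def place_icons(trajectory, n_icons):
    """Place n_icons at equal arc-length intervals along a polyline trajectory.

    `trajectory` is a list of (x, y) points; returns one (x, y)
    position per icon, from the start of the trace to its end.
    """
    # Cumulative arc length at each trajectory vertex.
    cum = [0.0]
    for (x0, y0), (x1, y1) in zip(trajectory, trajectory[1:]):
        cum.append(cum[-1] + math.hypot(x1 - x0, y1 - y0))
    total = cum[-1]
    positions = []
    for i in range(n_icons):
        target = total * i / (n_icons - 1) if n_icons > 1 else 0.0
        # Find the segment containing the target length, then interpolate.
        j = max(k for k in range(len(cum)) if cum[k] <= target)
        j = min(j, len(trajectory) - 2)
        seg = cum[j + 1] - cum[j]
        t = (target - cum[j]) / seg if seg else 0.0
        (x0, y0), (x1, y1) = trajectory[j], trajectory[j + 1]
        positions.append((x0 + t * (x1 - x0), y0 + t * (y1 - y0)))
    return positions
```

Because the spacing is computed in arc length rather than by vertex index, the icons stay evenly spread even when the touch samples are unevenly clustered along the stroke.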
The operation performed by the user to display the menu screen is not limited to the slide operations shown in FIGS. 2 and 3; it may be an operation drawing an S shape or a lightning-bolt zigzag. On the menu screen, the trajectory may be displayed in a predetermined form together with the icons. The trajectory may be rendered as a crayon-like line as shown in FIGS. 2 and 3, as a line like the tail of a shooting star, or as a dotted line. The menu screen may also show, in the margin beside each icon, the name of the corresponding function. In that case, the control unit 105 analyzes the arrangement of the icons on the menu screen and determines a margin area in which the function names can be displayed. FIG. 4 shows an example of the menu screen when there is no margin to the right of the trajectory in which the function names can be displayed. As shown in FIG. 4, when there is not enough margin to the right of the trajectory but there is to the left, the control unit 105 instructs the display processing unit 107 to display the function names to the left of the trajectory. Whether the trajectory and the function names are displayed on the menu screen is determined by the control unit 105 according to the settings recorded in the memory 103.
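The left/right decision for the function names could be sketched as a simple margin check against the screen edge (a hypothetical illustration; the patent only describes the fallback behaviour of FIG. 4, not a concrete test):

```python
def label_side(trajectory, screen_width, label_width):
    """Choose which side of the trajectory to put function names on.

    Returns 'right' if the rightmost trajectory point still leaves at
    least `label_width` of margin to the right screen edge, otherwise
    'left' (mirroring the FIG. 4 fallback).
    """
    rightmost = max(x for x, _ in trajectory)
    if screen_width - rightmost >= label_width:
        return "right"
    return "left"
```

A fuller implementation would test the margin beside each icon individually; checking only the rightmost point is the coarsest version of the same idea.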
As described above, in this embodiment, when the user performs a slide operation on the touch panel 51 to display the menu screen on the display 53, the analysis unit 101 analyzes the slide operation to determine the trajectory, and the control unit 105 instructs the display processing unit 107 to display the icons in a predetermined order on that trajectory. As a result, the icons constituting the menu screen are displayed on a line that matches the movement of the user's finger, so the user, while holding the device in one hand, can operate the menu screen displayed on the display 53 with natural finger movement.
 In the examples shown in FIGS. 2 and 3, five icons are displayed on the trajectory. However, the number of icons on the menu screen is not limited to five and may vary with the length of the trajectory: the control unit 105 determines the number of icons based on the length of the trajectory determined by the analysis unit 101. FIG. 5 shows an example of the menu screen produced by a user with a small hand. As shown in FIG. 5, because the trajectory drawn by a small hand is short, the menu screen is displayed with fewer icons.
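The length-to-count rule itself is not given in the patent, but a simple version can be sketched as one icon per fixed amount of arc length, clamped to a sensible range. The pitch and the clamp bounds below are assumed values.

```python
def icon_count(track_length, icon_pitch=60, max_icons=5, min_icons=2):
    """Hypothetical rule: one icon per `icon_pitch` pixels of trajectory,
    clamped to [min_icons, max_icons]. All constants are assumptions."""
    return max(min_icons, min(max_icons, int(track_length // icon_pitch) + 1))
```

A long slide would then produce the full five-icon menu of FIGS. 2 and 3, while the short slide of FIG. 5 would produce fewer icons.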
 When the user performs the slide operation with a finger of the left hand, the icons are displayed on a trajectory whose arc curves in the opposite direction to those of FIGS. 2 to 5 and FIG. 7, as shown in FIG. 6. The user can therefore operate the device with natural finger movements regardless of which hand is dominant.
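Because the left-hand trajectory of FIG. 6 is simply the mirror image of the right-hand arc, no separate layout logic is needed: the same icon placement applies to the horizontally reflected points. A sketch, with the screen width assumed:

```python
def mirror_trajectory(trajectory, screen_w=480):
    """Reflect a trajectory horizontally, as for the left-hand arc of
    FIG. 6. `screen_w` is an assumed display width."""
    return [(screen_w - x, y) for x, y in trajectory]
```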
 FIG. 7 shows an example of the screen after the user clicks the mail icon while the menu screen of FIG. 2 is displayed. As shown in FIG. 7, when the analysis unit 101 determines that the mail icon has been clicked, the control unit 105 reads the data of the mail-related functions from the memory 103 and instructs the display processing unit 107 to change to a screen in which the icons of those functions are arranged in a predetermined order on a trajectory 201C that is geometrically similar to the trajectory 201A. In the example of FIG. 7, the trajectory 201A keeps its original size and the trajectory 201C is displayed as a reduced copy of it; alternatively, the trajectory 201A may be reduced and the trajectory 201C displayed at the same size as the original trajectory 201A.
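A geometrically similar sub-menu trajectory like 201C can be obtained by scaling the original point set about its centroid. This is one of several possible constructions (the patent does not fix the scaling center or factor), so both are labeled as assumptions here.

```python
def similar_trajectory(trajectory, scale=0.7):
    """Shrink a trajectory about its centroid to get a geometrically
    similar curve for the sub-menu (center and scale are assumptions)."""
    cx = sum(x for x, _ in trajectory) / len(trajectory)
    cy = sum(y for _, y in trajectory) / len(trajectory)
    return [(cx + (x - cx) * scale, cy + (y - cy) * scale)
            for x, y in trajectory]
```

Applying `scale < 1` gives the reduced trajectory 201C of FIG. 7; applying the inverse scale to a reduced 201A would give the alternative form described above.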
 Although this embodiment has been described using a device with a built-in touch panel as an example, the invention is also applicable to devices in which the touch panel or touch pad serving as the input interface is provided as a separate body, such as a remote control.
 Although the present invention has been described in detail and with reference to specific embodiments, it will be apparent to those skilled in the art that various changes and modifications can be made without departing from the spirit and scope of the invention.
 This application is based on Japanese Patent Application No. 2011-087607, filed April 11, 2011, the contents of which are incorporated herein by reference.
 The display processing device according to the present invention is useful as, for example, a display processing device for a portable electronic device that performs processing to display a plurality of icons on a display.
DESCRIPTION OF REFERENCE SYMBOLS
100 Display processing device
101 Analysis unit
103 Memory
105 Control unit
107 Display processing unit
51 Touch panel
53 Display

Claims (7)

  1.  A display processing device that performs a predetermined operation when a user, holding in the palm of a hand a device provided with an input interface having an operation surface, touches the operation surface of the input interface with a finger, the display processing device comprising:
     an analysis unit that analyzes a trajectory based on a slide operation performed on the input interface;
     a control unit that determines a form of a screen in which a plurality of icons are arranged on or along the trajectory; and
     a display processing unit that performs processing to display, on a display, the screen of the form determined by the control unit.
  2.  The display processing device according to claim 1, wherein the analysis unit analyzes, as the trajectory, the portion over which the user moved a finger while touching the operation surface of the input interface.
  3.  The display processing device according to claim 1, wherein the analysis unit performs motion prediction based on the portion over which the user moved a finger while touching the operation surface of the input interface, and analyzes, as the trajectory, the combination of the portion the finger actually touched and the portion obtained by the subsequent motion prediction.
  4.  The display processing device according to any one of claims 1 to 3, wherein the screen of the form determined by the control unit includes the plurality of icons and a line indicating the trajectory.
  5.  The display processing device according to any one of claims 1 to 4, wherein the screen of the form determined by the control unit includes the name of the corresponding function, arranged in a blank area beside each icon.
  6.  The display processing device according to any one of claims 1 to 5, wherein the control unit determines the number of the plurality of icons based on the length of the trajectory.
  7.  The display processing device according to any one of claims 1 to 6, wherein, when an operation of selecting one of the plurality of icons is performed, the control unit changes the screen to one in which other icons relating to the function indicated by the selected icon are arranged on another trajectory geometrically similar to the trajectory.
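The motion prediction of claim 3 — extending the trajectory beyond the portion the finger actually touched — could take many forms; the simplest is constant-velocity extrapolation of the last two touch samples. The sketch below is only an illustration of the claim's idea, not the claimed prediction method, and the step count is an assumption.

```python
def predict_trajectory(samples, n_extra=5):
    """Claim 3 sketch: extend the touched portion of the trajectory by
    constant-velocity extrapolation of the last two (x, y) samples.
    The prediction model and n_extra are assumptions."""
    (x0, y0), (x1, y1) = samples[-2], samples[-1]
    dx, dy = x1 - x0, y1 - y0
    return samples + [(x1 + dx * i, y1 + dy * i)
                      for i in range(1, n_extra + 1)]
```

The combined list — touched samples plus predicted continuation — corresponds to the trajectory the analysis unit would hand to the control unit under claim 3.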
PCT/JP2012/002513 2011-04-11 2012-04-11 Display processing device WO2012140883A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2011-087607 2011-04-11
JP2011087607A JP2014123159A (en) 2011-04-11 2011-04-11 Display processor

Publications (1)

Publication Number Publication Date
WO2012140883A1 true WO2012140883A1 (en) 2012-10-18

Family

ID=47009074

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2012/002513 WO2012140883A1 (en) 2011-04-11 2012-04-11 Display processing device

Country Status (2)

Country Link
JP (1) JP2014123159A (en)
WO (1) WO2012140883A1 (en)

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103096157A (en) * 2013-01-11 2013-05-08 北京奇艺世纪科技有限公司 Method for controlling moving focal point on television application interface by mobile phone
WO2014121523A1 (en) * 2013-02-08 2014-08-14 Motorola Solutions, Inc. Method and apparatus for managing user interface elements on a touch-screen device
CN104156148A (en) * 2014-07-18 2014-11-19 百度在线网络技术(北京)有限公司 Method and device for providing virtual keyboards for mobile equipment
JP2014219906A (en) * 2013-05-10 2014-11-20 富士ゼロックス株式会社 Information processing apparatus and information processing program
CN104657029A (en) * 2013-11-21 2015-05-27 深圳市九洲电器有限公司 Method and system for arranging interface icons
CN107577357A (en) * 2017-08-18 2018-01-12 中山叶浪智能科技有限责任公司 A kind of automatic matching method and system for inputting information
CN107765976A (en) * 2016-08-16 2018-03-06 腾讯科技(深圳)有限公司 A kind of information push method, terminal and system

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6604274B2 (en) * 2016-06-15 2019-11-13 カシオ計算機株式会社 Output control device, output control method, and program
JP2020035368A (en) * 2018-08-31 2020-03-05 株式会社Jvcケンウッド Display control device, display unit, display control method, and program
JP7420016B2 (en) 2020-08-27 2024-01-23 株式会社リコー Display device, display method, program, display system

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2008003671A (en) * 2006-06-20 2008-01-10 Sharp Corp Electronic device, and operation method of electronic device
JP2010079442A (en) * 2008-09-24 2010-04-08 Toshiba Corp Mobile terminal
JP2010108081A (en) * 2008-10-28 2010-05-13 Sharp Corp Menu display device, method of controlling the menu display device, and menu display program

Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103096157A (en) * 2013-01-11 2013-05-08 北京奇艺世纪科技有限公司 Method for controlling moving focal point on television application interface by mobile phone
CN103096157B (en) * 2013-01-11 2016-06-01 北京奇艺世纪科技有限公司 A kind of method utilizing mobile phone to control moving focal point on TV applications interface
CN104969166A (en) * 2013-02-08 2015-10-07 摩托罗拉解决方案公司 Method and apparatus for managing user interface elements on a touch-screen device
WO2014121523A1 (en) * 2013-02-08 2014-08-14 Motorola Solutions, Inc. Method and apparatus for managing user interface elements on a touch-screen device
US10019151B2 (en) 2013-02-08 2018-07-10 Motorola Solutions, Inc. Method and apparatus for managing user interface elements on a touch-screen device
JP2014219906A (en) * 2013-05-10 2014-11-20 富士ゼロックス株式会社 Information processing apparatus and information processing program
CN104657029A (en) * 2013-11-21 2015-05-27 深圳市九洲电器有限公司 Method and system for arranging interface icons
CN104156148B (en) * 2014-07-18 2018-05-08 百度在线网络技术(北京)有限公司 A kind of method and apparatus for being used to provide the dummy keyboard in mobile equipment
CN104156148A (en) * 2014-07-18 2014-11-19 百度在线网络技术(北京)有限公司 Method and device for providing virtual keyboards for mobile equipment
CN107765976A (en) * 2016-08-16 2018-03-06 腾讯科技(深圳)有限公司 A kind of information push method, terminal and system
CN107765976B (en) * 2016-08-16 2021-12-14 腾讯科技(深圳)有限公司 Message pushing method, terminal and system
CN107577357A (en) * 2017-08-18 2018-01-12 中山叶浪智能科技有限责任公司 A kind of automatic matching method and system for inputting information
CN107577357B (en) * 2017-08-18 2018-07-06 中山叶浪智能科技有限责任公司 A kind of automatic matching method and system for inputting information

Also Published As

Publication number Publication date
JP2014123159A (en) 2014-07-03

Similar Documents

Publication Publication Date Title
WO2012140883A1 (en) Display processing device
JP5708644B2 (en) Information processing terminal and control method thereof
US10795558B2 (en) Device, method, and graphical user interface for providing and interacting with a virtual drawing aid
US8994646B2 (en) Detecting gestures involving intentional movement of a computing device
US8982045B2 (en) Using movement of a computing device to enhance interpretation of input events produced when interacting with the computing device
KR101224588B1 (en) Method for providing UI to detect a multi-point stroke and multimedia apparatus thereof
EP2584434A1 (en) Information processing terminal and method for controlling operation thereof
US20140022193A1 (en) Method of executing functions of a terminal including pen recognition panel and terminal supporting the method
US8456433B2 (en) Signal processing apparatus, signal processing method and selection method of user interface icon for multi-touch panel
EP2851782A2 (en) Touch-based method and apparatus for sending information
CN103365552A (en) Bookmark setting method of e-book, and apparatus thereof
TWI659353B (en) Electronic apparatus and method for operating thereof
US20140285445A1 (en) Portable device and operating method thereof
TW201331812A (en) Electronic apparatus and method for controlling the same
JP5092985B2 (en) Content decoration apparatus, content decoration method, and content decoration program
CN103927114A (en) Display method and electronic equipment
US20140359541A1 (en) Terminal and method for controlling multi-touch operation in the same
JP6057441B2 (en) Portable device and input method thereof
KR20130102670A (en) For detailed operation of the touchscreen handset user-specific finger and touch pen point contact location method and system for setting
JP2014153916A (en) Electronic apparatus, control method, and program
EP3433713B1 (en) Selecting first digital input behavior based on presence of a second, concurrent, input
US20140210732A1 (en) Control Method of Touch Control Device
US20150100912A1 (en) Portable electronic device and method for controlling the same
JP3175877U (en) Touch pen for multi-function mobile terminal screen
US20140143726A1 (en) Method of choosing software button

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 12771037

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 12771037

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP