WO2013082881A1 - Procédé de commande d'affichage de bureau et terminal mobile - Google Patents

Method for desktop display control and mobile terminal

Info

Publication number
WO2013082881A1
WO2013082881A1 (PCT/CN2012/070927)
Authority
WO
WIPO (PCT)
Prior art keywords
area
display
target
control
user
Prior art date
Application number
PCT/CN2012/070927
Other languages
English (en)
Chinese (zh)
Inventor
柳鲲鹏
黄连芳
房稳
Original Assignee
中兴通讯股份有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 中兴通讯股份有限公司 filed Critical 中兴通讯股份有限公司
Publication of WO2013082881A1 publication Critical patent/WO2013082881A1/fr

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488: Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04883: Interaction techniques using a touch-screen or digitiser for inputting data by handwriting, e.g. gesture or text
    • G06F 3/04886: Interaction techniques using a touch-screen or digitiser, by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
    • G06F 2203/00: Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F 2203/048: Indexing scheme relating to G06F3/048
    • G06F 2203/04806: Zoom, i.e. interaction techniques or interactors for controlling the zooming operation

Definitions

  • The present invention relates to the field of mobile terminals and, in particular, to a method for performing desktop display control and a corresponding mobile terminal.

Background Art
  • Touch-screen mobile terminals have come into wide use, with touch-screen operations gradually replacing conventional key operations.
  • Existing touch-screen phones support tap, slide, and other operations.
  • When the screen of a touch-screen terminal is large (for example, 4.3 inches), it is inconvenient to tap some of the icons or virtual buttons with one hand, which may result in erroneous operation.

Summary of the Invention
  • The technical problem to be solved by the present invention is to provide a method for performing desktop display control and a mobile terminal, making it convenient for the user to touch the touch screen accurately.
  • To solve the above problem, the present invention adopts the following technical solutions:
  • A method for performing desktop display control, including:
  • after the mobile terminal detects a touch-screen event of the user on the touch display screen, it determines a target operation area according to the touch-screen event and performs an enlarged, reduced, or panned display of the target operation area;
  • the target operation area is the whole desktop or a partial desktop.
  • after the mobile terminal detects a long-press operation in the touch-screen event and determines that the starting point of the long press corresponds to a non-control position on the display screen, it takes the entire desktop as the target operation area selected by the user;
  • after the mobile terminal detects a long-press operation in the touch-screen event and determines that the starting point of the long press corresponds to the position of a control on the display screen, it takes the control area corresponding to that position as the target operation area selected by the user; or,
  • after the mobile terminal detects that the historical points in the touch-screen event form a closed area, it takes the corresponding closed area on the desktop as the target operation area selected by the user.
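The three alternative selection rules above can be sketched as a small dispatcher. This is an illustrative sketch only: the event shape, the hit-test over control rectangles, and the distance threshold for treating a traced path as closed are assumptions, not details given in the patent.

```python
def is_closed_path(points, tolerance=10.0):
    """Treat a traced path as a closed area if its end returns near its start."""
    if len(points) < 3:
        return False
    (x0, y0), (x1, y1) = points[0], points[-1]
    return ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5 <= tolerance

def select_target_area(event, controls):
    """Return the target operation area implied by a touch-screen event.

    event: dict with 'kind' ('long_press' or 'move') and either
           'start' (x, y) for a long press or 'points' for a traced path.
    controls: list of (name, (left, top, right, bottom)) control rectangles.
    """
    if event["kind"] == "long_press":
        x, y = event["start"]
        for name, (l, t, r, b) in controls:
            if l <= x <= r and t <= y <= b:
                return ("control", name)        # long press on a control
        return ("whole_desktop", None)          # long press on empty desktop
    if event["kind"] == "move" and is_closed_path(event["points"]):
        return ("closed_area", event["points"])  # user traced a closed area
    return (None, None)                          # no target area selected
```

For example, a long press at (5, 5) with no control there selects the whole desktop, while a roughly closed trace selects the circled area.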
  • The method also includes:
  • the step of performing an enlarged or reduced display of the target operation area includes: after determining the target operation area, the mobile terminal displays a zoom-out control and a zoom-in control on the touch display screen; after a short-press operation is detected at the position of the zoom-out control, the target operation area is displayed reduced, and after a short-press operation is detected at the position of the zoom-in control, the target operation area is displayed enlarged; or,
  • after determining the target operation area, when the mobile terminal detects a multi-point movement operation in which the contact points move toward each other, it reduces the display of the target operation area in proportion to the stroke length; when it detects a multi-point movement in which the contact points move apart, it enlarges the display of the target operation area in proportion to the stroke length.
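As a rough illustration of the pinch behaviour just described, the scale applied to the target operation area can be derived from the change in separation between the two contact points; the clamping bounds below are an added assumption, not part of the patent.

```python
import math

def pinch_scale(p1_start, p2_start, p1_end, p2_end, min_scale=0.25, max_scale=4.0):
    """Scale factor for the target area from a two-finger gesture.

    Fingers moving toward each other shrink the area (scale < 1);
    fingers moving apart enlarge it (scale > 1), in proportion to
    the change in finger separation, as described above.
    """
    d_start = math.dist(p1_start, p2_start)
    d_end = math.dist(p1_end, p2_end)
    if d_start == 0:
        return 1.0                              # degenerate gesture: no change
    scale = d_end / d_start
    return max(min_scale, min(max_scale, scale))  # clamp to sane bounds
```

Halving the finger separation yields a scale of 0.5 (reduced display); doubling it yields 2.0 (enlarged display).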
  • the step of performing a panned display of the target operation area includes: after detecting a movement operation, panning the display of the target operation area in the same direction as the movement operation and by the corresponding swipe length.
  • A mobile terminal for performing desktop display control includes a central processing module, a user interface management module, and a human-machine interface module, where:
  • the human-machine interface module is configured to detect the user's operations on the touch display screen and is responsible for displaying the desktop on the touch display screen;
  • the central processing module is configured to learn, through the human-machine interface module, of a touch-screen event of the user on the touch display screen and to determine a target operation area according to the touch-screen event: after the human-machine interface module detects a long-press operation in the touch-screen event whose starting point corresponds to a non-control position on the display screen, the entire desktop is taken as the target operation area selected by the user;
  • after the human-machine interface module detects a long-press operation whose starting point corresponds to the position of a control on the display screen, the control area corresponding to that position is taken as the target operation area selected by the user; or,
  • after the human-machine interface module detects that the historical points in the touch-screen event form a closed area, the corresponding closed area on the desktop is taken as the target operation area selected by the user.
  • The central processing module is further configured to control the user interface management module to display the target operation area enlarged or reduced in the following manner:
  • the user interface management module displays a zoom-out control and a zoom-in control on the touch display screen; after a short-press operation is detected by the human-machine interface module at the position of the zoom-out control, the user interface management module displays the target operation area reduced, and after a short-press operation is detected at the position of the zoom-in control, the user interface management module displays the target operation area enlarged; or,
  • after the target operation area is determined, when a multi-point movement of the contact points toward each other is detected by the human-machine interface module, the user interface management module displays the target operation area reduced in proportion to the stroke length; when a multi-point movement of the contact points apart is detected, the user interface management module displays the target operation area enlarged in proportion to the stroke length.
  • The central processing module is further configured to control the user interface management module to pan the display of the target operation area in the following manner:
  • the user interface management module pans the display of the target operation area in the same direction as the movement operation and by the corresponding swipe length.
  • With the above solution, the entire desktop or a partial desktop can be dragged, enlarged, or reduced, and the area the user needs to touch can be placed at a position convenient for clicking, thereby avoiding erroneous operation and improving the user experience.
  • FIG. 1 is a structural block diagram of the components of a mobile terminal in an embodiment;
  • FIG. 2 is a schematic diagram of the new function options of a mobile terminal in an embodiment;
  • FIG. 3 is a schematic diagram of the original screen display of the mobile terminal in the first example;
  • FIG. 4 is a schematic diagram of scaling the desktop in the first example;
  • FIG. 6 is a schematic diagram of the position of the original virtual keyboard of the mobile terminal on the desktop in the second example;
  • FIG. 7 is a schematic diagram of the position of the virtual keyboard of the mobile terminal after being enlarged on the desktop in the second example;
  • FIG. 8 is a schematic diagram of the position of the virtual keyboard of the mobile terminal after being translated on the desktop in the second example.
  • The mobile terminal in the present invention includes a human-machine interface module 101, a user interface management module 102, a central processing module 103, and a program storage module 104, where:
  • the human-machine interface module 101 is configured to detect the user's operations on the touch display screen and is responsible for displaying the desktop on the touch display screen; it also calls the pictures and interfaces in the program storage module 104, displays the corresponding interface on the screen, and waits for the user to operate.
  • the user interface management module 102 is configured to control, according to instructions from the central processing module 103, the human-machine interface module 101 to display the target operation area; it supports enlarged, reduced, and panned display of the whole desktop as well as of a partial desktop.
  • the central processing module 103 is configured to learn, through the human-machine interface module 101, of a touch-screen event of the user on the touch display screen, determine a target operation area according to the touch-screen event, and control the user interface management module 102 to display the target operation area enlarged, reduced, or panned; the target operation area is the whole desktop or a partial desktop.
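The division of labour among the three modules can be sketched as follows. The class and method names are hypothetical, and the event format is simplified to a pre-classified (action, area) pair; the patent itself does not specify interfaces at this level.

```python
class HumanInterfaceModule:
    """Detects touch operations and draws the desktop (illustrative stand-in for module 101)."""
    def __init__(self):
        self.events = []          # queue of detected touch-screen events
    def poll_event(self):
        return self.events.pop(0) if self.events else None

class UserInterfaceManagementModule:
    """Applies enlarge/reduce/pan to the target area on instruction (module 102)."""
    def __init__(self):
        self.last_action = None
    def apply(self, action, area):
        self.last_action = (action, area)   # a real module would redraw here

class CentralProcessingModule:
    """Maps touch-screen events to display actions for the UI manager (module 103)."""
    def __init__(self, hmi, uim):
        self.hmi, self.uim = hmi, uim
    def step(self):
        event = self.hmi.poll_event()
        if event is None:
            return
        action, area = event       # e.g. ("zoom_in", "whole_desktop")
        self.uim.apply(action, area)
```

Wiring the three together, an event queued on the human-machine interface module flows through the central processing module to the user interface management module.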
  • The user operates the touch screen so that the mobile terminal learns the user's target operation area.
  • When the central processing module 103 detects, through the human-machine interface module 101, a long-press operation in the touch-screen event and determines that the starting point of the long press corresponds to a non-control position on the display screen, the entire desktop is taken as the target operation area selected by the user.
  • When the central processing module 103 detects, through the human-machine interface module 101, a long-press operation in the touch-screen event and determines that the starting point of the long press corresponds to the position of a control on the display screen, the control area corresponding to that position is taken as the target operation area selected by the user.
  • After determining the target operation area, the central processing module 103 displays it in a selected state. There are several ways to display the selected state, such as drawing a border at the edge of the target operation area, rendering the target operation area with a distinct color or transparency, or otherwise indicating it.
  • The user can zoom the target operation area by operating the touch screen in the following two ways.
  • In the first way, after determining the target operation area, the central processing module 103 displays, through the user interface management module 102, a zoom-out control and a zoom-in control on the touch display screen. After a short-press operation is detected by the human-machine interface module 101 at the position of the zoom-out control, the user interface management module 102 displays the target operation area reduced; after a short-press operation is detected at the position of the zoom-in control, the user interface management module 102 displays the target operation area enlarged.
  • The method for performing desktop display control includes: after the mobile terminal detects a touch-screen event of the user on the touch display screen, it determines a target operation area according to the touch-screen event and performs an enlarged, reduced, or panned display of the target operation area;
  • the target operation area is the whole desktop or a partial desktop.
  • When the mobile terminal determines that the starting point of the long-press operation corresponds to a non-control position on the display screen, it takes the entire desktop as the target operation area selected by the user;
  • after the mobile terminal detects that the historical points in the touch-screen event form a closed area, it takes the corresponding closed area on the desktop as the target operation area selected by the user.
  • After the target operation area is determined, it is displayed in a selected state.
  • There are several ways to display the selected state, such as drawing a border at the edge of the target operation area, rendering the target operation area with a distinct color or transparency effect, or otherwise indicating it.
  • The user can move two fingers toward each other on the display screen to indicate that the target operation area should be reduced, or move two fingers apart to indicate that it should be enlarged.
  • After detecting a multi-point movement of the contact points toward each other, the mobile terminal displays the target operation area reduced in proportion to the stroke length; after detecting a multi-point movement of the contact points apart, it displays the target operation area enlarged in proportion to the stroke length.
  • The user can swipe one finger in the target direction, or swipe multiple fingers in the same direction, to indicate the direction in which the user wishes to pan.
  • The target operation area is then panned in the same direction as the movement operation and by the corresponding swipe length.
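A minimal sketch of the panning rule above: the target area is translated by the same vector as the swipe. Rectangles are assumed here to be (left, top, right, bottom) tuples; this representation is an illustrative choice, not taken from the patent.

```python
def pan_offset(start, end):
    """Translation vector: same direction and length as the user's swipe."""
    return (end[0] - start[0], end[1] - start[1])

def pan_area(rect, start, end):
    """Translate a target-area rectangle (left, top, right, bottom) by the swipe vector."""
    dx, dy = pan_offset(start, end)
    l, t, r, b = rect
    return (l + dx, t + dy, r + dx, b + dy)
```

A swipe from (10, 10) to (60, 40) moves the whole target area 50 pixels right and 30 pixels down.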
  • The central processing module converts the user's interface operation into the corresponding operation of the graphics management system, invokes the corresponding interface management module, refreshes the display result into the screen buffer, and shows it on the display screen.
  • The new function options in the embodiments of the present invention are panning the entire desktop, panning a partial desktop, scaling the entire desktop, and scaling a partial desktop. They are described in detail below through specific processes.
  • Example 1: the process of the user panning the entire desktop includes:
  • Step 1: the user long-presses a non-control position on the desktop;
  • Step 2: on detecting a long-press operation in the touch-screen event whose position is a non-control position, determine whether operation on the entire desktop is allowed (i.e., whether the option of panning or scaling the entire desktop is included); if so, perform the next step; otherwise, process according to the normal flow;
  • Step 3: display the entire desktop as selected;
  • Step 4: on detecting a panning operation of one contact, or of multiple contacts in the same direction, determine whether a panning operation on the desktop is allowed (i.e., whether the option of panning the entire desktop is included); if so, perform the next step; otherwise, process according to the normal flow;
  • Step 5: record the starting point and the current end point of the translation, and draw the visual effect of the desktop translation.
  • Example 2: the process of the user scaling the entire desktop includes:
  • Step 1: the user long-presses a non-control position on the desktop;
  • Step 2: on detecting a long-press operation in the touch-screen event whose position is a non-control position, determine whether operation on the entire desktop is allowed (i.e., whether the option of panning or scaling the entire desktop is included); if so, perform the next step; otherwise, process according to the normal flow;
  • Step 3: display the entire desktop as selected;
  • Step 4: on detecting a translation operation of two contacts in opposite directions, determine whether a zoom operation on the desktop is allowed (i.e., whether the option of scaling the entire desktop is included); if so, perform the next step; otherwise, process according to the normal flow;
  • Step 5: record the starting point and the current end point of the gesture, and draw the visual effect of the desktop zoom.
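The option checks in Examples 1 and 2 amount to a small dispatcher. Option names such as `pan_whole` and `zoom_whole` are invented labels for the "pan entire desktop" and "scale entire desktop" settings; the fallback string `"normal"` stands for the ordinary event handling mentioned in the steps above.

```python
def handle_whole_desktop_gesture(options, gesture):
    """Dispatch Examples 1 and 2: pan or zoom the whole desktop.

    options: set of enabled features, e.g. {"pan_whole", "zoom_whole"}.
    gesture: "pan" (one contact, or several moving the same way) or
             "pinch" (two contacts moving toward/away from each other).
    Returns the action to draw, or "normal" for ordinary processing.
    """
    if not options & {"pan_whole", "zoom_whole"}:
        return "normal"            # whole-desktop operations disabled (Step 2)
    if gesture == "pan" and "pan_whole" in options:
        return "draw_pan"          # Example 1, Step 5
    if gesture == "pinch" and "zoom_whole" in options:
        return "draw_zoom"         # Example 2, Step 5
    return "normal"                # gesture not enabled: normal flow
```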
  • Example 3: the process of the user panning a partial desktop includes:
  • Step 1: the user traces a closed area or long-presses a control;
  • Step 2: on detecting a movement operation in the touch-screen event whose historical points form a closed area, or a long-press operation whose position is a control position, determine whether operation on the partial desktop is allowed (i.e., whether the option of panning or scaling a partial desktop is included); if so, perform the next step; otherwise, process according to the normal flow;
  • Step 5: record the starting point and the current end point of the translation, and draw the visual effect of the translation of the partial desktop (i.e., this control or this closed area).
  • Example 4: the process of the user scaling a partial desktop includes:
  • Step 2: on detecting a movement operation in the touch-screen event whose historical points form a closed area, or a long-press operation whose position is a control position, determine whether operation on the partial desktop is allowed (i.e., whether the option of panning or scaling a partial desktop is included); if so, perform the next step; otherwise, process according to the normal flow;
  • Step 3: display the control or the closed area as selected;
  • Step 4: on detecting a translation operation of two contacts in opposite directions, determine whether scaling of the partial desktop is allowed (i.e., whether the option of scaling a partial desktop is included); if so, perform the next step; otherwise, process according to the normal flow;
  • Step 5: record the starting point and the current end point of the gesture, and draw the magnified visual effect of the partial desktop (i.e., this control or this closed area).
  • Example 5: the process of the user zooming a partial desktop in or out includes:
  • Step 1: the user circles a closed area or long-presses a control;
  • Step 2: on detecting a movement operation in the touch-screen event whose historical points form a closed area, or a long-press operation whose position is a control position, determine whether operation on the partial desktop is allowed (i.e., whether the option of panning or scaling a partial desktop is included); if so, perform the next step; otherwise, process according to the normal flow;
  • Step 3: display the control or the closed area as selected;
  • Step 5: record the starting point and the current end point of the gesture, and draw the magnified visual effect of the partial desktop (i.e., this control or this closed area).
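For Examples 3 through 5, the circled partial desktop can be approximated by the bounding box of the historical touch points, and scaling about its centre gives the zoom effect. Both helpers are illustrative assumptions; the patent does not prescribe how the closed area is represented.

```python
def closed_area_bounds(points):
    """Bounding rectangle (left, top, right, bottom) of a traced closed area."""
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    return (min(xs), min(ys), max(xs), max(ys))

def scale_rect(rect, factor):
    """Enlarge or shrink a partial-desktop rectangle about its centre."""
    l, t, r, b = rect
    cx, cy = (l + r) / 2, (t + b) / 2
    hw, hh = (r - l) / 2 * factor, (b - t) / 2 * factor
    return (cx - hw, cy - hh, cx + hw, cy + hh)
```

With factor 2.0 the selected area doubles in each dimension while keeping its centre fixed; with factor 0.5 it shrinks likewise.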
  • In the original display, the thick line indicates the edge of the screen and the shaded area indicates the desktop.
  • The user slides two fingers apart on the screen, and the terminal enlarges the desktop.
  • The user presses a non-control area in the upper-left corner of the screen and swipes toward the lower right, indicating that the user wants to pan the original desktop.
  • The double-line arrow indicates the direction and distance of the swipe, so that the target the user needs to touch, originally located in the upper-left corner of the screen, moves to the middle of the screen, facilitating the user's click operation.
  • The blank area produced after dragging the desktop can be displayed in a solid color, or in the form of a preset picture or animation. According to their own needs, users can move the entire screen in any direction. During the desktop move, all icons and controls on the desktop move synchronously with the interface.
  • Example 2 is an example of a control operation. When there is only one control to be selected, the user can directly press that control; when there are multiple controls, the user can touch the screen area of the control to be selected.
  • In this example, the desktop is an input interface and the virtual keyboard is used as a control.
  • The user presses the control area, then moves two fingers apart, and the mobile terminal enlarges the display of the virtual keyboard.
  • When the screen is swiped, the virtual keyboard is translated to another position on the display screen, and the blank area left after the drag can be displayed as an extension of the adjacent area (such as the input interface), or replaced with a special background or animation.
  • With the above solution, the entire desktop or a partial desktop can be dragged, enlarged, or reduced, and the area the user needs to touch can be placed at a position convenient for clicking, thereby avoiding erroneous operation and improving the user experience. The present invention therefore has strong industrial applicability.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention relates to a method for desktop display control and a mobile terminal. The method includes: upon detecting a touch-screen event performed by a user on a touch screen, a mobile terminal determines a target operation area according to the touch-screen event and displays the target operation area by zooming in, zooming out, or panning, the target operation area being the whole desktop or a part of the desktop. By means of this solution, the whole desktop or a part of it can be dragged, zoomed in, or zoomed out, and the area to be tapped by the user is positioned where the user can easily click, thereby avoiding erroneous operation and improving the user experience.
PCT/CN2012/070927 2011-12-09 2012-02-07 Procédé de commande d'affichage de bureau et terminal mobile WO2013082881A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201110409344.4 2011-12-09
CN201110409344.4A CN102520860B (zh) 2011-12-09 2011-12-09 一种进行桌面显示控制的方法及移动终端

Publications (1)

Publication Number Publication Date
WO2013082881A1 true WO2013082881A1 (fr) 2013-06-13

Family

ID=46291807

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2012/070927 WO2013082881A1 (fr) 2011-12-09 2012-02-07 Procédé de commande d'affichage de bureau et terminal mobile

Country Status (2)

Country Link
CN (1) CN102520860B (fr)
WO (1) WO2013082881A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106686232A (zh) * 2016-12-27 2017-05-17 努比亚技术有限公司 一种控制界面的优化方法和移动终端

Families Citing this family (34)

Publication number Priority date Publication date Assignee Title
US8866770B2 (en) * 2012-03-19 2014-10-21 Mediatek Inc. Method, device, and computer-readable medium for changing size of touch permissible region of touch screen
CN104106035A (zh) * 2012-06-28 2014-10-15 汉阳大学校产学协力团 用户界面调节方法及利用该方法的用户终端机
CN102830914B (zh) * 2012-07-31 2018-06-05 北京三星通信技术研究有限公司 操作终端设备的方法及其设备
CN103593132A (zh) * 2012-08-16 2014-02-19 腾讯科技(深圳)有限公司 触控装置及手势识别方法
CN102880411B (zh) * 2012-08-20 2016-09-21 东莞宇龙通信科技有限公司 移动终端及其触控操作方法
CN103677543A (zh) * 2012-09-03 2014-03-26 中兴通讯股份有限公司 一种调整移动终端屏幕显示区域的方法及移动终端
CN107247538B (zh) 2012-09-17 2020-03-20 华为终端有限公司 触摸操作处理方法及终端设备
CN102902481B (zh) * 2012-09-24 2016-12-21 东莞宇龙通信科技有限公司 终端和终端操作方法
CN102855066B (zh) * 2012-09-26 2017-05-17 东莞宇龙通信科技有限公司 终端和终端操控方法
CN103309604A (zh) * 2012-11-16 2013-09-18 中兴通讯股份有限公司 一种终端及终端屏幕显示信息控制方法
CN103902206B (zh) * 2012-12-25 2017-11-28 广州三星通信技术研究有限公司 操作具有触摸屏的移动终端的方法和设备及移动终端
CN103294346B (zh) * 2013-06-20 2018-03-06 锤子科技(北京)有限公司 一种移动设备的窗口移动方法及其装置
CN103324347B (zh) * 2013-06-27 2017-09-22 广东欧珀移动通信有限公司 一种基于多触控面板的移动终端的操作方法和系统
CN103414829A (zh) * 2013-08-27 2013-11-27 深圳市金立通信设备有限公司 一种控制屏幕内容的方法、装置及终端设备
CN103472996A (zh) * 2013-09-17 2013-12-25 深圳市佳创软件有限公司 一种移动设备接收触控方法及设备
US9733806B2 (en) 2013-10-09 2017-08-15 Htc Corporation Electronic device and user interface operating method thereof
CN103530035A (zh) * 2013-10-09 2014-01-22 深圳市中兴移动通信有限公司 触控终端及其区域操作方法
CN104571777A (zh) * 2013-10-09 2015-04-29 宏达国际电子股份有限公司 电子装置及其使用者界面操作方法
CN104571799B (zh) * 2013-10-28 2019-02-05 联想(北京)有限公司 信息处理方法及电子设备
CN103902218A (zh) * 2013-12-27 2014-07-02 深圳市同洲电子股份有限公司 一种移动终端屏幕显示的方法及移动终端
CN103888840B (zh) * 2014-03-27 2017-03-29 电子科技大学 一种视频移动终端实时拖动与缩放的方法及装置
CN104049843B (zh) * 2014-06-03 2018-01-23 联想(北京)有限公司 一种信息处理方法及电子设备
CN105700763A (zh) * 2014-11-25 2016-06-22 中兴通讯股份有限公司 终端界面窗口的移动方法及装置
CN104915111B (zh) * 2015-05-28 2018-08-14 努比亚技术有限公司 终端操作控制方法及装置
CN104932776A (zh) * 2015-06-29 2015-09-23 联想(北京)有限公司 一种信息处理方法及电子设备
CN105117100A (zh) * 2015-08-19 2015-12-02 小米科技有限责任公司 目标对象的显示方法和装置
CN105224169B (zh) * 2015-09-09 2019-02-05 魅族科技(中国)有限公司 一种界面移动方法及终端
CN105404456B (zh) * 2015-12-22 2019-01-22 厦门美图移动科技有限公司 一种移动终端拨号键盘管理方法及装置
CN107015749A (zh) * 2016-01-28 2017-08-04 中兴通讯股份有限公司 一种用于移动终端的界面展示方法及移动终端
CN105930252A (zh) * 2016-04-29 2016-09-07 杨夫春 移动终端文件内存显示方法
CN106354396A (zh) * 2016-08-26 2017-01-25 乐视控股(北京)有限公司 界面调整方法及装置
CN108279840A (zh) * 2017-12-22 2018-07-13 石化盈科信息技术有限责任公司 一种触摸屏的单手操作方法及单手操作装置
CN112256169B (zh) * 2020-10-14 2021-08-10 北京达佳互联信息技术有限公司 内容展示方法、装置、电子设备及存储介质
CN113434079A (zh) * 2021-05-28 2021-09-24 北京信和时代科技有限公司 记事类应用的控制方法、装置、设备及计算机存储介质

Citations (3)

Publication number Priority date Publication date Assignee Title
CN101650633A (zh) * 2009-07-03 2010-02-17 苏州佳世达电通有限公司 电子装置操控方法
CN102023788A (zh) * 2009-09-15 2011-04-20 宏碁股份有限公司 触控屏幕显示画面控制方法
CN102163126A (zh) * 2010-02-24 2011-08-24 宏达国际电子股份有限公司 显示方法及使用此显示方法的电子装置

Family Cites Families (3)

Publication number Priority date Publication date Assignee Title
JP5045559B2 (ja) * 2008-06-02 2012-10-10 富士通モバイルコミュニケーションズ株式会社 携帯端末
JP5511682B2 (ja) * 2008-12-04 2014-06-04 三菱電機株式会社 表示入力装置及びナビゲーションシステム
US9182854B2 (en) * 2009-07-08 2015-11-10 Microsoft Technology Licensing, Llc System and method for multi-touch interactions with a touch sensitive screen

Patent Citations (3)

Publication number Priority date Publication date Assignee Title
CN101650633A (zh) * 2009-07-03 2010-02-17 苏州佳世达电通有限公司 电子装置操控方法
CN102023788A (zh) * 2009-09-15 2011-04-20 宏碁股份有限公司 触控屏幕显示画面控制方法
CN102163126A (zh) * 2010-02-24 2011-08-24 宏达国际电子股份有限公司 显示方法及使用此显示方法的电子装置

Cited By (2)

Publication number Priority date Publication date Assignee Title
CN106686232A (zh) * 2016-12-27 2017-05-17 努比亚技术有限公司 一种控制界面的优化方法和移动终端
CN106686232B (zh) * 2016-12-27 2020-03-31 努比亚技术有限公司 一种控制界面的优化方法和移动终端

Also Published As

Publication number Publication date
CN102520860B (zh) 2018-01-19
CN102520860A (zh) 2012-06-27

Similar Documents

Publication Publication Date Title
WO2013082881A1 (fr) Procédé de commande d'affichage de bureau et terminal mobile
EP2815299B1 (fr) Sélection d'applications par vignette
JP5718042B2 (ja) タッチ入力処理装置、情報処理装置およびタッチ入力制御方法
EP2372516B1 (fr) Procédés, systèmes et produits de programme informatique pour agencer une pluralité d'icônes dans un affichage tactile
EP2474896A2 (fr) Appareil et procédé de traitement d'informations et programme informatique
JP5975794B2 (ja) 表示制御装置、表示制御方法、プログラム及び記憶媒体
KR101019128B1 (ko) 터치 패널 입력 장치, 방법 및 이를 이용한 모바일 기기
TWI490771B (zh) 可編程顯示器及其畫面操作處理程式
TWI606383B (zh) 電子設備及其頁面縮放方法
TW201109994A (en) Method for controlling the display of a touch screen, user interface of the touch screen, and electronics using the same
TWI510083B (zh) 電子設備及其圖片縮放方法
US11740754B2 (en) Method for interface operation and terminal, storage medium thereof
WO2014169603A1 (fr) Méthode et dispositif de changement d'objet, et terminal à écran tactile
WO2017059734A1 (fr) Procédé de zoom avant/arrière d'image et dispositif électronique
WO2013104155A1 (fr) Procédé d'affichage de menu et dispositif terminal
WO2016160175A1 (fr) Amélioration de commandes de sélection de texte
KR101553119B1 (ko) 연속적인 터치를 이용한 사용자 인터페이스 방법 및 장치
WO2018123701A1 (fr) Dispositif électronique, procédé de commande associé et programme
CN112558844A (zh) 一种基于平板电脑的医疗影像阅片方法及系统
JP2018116605A (ja) 表示制御装置及び表示制御方法
JP7030529B2 (ja) 電子機器、情報処理方法、プログラム及び記憶媒体
CN117289849A (zh) 一种手势辅助书写方法及装置
TW201346704A (zh) 切換顯示介面之方法
TWI459288B (zh) 手持式電子裝置及其數位資訊之畫面控制方法
JP2014115825A (ja) 情報処理装置及びその制御方法、並びにプログラム

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 12854736

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 12854736

Country of ref document: EP

Kind code of ref document: A1