EP1735685A1 - Method of navigating, electronic device, user interface and computer program product - Google Patents

Method of navigating, electronic device, user interface and computer program product

Info

Publication number
EP1735685A1
Authority
EP
European Patent Office
Prior art keywords
navigation
display
block
detected
floatable
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP05731356A
Other languages
English (en)
French (fr)
Inventor
Mikko Repka
Virpi Roto
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nokia Oyj
Original Assignee
Nokia Oyj
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US10/813,222 (US20050223340A1)
Application filed by Nokia Oyj
Publication of EP1735685A1
Legal status: Withdrawn

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/0485Scrolling or panning
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/048Indexing scheme relating to G06F3/048
    • G06F2203/04806Zoom, i.e. interaction techniques or interactors for controlling the zooming operation

Definitions

  • the invention relates to a method of navigating in application views of an electronic device, to an electronic device for navigating in application views, to a graphical user interface for navigating in application views shown on a display of an electronic device, and to a computer program product.
  • a method of navigating in application views of an electronic device comprising a display for showing application views and an input device.
  • the method comprises displaying an initial application view on the display, providing a floatable navigation area displayed at least partly over the application views on the display, the floatable navigation area comprising navigation blocks for controlling given software functions, detecting a selection of a given navigation block indicated by the input device, performing software functions associated with the selected navigation block once the selection of said navigation block is detected, and displaying a current application view on the basis of the performed software functions.
  • an electronic device for navigating in application views comprising a control unit for controlling functions of the electronic device, a display for showing application views coupled to the control unit, and an input device for giving control commands for navigating, coupled to the control unit.
  • the control unit is configured to: display an initial application view on the display, provide a floatable navigation area displayed at least partly over the ap- plication views on the display, the floatable navigation area comprising navigation blocks for controlling given software functions, detect a selection of a given navigation block indicated by the input device, perform software functions associated with the selected navigation block once the selection of said navigation block is detected, and display a current application view on the basis of the performed software functions.
  • a graphical user interface for navigating in application views shown on a display of an electronic device, the graphical user interface comprising: an initial application view displayed on the display, a floatable navigation area displayed at least partly over the application view, the floatable navigation area comprising navigation blocks for controlling given software functions, and a current application view displayed on the display on the basis of performed software functions associated with a detected selected navigation block.
  • a computer program product encoding a computer process for providing navigating in an application view of an electronic device, the computer process comprising: displaying an initial application view on the display, providing a floatable navigation area displayed at least partly over the application views on the display, the floatable navigation area comprising navigation blocks for controlling given software functions, detecting a selection of a given navigation block, performing software functions associated with the selected navigation block once the selection of said navigation block is detected, and displaying a current application view on the basis of the performed software functions.
  • an electronic device for navigating in application views the electronic device comprising controlling means for controlling functions of the electronic device, displaying means for showing application views, and input means for giving control commands for navigating.
  • the controlling means being further configured to: display an initial application view on a display, provide a floatable navigation area displayed at least partly over the application views on the display, the floatable navigation area comprising navigation blocks for controlling given software functions, detect a selection of a given navigation block indicated by the input means, perform software functions associated with the selected navigation block once the selection of said navigation block is detected, and display a current application view on the basis of the performed software functions.
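The claimed steps (display an initial view, detect a navigation-block selection, perform the associated software function, display the current view) can be sketched as a minimal handler. This is an illustrative sketch only; the dictionary-based view state and all names (`NAV_BLOCKS`, `handle_selection`) are hypothetical and not taken from the patent:

```python
# Hypothetical sketch of the claimed control flow. Each navigation block
# maps to a software function that transforms the current view state.

NAV_BLOCKS = {
    "scroll_down": lambda view: {**view, "y": view["y"] + view["step"]},
    "scroll_up": lambda view: {**view, "y": view["y"] - view["step"]},
    "zoom_in": lambda view: {**view, "scale": view["scale"] * 1.25},
    "zoom_out": lambda view: {**view, "scale": view["scale"] / 1.25},
}

def handle_selection(view, block_id):
    """Perform the software function associated with the selected
    navigation block and return the current application view."""
    return NAV_BLOCKS[block_id](view)

initial_view = {"x": 0, "y": 0, "scale": 1.0, "step": 20}
current_view = handle_selection(initial_view, "scroll_down")
```

In this sketch the "current application view" is simply the returned state; a real device would re-render the display from it.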
  • the embodiments of the invention provide several advantages. Navigating in application views is carried out by using a single tool, and the user can customize the tool. Users are provided with modeless navigating in application views. Also, space is saved on the display of the portable electronic device. Further, from the point of view of the user, the invention is quickly understandable and easy to learn and use.
  • Figure 1 shows an example of an electronic device.
  • Figures 2A and 2B illustrate examples of user interfaces of the invention.
  • Figure 3 shows an example of a method of navigating in application views in a user interface of an electronic device.
  • DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS
  • The embodiments of the invention are applicable to electronic devices, such as a mobile station used as a terminal in telecommunication systems comprising one or more base stations and terminals communicating with the base stations, for example.
  • the device may be used for short-range communication implemented with a Bluetooth chip, an infrared or WLAN connection, for example.
  • the electronic device is, for example, a portable telephone or another device including telecommunication means, such as a portable computer, a personal computer, a handheld computer or a smart telephone.
  • the portable electronic device may be a PDA (Personal Digital Assistant) device including the necessary telecommunication means for establishing a network connection, or a PDA device that can be coupled to a mobile telephone, for instance, for a network connection.
  • the portable electronic device may also be a computer or PDA device including no telecommunication means.
  • Figure 1 shows a block diagram of the structure of an electronic device.
  • a control unit 100 typically implemented by means of a micro-processor and software or separate components, controls the basic functions of the device.
  • a user interface of the device comprises an input device 104 and a display 102, such as a touch screen, implemented in a manner known per se.
  • the user interface of the device may include a loudspeaker and a keypad part.
  • the device of Figure 1, such as a mobile station, also includes communication means 108 that implement the functions of a mobile station and include speech and channel coders, modulators and RF parts.
  • the device may also comprise an antenna and a memory 106.
  • the functions of the device are controlled by means of the input device 104, such as a mouse, i.e. a hand-held locator operated by moving it on a surface.
  • a sign or a symbol shows the location of a mouse cursor on the display 102 and often also the function running in the device, or its state.
  • the display 102 itself may serve as the input device 104, achieved by means of a touch screen, such that the desired functions are selected by touching the desired objects visible on the display 102.
  • a touch on the display 102 may be carried out by means of a pen, a stylus or a finger, for example.
  • the input device 104 can also be achieved by using eye tracking means where detection of eye movements is used in interpreting certain control commands.
  • the control unit 100 controls the functions of the user interface and is connected to the display 102 and configured to show different application views on the display 102.
  • the control unit 100 receives control commands from the input device 104.
  • the input device 104 is configured to give control commands for navigating in application views shown on the display 102.
  • the application views may be views into different web pages from the Internet, views from any application programs run in the device or any other application views that may be shown on the display 102.
  • the navigating or browsing the application views may include scrolling the application view horizontally or vertically, zooming in to the application view to get a better view of the details of the application view or zooming out from the application view to get a more general view of the whole application view.
  • the navigating function operates such that the desired functions, such as scrolling or zooming, are first selected by means of the input device 104.
  • the control unit 100 interprets the detected selections, performs given software functions based thereon and, as a result of the performed software functions, displays a given application view on the display 102.
  • the control unit 100 first displays an initial application view on the display 102.
  • the control unit 100 is configured to provide a floatable navigation area displayed at least partly over the application view on the display 102.
  • the floatable navigation area comprises navigation blocks for controlling given software functions.
  • the control unit 100 detects a selection of a given navigation block indicated by the input device 104.
  • the selection may be detected on the basis of a touch on the display 102, for example. Alternatively, the selection may be detected by means of the input device 104, such as a mouse or a pen.
  • control unit 100 is configured to perform software functions associated with the selected navigation block once the selection of said navigation block is detected. Finally, the control unit 100 is configured to display a current application view based on the performed software functions.
  • the initial application view may be a partial view into an Internet page, and the current application view after a scrolling function may be a view into another part of the Internet page, for example.
  • the current application view may also be a view into the Internet page after the control unit 100 has performed a zooming function.
  • the control unit 100 continues to detect control commands indicated by the input device 104, and to detect selections of given navigation blocks. It is possible that the floatable navigation area is displayed automatically partly over the application view on the display 102 when a given application program displaying the application views is opened.
  • Figures 2A and 2B show the display 102 of an electronic device, such as a PDA device.
  • the Figures 2A and 2B illustrate graphical user interfaces in an embodiment of the invention.
  • a display 102 is divided into different areas, each area having specific functions.
  • Application views are shown in the largest areas 220A and 220B, for example.
  • the floatable navigation areas 200, 200A, 200B are in the form of squares in Figures 2A and 2B.
  • the floatable navigation areas 200, 200A, 200B may also be of any other shape than that of a square, such as a circle, for example.
  • the floatable navigation areas 200, 200A, 200B comprise navigation blocks 202, 204, 206, 208, 210, 212, 214 for controlling given software functions.
  • the navigation blocks 202 and 208 control horizontal scrolling of the application view and the navigation blocks 204 and 212 control vertical scrolling of the application view.
  • the navigation blocks 206 and 210 control zooming in and zooming out in this example. It is possible that tapping a pen down on a given navigation block 202, 204, 208, 212 for scrolling results in scrolling in the desired direction by a single predetermined step.
  • Holding the pen down on the navigation block 202, 204, 208, 212 may repeat the functionality. Similarly, tapping a pen down on a given navigation block 206, 210 for zooming results in changing the zoom level by a single predetermined step, and holding the pen down repeats the functionality.
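The tap-versus-hold behaviour above (one predetermined step per tap, the step repeated while the pen is held down) can be sketched as follows; the step size and the tick-based repeat model are assumptions for illustration, not values from the patent:

```python
# Hypothetical sketch: a tap scrolls one predetermined step; holding the
# pen repeats the step on every timer tick until the pen is lifted.

STEP = 20  # pixels per predetermined step (assumed value)

def scroll_offset(hold_ticks):
    """Total scroll distance: one step for a plain tap (0 repeat ticks),
    plus one step for each repeat tick while the pen stays down."""
    return STEP * (1 + hold_ticks)
```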
  • the number of navigation blocks 202, 204, 206, 208, 210, 212, 214 may be different than in this example. There may also be control functions for the navigation blocks 202, 204, 206, 208, 210, 212, 214 other than those in these examples. Further, it is possible that there is only one navigation block for both horizontal and vertical scrolling, for example.
  • the floatable navigation area 200, 200A, 200B comprises a control block 214.
  • the control block 214 is in the middle of the floatable navigation area. The control block 214 is for changing the location of the floatable navigation area 200, 200A, 200B, for example.
  • the location of the floatable navigation area 200, 200A, 200B may be changed, for example, by dragging and dropping the floatable navigation area 200, 200A, 200B with the help of the control block 214. Tapping on the control block 214 and holding the pen down while dragging may move the floatable navigation area to a desired location.
  • the location of the floatable navigation area 200A is changed to a location of the floatable navigation area 200B. It is also possible that the changed location remains in the memory and the floatable navigation area 200A is next displayed in the changed location.
  • the appearance of the floatable navigation area 200, 200A, 200B may be set as desired.
  • the navigation blocks 202, 204, 206, 208, 210, 212, 214 for different functions are marked with individual icons, such as arrows up and down for the navigation blocks 212, 204 for vertical scrolling, arrows left and right for the navigation blocks 202, 208 for horizontal scrolling, magnifiers for the navigation blocks 206, 210 for zooming in or out, and crossed arrows for the control block 214.
  • the navigation blocks 202, 204, 206, 208, 210, 212, 214 may also be marked with appropriate colors, text, drawings or fill effects.
  • the floatable navigation area 200, 200A, 200B may also be set to appear in a "ghost mode", meaning for example that all the icons are removed and only colors are used to indicate different navigation blocks.
  • the whole floatable navigation area 200, 200A, 200B may be semi-transparent, that is, the contents below the floatable navigation area 200, 200A, 200B are visible. The level of transparency may also be adjusted.
  • in this way, the floatable navigation area 200, 200A, 200B does not obscure as much of the application view shown on the display 102. It is also possible that no colors, arrows or magnifiers are shown, such that only some or all outlines of the different navigation blocks 202, 204, 206, 208, 210, 212, 214 are visible.
  • Figure 2B shows the floatable navigation area 200B in a "ghost mode".
  • the application view 220B can be seen through the floatable navigation area 200B. Further, there are only outlines of the navigation blocks 202, 204, 206, 208, 210, 212, 214 marking the locations of the navigation blocks of the floatable navigation area 200B.
  • the "ghost mode" is used with different icons, such as arrows, magnifiers and/or colors.
  • the application view under the floatable navigation area 200, 200A, 200B is also seen through the semi-transparent floatable navigation area.
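The semi-transparency described above amounts to per-channel alpha blending of the floatable area over the application view; the adjustable transparency level corresponds to the alpha parameter. A minimal sketch (the function name and 8-bit channel model are illustrative assumptions):

```python
# Hypothetical sketch: each displayed pixel channel of the floatable area
# is a weighted mix of the area's own colour and the application view
# beneath it. The transparency level may be adjusted, as in the text.

def blend(overlay, background, alpha):
    """Blend one 8-bit colour channel; alpha=1.0 is opaque,
    alpha=0.0 is fully transparent (only the view beneath shows)."""
    return round(alpha * overlay + (1.0 - alpha) * background)
```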
  • the graphical user interface of the embodiment comprises an initial application view 220A that is displayed on the display 102.
  • the application view 220A is, for example, a view into a web page on the Internet.
  • the floatable navigation area 200 is displayed at least partly over the initial application view 220A.
  • the location and size of the floatable navigation area 200 may be determined by using the user interface of the device, for example.
  • the floatable navigation area 200 is displayed in a given location, for example, in the upper right corner of the display 102.
  • the location may at any time be changed by using the control block 214. Pressing or touching the control block 214 with a pen, for example, and moving the pen along the surface of the display 102 may result in changing the location of the floatable navigation area 200.
  • the size of the floatable navigation area 200 may also be set appropriately, for example, according to the needs of individual users of the device. The user may choose between a large and a small floatable navigation area 200, 200A, 200B, for example.
  • next, the user wishes to navigate the view of the web page by scrolling the page downwards.
  • the navigation block 204 that controls the scrolling down function is therefore selected.
  • the selection of the navigation block 204 may be performed by using any suitable input device. Once the selection of the navigation block 204 has been detected, a current application view 220B illustrated in Figure 2B is displayed.
  • the amount of scrolling down may depend on how long a pen is pressed on the navigation block 204, for example. If only a single touch is detected on the navigation block 204, only a predetermined step is scrolled down. Further, if the pen is continuously held down on the navigation block 204, the scrolling down continues as long as the pen stays on the navigation block 204. It is also possible that pressing the pen on the navigation block 204 for a predetermined period of time results in an increase in the speed of scrolling down. Accordingly, if the user wishes to zoom the application views shown on the display 102, navigation blocks 206, 210 for zooming are selected.
  • a current application view zoomed according to the detected selected navigation block is shown. If a pen is continuously held down on the navigation block 206, 210 for zooming, the zooming function continues. It is also possible that pressing the pen on the navigation block 206, 210 for a given time results in an increase in the speed of zooming.
  • the amount of pressure detected at a site of a navigation block 202, 204, 206, 208, 210, 212 defines the speed of scrolling or the level of zooming. The amount of pressure may be detected based on a touch screen or a pressure sensitive pen used with the user interface of an embodiment, for example.
  • in an embodiment, a dragging function is detected after a selection of a given navigation block 202-214.
  • the input device may be a touch screen and a stylus, for example, and the user may select a given navigation block 202-214 by first touching the touch screen with the stylus. Then the stylus may be moved along the surface of the touch screen, thus resulting in a dragging function associated with the given navigation block 202-214.
  • the software functions associated with the selected navigation block 202-214 are performed on the basis of the detected drag function on the given navigation block. In an embodiment, the software functions performed are based on the detected amount of the drag function on the given navigation block. In another embodiment, the software functions performed are based on the detected speed of the drag function on the given navigation block.
  • the direction and the length of the drag function may define attributes for the software functions.
  • the software functions may be accelerated if the user drags farther away from the original point.
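The acceleration rule above (the associated function speeds up as the user drags farther from the original point) can be sketched as a distance-proportional speed; the linear gain is an assumption for illustration:

```python
import math

# Hypothetical sketch: the farther the stylus is dragged from the original
# point (the navigation block), the faster the associated function runs.

def drag_speed(origin, stylus, gain=1.0):
    """Speed grows linearly with the distance between the drag origin
    and the current stylus position."""
    dx = stylus[0] - origin[0]
    dy = stylus[1] - origin[1]
    return gain * math.hypot(dx, dy)
```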
  • it is also possible that the whole area of the display is considered as the floatable navigation area 200, or that there are a number of floatable navigation areas 200, 200A, 200B shown on the display.
  • the navigation blocks 202-212 may in fact reside anywhere on the display 102 area. The user may only need a few navigation blocks 202-212 on a regular basis, and only those navigation blocks 202-212 that are frequently used may be visible on the display 102. It is also possible that given navigation blocks 202-212 are situated on different parts of the display 102.
  • the dragging function has different effects depending on the given navigation block 202-212 to which the dragging function is directed.
  • Some examples of how different control functions, such as tap, tap & hold or drag, may be used in navigating in application views are shown in the following tables 1-6.
  • the control functions may be made, for example, by using a pen or a stylus with a touch screen as an input device.
  • the right part of each table shows different software functions resulting from given control functions directed to the given navigation blocks. The idea is to provide the users with a basic set of floating blocks on an active content area: scroll, zoom, page navigation and search. Whenever the user taps or drags the navigation blocks, the functions described in the following tables may be executed.
  • the directions and lengths of the drag functions define attributes for the functions, and the action is accelerated when the user drags farther away from the original point.
  • Navigation block for zooming in and out. Tap: zooming a predetermined step towards the centre of the current view.
  • Tap & Hold: bringing up a zoom & scroll dialog that provides a miniature view of the page and a rectangle (corresponding to the new view) that can be moved and resized.
  • Drag: the direction of the drag defines whether the view is zoomed in or out. Dragging right or up zooms in, and dragging left or down zooms out. The view is zoomed smoothly until the stylus is lifted off. The farther the stylus is moved, the faster the zoom is. Continuing the drag to the other side of the navigation block changes the zooming direction.
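The drag mapping above can be sketched by projecting the drag vector onto the right/up diagonal; screen coordinates (y grows downwards) and the projection itself are illustrative assumptions, not the patent's method:

```python
# Hypothetical sketch: dragging right or up zooms in, left or down zooms
# out. In screen coordinates "up" is negative dy, so the projection onto
# the right/up diagonal is dx - dy; its sign selects the zoom direction
# and its magnitude could drive the zoom speed.

def zoom_direction(dx, dy):
    s = dx - dy
    if s > 0:
        return "in"
    if s < 0:
        return "out"
    return "none"
```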
  • Navigation block for zooming in. Tap: zooming a predetermined step towards the centre of the view. Tap & Hold: zooming in smoothly towards the centre of the view. Drag: scrolling the view while zooming towards the centre of the changing view.
  • the direction of the drag defines the scrolling direction. Dragging down results in showing more content from below the current view.
  • the page may be scrolled in any direction.
  • the scrolling direction is the same as the current angle between the scroll starting point (navigation block) and the stylus.
  • the view is zoomed and scrolled smoothly until the stylus is lifted off. The farther the stylus is moved from the navigation block, the faster the scrolling is.
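The combined rule above (the scrolling direction equals the angle between the navigation block and the stylus, and the speed grows with their distance) can be sketched with a simple velocity computation; the gain constant is an assumption:

```python
import math

# Hypothetical sketch: the scroll direction is the current angle between
# the scroll starting point (the navigation block) and the stylus; the
# farther the stylus is from the block, the faster the scrolling.

def scroll_velocity(block, stylus, gain=0.1):
    dx = stylus[0] - block[0]
    dy = stylus[1] - block[1]
    angle = math.atan2(dy, dx)          # scrolling direction
    speed = gain * math.hypot(dx, dy)   # farther stylus -> faster scroll
    return speed * math.cos(angle), speed * math.sin(angle)
```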
  • the users may utilize the full screen space (only tiny position indicators are needed). Unlike in panning, where the user must grab one point on the page and drag it to another point, the user can scroll over several screens with a single drag. Toggling between zooming in and out is also very easy.
  • the acceleration functions described in these examples can be used in other applications also.
  • embodiments were described above where separate navigation blocks for zooming in and zooming out are provided. The reason for this embodiment is to allow simultaneous zoom and scroll functions. Providing separate controls for zooming in and out is also more intuitive for the end users than a single control. Only a single drag is needed to zoom the application view to a desired point. The user may also zoom to areas outside the original view.
  • the input device 104 comprises a touch screen for giving control commands for navigating.
  • the control unit 100 is further configured to detect a pixel element underneath a detected touch point on the given navigation block indicated by the touch screen, and to perform the software functions associated with the selected navigation block by regarding the detected pixel element as a mid-point for the software function. For example, in a situation where the selected navigation block is a navigation block for zooming and the user wishes to zoom in or out once, or one step at a time, a tap on the navigation block for zooming in or out results in the pixel underneath the stylus touch point remaining in that position while the view is zoomed in/out.
  • the stylus may then be pressed and held over the navigation block for zooming in/out.
  • the view is zoomed smoothly until the user takes the stylus off.
  • the pixel underneath the stylus remains in the original position during zooming.
  • zooming in/out smoothly to any point in the application view is also possible: while holding the stylus on the navigation block for zooming, the stylus is moved to the desired position on the display.
  • the pixel that is under the stylus at the point when one of the zooming events takes place remains at its original position.
  • the central pixel for zooming changes as the stylus moves.
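Keeping the pixel underneath the stylus in its original position while zooming can be sketched, per axis, by recomputing the view offset around that anchor point; the offset/scale view model here is an assumption for illustration:

```python
# Hypothetical sketch of anchored zooming: the content point currently
# drawn at screen position `anchor` keeps that screen position while the
# scale changes, by adjusting the view offset. One axis is shown; the
# same formula applies independently to x and y.

def zoom_about(anchor, offset, scale, new_scale):
    """Screen position of a content point w is: offset + w * scale.
    Return the new offset so the point under `anchor` stays put."""
    w = (anchor - offset) / scale       # content point under the stylus
    return anchor - w * new_scale
```

For example, doubling the scale while the stylus rests at screen position 50 shifts the offset so that the same content pixel is still drawn at position 50.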
  • the user can make sure the needed area remains on the screen even if the navigation block for zooming is not at that place when the zooming function is started. It is also possible that the speed of the zooming function is slowed down while the pen moves, making it easier to reach the target point in time. If the smooth zooming is very quick, the user may not be able to move the navigation block for zooming to the target point during the zooming quickly enough, and the target point may be zoomed outside the visible area. This is not a problem in the following embodiment, since the user may bring the hidden area visible very easily. Thus, in an embodiment it is possible to zoom into areas outside the display.
  • the zooming function is changed to a scrolling function if the touch on the touch screen is detected to reach an edge of the display. In this case, new content from the direction of the drag function is brought to the visible area. While the scrolling goes on, smooth zooming is stopped. The zooming function may then be continued when the touch is detected to move away from the edge of the display. Thus, if the user drags the pen back towards the active content area, the zooming function is resumed and the scrolling ends.
  • the zoom out function brings areas outside the current view visible by default, but in an embodiment it is also possible to scroll the view by pushing a pen/stylus against the edge of the screen or over the edge of the content area. Then new content from the direction of pushing is brought to the visible area. While the scrolling goes on, the zooming out function stops. The zooming may then continue if the user drags the pen back to the active content area.
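The edge behaviour described above (zooming is suspended in favour of scrolling while the touch is at the display edge, and resumes when the touch moves back inside) can be sketched as a simple mode decision; the edge margin is an assumed value:

```python
# Hypothetical sketch: while the touch point is within a small margin of
# the display edge, the device scrolls instead of zooming; moving the
# touch back into the content area resumes smooth zooming.

EDGE = 5  # edge margin in pixels (assumed)

def navigation_mode(x, y, width, height):
    at_edge = (x < EDGE or y < EDGE or
               x >= width - EDGE or y >= height - EDGE)
    return "scroll" if at_edge else "zoom"
```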
  • the zooming functions can also be used in a zoom-to-rectangle functionality. Then, dragging the stylus will draw a rectangle instead of smooth zooming.
  • the above-described zooming embodiments enable zooming the view to the desired point by using only a single drag function. Users may also define the mid-point for the zoom and zoom to areas outside the initial view.
  • the navigation function is modeless: for example, the zooming function is performed only while the navigation block for zooming is selected (the pen touches the block, for example), and the zooming function ends as soon as the selection of the navigation block for zooming is detected to end (the pen is lifted off the block, for example).
  • a navigation function may be executed.
  • the software functions associated with the selected navigation block can also be performed once the end of a drag function on the display is detected.
  • a given navigation block is selected, for example, by touching the navigation block with a stylus, then the stylus is continuously dragged on the touch screen, and finally the software function associated with the selected navigation block is performed once the stylus is lifted off the touch screen, i.e. once the end of the drag function is detected.
  • the selected navigation block can follow the touch or stay stationary.
  • other control functions may be quickly selected by using the floatable navigation area 200, 200A, 200B. For example, pressing a secondary mouse button on a given navigation block 202, 204, 206, 208, 210, 212, 214 may result in opening a selection list or a menu where different control functions may be selected.
  • pressing a pen down on the control block 214 and holding the pen without moving may activate a given control function, such as opening of the selection list.
  • Different topics on the selection lists or menus may be related to the floating navigation area 200, 200A, 200B, to the navigation blocks 202, 204, 206, 208, 210, 212, 214, to browsing functions and different settings. All the settings and functions that are needed are easily reachable by using such selection lists. Examples of the control functions that may be included in the selection lists include toggling between a full screen and a normal view, hiding the floatable navigation area 200, 200A, 200B, selecting the ghost mode, setting the size and appearance of the floatable navigation area 200, 200A, 200B, and so on.
  • Figure 3 shows an example of a method of navigating in application views in a user interface of an electronic device.
  • the method starts in 300.
  • an initial application view is displayed on the display.
  • a floatable navigation area is displayed on the display at least partly over the application view.
  • the floatable navigation area may be displayed automatically when the application view is shown on the display, for example. It is also possible that the floatable navigation area is first shown as an icon on the display, is activated from a menu or by tapping the screen, and is selected when needed.
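The drag-based interaction described above (touching a navigation block selects it, dragging optionally moves it, and lifting the stylus performs the associated software function) can be sketched as a small event handler. This is an illustrative sketch only, not the patented implementation; the class and method names (`FloatableNavigationArea`, `pen_down`, `pen_move`, `pen_up`) are hypothetical, and `follow_touch=False` stands in for the "stay stationary" mode mentioned above.

```python
from dataclasses import dataclass
from typing import Callable, Dict, Optional, Tuple

@dataclass
class NavigationBlock:
    """One block of the floatable navigation area (names illustrative)."""
    name: str
    action: Callable[[], str]          # software function associated with the block
    position: Tuple[int, int] = (0, 0)

class FloatableNavigationArea:
    """Sketch of the described interaction: pen down selects a block,
    dragging moves it (or not, in stationary mode), pen up performs
    the block's associated software function."""

    def __init__(self, blocks, follow_touch: bool = True):
        self.blocks: Dict[str, NavigationBlock] = {b.name: b for b in blocks}
        self.follow_touch = follow_touch
        self.selected: Optional[NavigationBlock] = None

    def pen_down(self, name: str) -> None:
        # Touching a navigation block with the stylus selects it.
        self.selected = self.blocks.get(name)

    def pen_move(self, x: int, y: int) -> None:
        # While dragging, the selected block may follow the touch point
        # or stay stationary, depending on the configured mode.
        if self.selected is not None and self.follow_touch:
            self.selected.position = (x, y)

    def pen_up(self) -> Optional[str]:
        # Lifting the stylus ends the drag; the associated function runs.
        if self.selected is None:
            return None
        result = self.selected.action()
        self.selected = None
        return result
```

For example, `pen_down("back")`, a few `pen_move` calls, then `pen_up()` would move the block and invoke its action, mirroring the select-drag-release sequence described above.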
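The pen-down-and-hold activation described above can likewise be sketched as a long-press detector: if the pen stays down without significant movement past a threshold, a control function such as opening the selection list is activated. The timing threshold and movement tolerance are assumed values (the text specifies neither), and the `LongPressDetector` name and its callback are hypothetical.

```python
import time

LONG_PRESS_SECONDS = 0.8   # assumed threshold; no value is given in the text
MOVE_TOLERANCE_PX = 8      # assumed; small jitter still counts as "holding"

class LongPressDetector:
    """Sketch of pen-down-and-hold detection on a control block."""

    def __init__(self, on_long_press, clock=time.monotonic):
        self.on_long_press = on_long_press   # e.g. opens the selection list
        self.clock = clock                   # injectable for testing
        self._down_at = None
        self._down_pos = None
        self._fired = False

    def pen_down(self, x: int, y: int) -> None:
        self._down_at = self.clock()
        self._down_pos = (x, y)
        self._fired = False

    def tick(self, x: int, y: int) -> None:
        """Call on pointer events (or periodically) while the pen is down."""
        if self._down_at is None or self._fired:
            return
        dx, dy = x - self._down_pos[0], y - self._down_pos[1]
        if dx * dx + dy * dy > MOVE_TOLERANCE_PX ** 2:
            self._down_at = None             # it became a drag, not a hold
        elif self.clock() - self._down_at >= LONG_PRESS_SECONDS:
            self._fired = True
            self.on_long_press()             # activate the control function

    def pen_up(self) -> None:
        self._down_at = None
```

Injecting the clock keeps the hold-duration logic testable without real delays; in a device UI the same role is usually played by a platform timer tied to the touch event stream.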

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
EP05731356A 2004-03-30 2005-03-23 Method of navigating, electronic device, user interface and computer program product Withdrawn EP1735685A1 (de)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US10/813,222 US20050223340A1 (en) 2004-03-30 2004-03-30 Method of navigating in application views, electronic device, graphical user interface and computer program product
US11/052,420 US20050223342A1 (en) 2004-03-30 2005-02-07 Method of navigating in application views, electronic device, graphical user interface and computer program product
PCT/FI2005/050104 WO2005096132A1 (en) 2004-03-30 2005-03-23 Method of navigating, electronic device, user interface and computer program product

Publications (1)

Publication Number Publication Date
EP1735685A1 true EP1735685A1 (de) 2006-12-27

Family

ID=35063964

Family Applications (1)

Application Number Title Priority Date Filing Date
EP05731356A Withdrawn EP1735685A1 (de) 2004-03-30 2005-03-23 Verfahren zum navigieren, elektronische einrichtung, benutzeroberfläche und computerprogrammprodukt

Country Status (3)

Country Link
EP (1) EP1735685A1 (de)
KR (1) KR100795590B1 (de)
WO (1) WO2005096132A1 (de)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110262745A (zh) * 2013-07-10 2019-09-20 LG Electronics Inc Mobile terminal and method for controlling the same

Families Citing this family (6)

Publication number Priority date Publication date Assignee Title
US8701021B2 (en) 2006-07-31 2014-04-15 Hewlett-Packard Development Company, L.P. Capability to build multiple application views from a single system model
US20090002324A1 (en) * 2007-06-27 2009-01-01 Nokia Corporation Method, Apparatus and Computer Program Product for Providing a Scrolling Mechanism for Touch Screen Devices
US8301723B2 (en) 2010-02-26 2012-10-30 Research In Motion Limited Computer to handheld device virtualization system
CN102830917A (zh) * 2012-08-02 2012-12-19 Shanghai Huaqin Telecom Technology Co Ltd Mobile terminal and touch establishing method thereof
CN108052677A (zh) * 2018-01-02 2018-05-18 Wuhan Douyu Network Technology Co Ltd Page processing method and device, and readable storage medium
CN113535286A (zh) * 2020-04-15 2021-10-22 Banma Zhixing Network (Hong Kong) Co Ltd Interface display method, apparatus, device and storage medium

Family Cites Families (3)

Publication number Priority date Publication date Assignee Title
JP2000029598A (ja) * 1998-07-13 2000-01-28 Matsushita Electric Ind Co Ltd Display controller, display control method, and computer-readable recording medium recording a display control program
US7308653B2 (en) * 2001-01-20 2007-12-11 Catherine Lin-Hendel Automated scrolling of browser content and automated activation of browser links
US6765596B2 (en) 2001-02-27 2004-07-20 International Business Machines Corporation Multi-functional application launcher with integrated status

Non-Patent Citations (1)

Title
See references of WO2005096132A1 *

Cited By (2)

Publication number Priority date Publication date Assignee Title
CN110262745A (zh) * 2013-07-10 2019-09-20 LG Electronics Inc Mobile terminal and method for controlling the same
CN110262745B (zh) * 2013-07-10 2022-08-02 LG Electronics Inc Mobile terminal and method for controlling the same

Also Published As

Publication number Publication date
WO2005096132A1 (en) 2005-10-13
KR20070009661A (ko) 2007-01-18
KR100795590B1 (ko) 2008-01-21

Similar Documents

Publication Publication Date Title
US20050223342A1 (en) Method of navigating in application views, electronic device, graphical user interface and computer program product
US11481538B2 (en) Device, method, and graphical user interface for providing handwriting support in document editing
US20220261131A1 (en) Device, Method, and Graphical User Interface for Providing Feedback for Changing Activation States of a User Interface Object
KR102618362B1 (ko) 사용자 인터페이스들 사이에 내비게이팅하기 위한 디바이스 및 방법
US20190369823A1 (en) Device, method, and graphical user interface for manipulating workspace views
EP2225628B1 (de) Verfahren und system zur bewegung eines eingabezeigers und auswahl von objekten auf einem berührungsbildschirm anhand eines fingerzeigers
US10304163B2 (en) Landscape springboard
EP3404520B1 (de) Verfahren zur informationsanzeige durch berührungseingaben in ein mobiles endgerät
KR101812329B1 (ko) 콘텐츠를 스크롤할지 선택할지 결정하기 위한 디바이스, 방법 및 그래픽 사용자 인터페이스
US20160034132A1 (en) Systems and methods for managing displayed content on electronic devices
EP2708996A1 (de) Anzeigevorrichtung, benutzerschnittstellenverfahren und programm
US20050223341A1 (en) Method of indicating loading status of application views, electronic device and computer program product
EP2154603A2 (de) Anzeigevorrichtung, Anzeigeverfahren und Programm
KR100950080B1 (ko) 소프트웨어 기능들을 제어하는 방법, 전자 장치, 그리고컴퓨터 프로그램 제품
WO2013003105A1 (en) Electronic device and method with dual mode rear touch pad
EP1735685A1 (de) Method of navigating, electronic device, user interface and computer program product
US20070006086A1 (en) Method of browsing application views, electronic device, graphical user interface and computer program product

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20061017

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LI LT LU MC NL PL PT RO SE SI SK TR

REG Reference to a national code

Ref country code: HK

Ref legal event code: DE

Ref document number: 1094821

Country of ref document: HK

DAX Request for extension of the european patent (deleted)
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20101001

REG Reference to a national code

Ref country code: HK

Ref legal event code: WD

Ref document number: 1094821

Country of ref document: HK