WO2003023593A1 - Navigation method, program product and device for presenting information in a user interface - Google Patents


Info

Publication number
WO2003023593A1
Authority
WO
Grant status
Application
Prior art keywords: display, collection, input device, moving, operating mode
Application number
PCT/FI2002/000285
Other languages
French (fr)
Inventor
Jukka-Pekka METSÄVAINIO
Anna-Leena Hartojoki
Ismo ALAKÄRPPÄ
Riku Hukkanen
Kari PENTTILÄ
Original Assignee
Myorigo Oy


Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING; COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/02: Input arrangements using manually operated switches, e.g. using keyboards or dials
    • G06F 3/023: Arrangements for converting discrete items of information into a coded form, e.g. arrangements for interpreting keyboard generated codes as alphanumeric codes, operand codes or instruction codes
    • G06F 3/0233: Character input methods
    • G06F 3/0236: Character input methods using selection techniques to select from displayed items
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481: Interaction techniques based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/0482: Interaction with lists of selectable items, e.g. menus
    • G06F 3/0483: Interaction with page-structured environments, e.g. book metaphor

Abstract

The present invention concerns a method, a program product and a hand-held device for managing separate collections of information and/or functions with said device. In the invention, user actions with the input device are logically tied to motions on the display. Said collections are preferably organized into a linear stack, the collection order remaining the same all the time. When a user moves, e.g., his/her finger horizontally (along the x-axis) on the touch pad, the collection on the display is changed from one to another depending on the direction of the finger movement. Correspondingly, when a user moves his/her finger vertically (along the y-axis) on the touch pad, the collection on the display is scrolled down/up depending on the direction of the finger movement. While moving within one collection, the operating mode is changed to a second operating mode when a collection and/or function is selected.

Description

NAVIGATION METHOD, PROGRAM PRODUCT AND DEVICE FOR PRESENTING INFORMATION IN A USER INTERFACE

FIELD OF THE INVENTION

The present invention relates to a new and improved user interface, preferably for small devices, e.g. mobile phones.

BACKGROUND OF THE INVENTION

In information technology, the user interface (UI) is everything designed into an information device with which a human being may interact, including the display screen, keyboard, mouse, light pen, the appearance of a desktop, illuminated characters, help messages, and how an application program or a Web site invites interaction and responds to it. In early computers, there was very little user interface except for a few buttons at an operator's console. The user interface was largely in the form of punched card input and report output. Later, a user was provided with the ability to interact with a computer online, and the user interface was a nearly blank display screen with a command line, a keyboard, and a set of commands and computer responses that were exchanged. This command line interface led to one in which menus (lists of choices written in text) predominated. And, finally, the graphical user interface (GUI) arrived, originating mainly in Xerox's Palo Alto Research Center, adopted and enhanced by Apple Computer, and finally effectively standardized by Microsoft in its Windows operating systems.

In general, a user interface consists of software components and physical parts of the equipment that interact with the user of the equipment. In hand-held devices the user interface varies from one device to another. The user interface of a typical mobile phone comprises numeric keys (0-9, *, #) and a certain number of manufacturer-dependent function keys. The user interface of a mobile phone also comprises a display and software components interacting with the display and the user. Pressing a key causes a predefined set of actions.

Several problems arise when the size of the device decreases. The keys require a certain amount of space in order to provide minimum usability. Moreover, the keys often require a larger area than the size of the display. The keys are also prone to different kinds of malfunctions. Above all, the size of a mobile phone depends on the size of the keyboard.

There are naturally also other hand-held devices than mobile phones. Personal Digital Assistants (PDAs) typically comprise a large display area in proportion to the size of the device. The display area is usually touch sensitive so that information can be transferred into the device with a finger or a special tool, e.g. a special pen.

Most laptop computers include a touch pad mouse. The touch-sensitive pad is able to produce control signals similar to those of a conventional computer mouse. Touch pad functionality is represented, e.g., in U.S. Pat. No. 5,995,084.

The present mobile phones usually comprise navigation buttons, a scroll mouse or a keypad. However, there are several drawbacks with these solutions. The navigation button solutions are not logical, because the navigation button actions do not correspond to motion on the display of a device. The user interfaces of mobile phones are tied to mechanical keypads. The software and the mechanical user interface do not cooperate logically. In other words, actions on the display are abstract, floating and separate from common logic. There are also other forms of small phones, here called wrist phones or miniature phones. Many of them use mechanical keypads or other impractical technical solutions. As a result, the user interface becomes distributed: the product consists of separate parts or its form becomes complex.

Some mobile phones or PDAs provide touch-sensitive displays. They are hard to handle at miniature size. Therefore, an assisting device, e.g. a special stick, is required. Moreover, touch-sensitive membranes are expensive, fragile and not accurate enough to be operated with a finger.

SUMMARY OF THE INVENTION

The present invention concerns a method, a program product and a hand-held device for managing separate collections of information and/or functions with a device comprising at least a display. The device comprises at least a housing, electronic circuitry located in said housing, a program product for executing user actions, said program product comprising separate collections of information and/or functions, and a display presenting said actions.

Said collections are displayed on said display. In a preferred embodiment, said collections are organized into a linear stack, the collection order remaining the same all the time. A very important idea of the invention is how the collections are reviewed. Said input device, such as a touch pad, is used as controlling means when separate collections of information and/or functions are gone through. The method comprises the step of moving from one collection to another when detecting first predefined orientation control instructions with said software means from said input device. In other words, the active collection on said display is changed. The active collection (or at least some of the collection) is always seen on the display. Further, the method comprises the step of moving within one collection when detecting second predefined orientation control instructions with said software means from said input device. In other words, when a desired collection is active (on display), the contents of said collection can be reviewed. All the above actions are executed within a basic operating mode. In a preferred embodiment, only one collection is visible at a time on said display. In a preferred embodiment, there is at least one collection.

While moving within one collection, the second operating mode is entered when one collection and/or function is selected. The second operating mode is preferably called an edit operating mode. Correspondingly, the first operating mode is referred to as the basic operating mode.

Causing a horizontal movement with said input device corresponds to the first predefined orientation control instructions. Moreover, causing a vertical movement with said input device corresponds to the second predefined orientation control instructions. The input device is preferably a touch pad. Therefore, when the user moves, e.g., his/her finger on the pad in a horizontal direction (x-axis), the display presents collections as long as the user moves his/her finger or until the page stack ends. Correspondingly, when the user moves, e.g., his/her finger on the pad in a vertical direction (y-axis), the display presents the contents of a collection.

The present invention has several advantages over the prior-art solutions. A touch pad in a 2-axial, 2-mode user interface of a mobile device is a compact combination, which makes it possible to control all functions of the device. The mechanical control area is small but multi-purpose. No other mechanical keys are required for controlling. This means that the whole control system can be software based. Moreover, the whole user interface is changeable, for example for ODM manufacturers. This also means that the whole phone is transformable by software changes.
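The navigation described above can be sketched in code. The following is a minimal, hypothetical model (the class and field names are illustrative, not part of the invention as claimed): horizontal motion on the touch pad browses between collections of a linear stack, vertical motion scrolls within the active collection.

```python
class PageStack:
    def __init__(self, pages):
        self.pages = pages        # list of pages, each a list of rows
        self.active = 0           # index of the collection shown on the display
        self.scroll = 0           # vertical scroll offset within the active page

    def move(self, dx, dy):
        """Apply a touch-pad motion (dx, dy) in the basic operating mode."""
        if abs(dx) >= abs(dy):    # mostly horizontal: browse between pages
            step = 1 if dx > 0 else -1
            # stop at the ends of the stack, as a book stops at its covers
            self.active = max(0, min(len(self.pages) - 1, self.active + step))
            self.scroll = 0
        else:                     # mostly vertical: scroll within the page
            step = 1 if dy > 0 else -1
            limit = len(self.pages[self.active]) - 1
            self.scroll = max(0, min(limit, self.scroll + step))

stack = PageStack([["Messages", "Inbox", "Outbox"],
                   ["Call register", "Dialed", "Received"],
                   ["Calendar"]])
stack.move(dx=5, dy=1)    # mostly horizontal: browse to the next page
stack.move(dx=0, dy=3)    # vertical: scroll down within "Call register"
print(stack.active, stack.scroll)  # → 1 1
```

Note how the stack clamps at its ends rather than wrapping around, matching the book metaphor with a fixed starting and ending page.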

The present invention provides a solution which is very ergonomic. The motion of the fingers is minimal, and there are no difficult finger motion paths and no continuous clicking of buttons when browsing. When the user interface is based on software, small motions on the touch pad can control a large control area on the display.

There are also several mechanical benefits with the present invention. The device becomes very compact, there is no need for a backlight for the keypad, and above all, there are fewer failures in mechanics. It is a common problem that mobile phones with a keypad usually do not tolerate water. In the present invention, there is no keypad, so the mobile phone can be constructed to be at least splash proof. With the present invention it is possible to make drawing commands with the device, e.g. draw icons, figures, etc. Controlling game applications also becomes easier.

The distance between keys in a keypad is limited in conventional phones. In the present invention, the distance between selection points can be lengthened in software, because the touch pad as an input device uses motion for selecting active points (no direct location pointing).

BRIEF DESCRIPTION OF THE DRAWINGS

The accompanying drawings, which are included to provide a further understanding of the invention and constitute a part of this specification, illustrate embodiments of the invention and together with the description help to explain the principles of the invention. In the drawings:

Fig 1 is a block diagram illustrating information arranged as a page stack,

Figs 2 - 4 are block diagrams illustrating the handling of the page stack of Figure 1 on a display,

Fig 5 is a block diagram illustrating the structure of the 2-axial user interface,

Fig 6a is a block diagram illustrating the two-mode (basic and edit operating mode) functionality,

Fig 6b is a block diagram illustrating an approving mechanism,

Fig 6c is a block diagram illustrating a feeding mechanism with the input device,

Figs 7a - 7d are block diagrams illustrating the functionality of the input device of the invention,

Fig 8 schematically shows the essential parts of a radio frequency communication device for communication with a cellular or cordless network,

Fig 9 illustrates an example of typing a PIN code,

Fig 10 illustrates an example of making a call,

Fig 11 illustrates an example of speed dialing,

Fig 12 illustrates an example of making a call,

Fig 13 illustrates an example of finding a number in the phone book and making a call,

Fig 14 illustrates an example of finding a number in the phone book and making a call,

Fig 15 illustrates an example of accepting a call,

Fig 16 illustrates an example of canceling a call,

Figs 17 and 18 illustrate an example of opening the menu files,

Figs 19 and 20 illustrate an example of browsing the menu files,

Figs 21 - 29 illustrate examples of sending a message,

Fig 30 illustrates an example of entering a PIN code,

Fig 31 illustrates an example of making a call, and

Figs 32 - 38 illustrate examples of sending a message.

DETAILED DESCRIPTION OF THE INVENTION

Reference will now be made in detail to the embodiments of the present invention, examples of which are illustrated in the accompanying drawings.

Figure 1 represents a preferred embodiment of the separate collections of information and/or functions. In Figure 1, the term collection is equivalent to the term a page of a stack. There are a few simple conditions concerning the structure of the stack STK. The stack STK consists of 1 to N pages. The length of an individual page in the stack STK is not limited. However, the width of a page is limited to the width of the display view DW. The stack STK and the display DW are part of a device. The device is not described in more detail in this example; it is, e.g., a mobile phone or a PDA. The main idea for presenting information is that the movement of the display view is 2-axial. Pages of the stack are scrolled along the y-axis and browsed along the x-axis (moving from one page to another within the stack). A single page can be regarded as a list, which is scrolled on the display view DW.

Figures 2 - 4 are block diagrams illustrating the handling of the page stack of Figure 1 on a display. Figure 2 represents a simplified example of a user interface. The device DT comprises a display DISP and a touch pad PD with which a user of the device can control the device. Scrolling and browsing of the page stack of Figure 1 is done by sliding a finger on the touch pad PD. When a finger is moved horizontally on the touch pad PD, a page is browsed from one to another.

There is a logical, user-friendly connection between the finger movement on the touch pad PD and the view on the display DISP. Motion on the display DISP corresponds to the motion of a finger. In other words, when a user moves his/her finger right/left on the touch pad PD, the page also slides to the next/previous page (see Figure 3). This makes the user interface simple and easy to handle. The turning of a page is preferably animated, which makes the action even more concrete. Correspondingly, when a user moves his/her finger down/up on the touch pad PD, the page also scrolls down/up (see Figure 4). Although it is presented above that a touch pad is used as an input device, any other appropriate device can be used. A possible input device could be, e.g., a navigation key with which more than four directions can be indicated.

Figure 5 represents a block diagram illustrating the structure of the 2-axial user interface. The page stack (consisting of pages P1...PN) has a starting page and an ending page. Further, a page of the stack has a starting point and an ending point. The broken line above the "Contents of page 1...N" refers to the actual visual display area the user sees when pages are browsed from one to another. The pages are arranged so that horizontal movements (x-axial) on the touch pad PD of Figure 2 cause movements between pages. Correspondingly, vertical movements (y-axial) on the touch pad PD of Figure 2 cause movements within a page. The structure of the stack and the functionality of the stack are simple and easy to comprehend. Logically, the stack of pages can be considered as a book. A book has a starting point and an ending point, and it is browsed by turning pages (x-axial movements). When a certain page is to be reviewed more carefully, the page is gone through vertically (y-axial movements).

In one embodiment of Figure 5, the page stack (consisting of pages P1...PN) represents different software functions (choices) of the menu of a mobile phone. Page P1 represents for example the Messages menu, page P2 the Call register, P3 the Calendar menu, etc. A page can be scrolled downwards. In the case of the Messages menu, a list of choices appears: Inbox, Outbox, Write messages, Message settings, Voice messages, etc.

Figure 6a represents a block diagram illustrating the first and the second operating mode (basic and edit operating mode) functionality. The example represented in Figure 6a comprises a page PN of a page stack. The area inside the thick line represents the current display view DW that can be scrolled with y-axial movements on the touch pad PD of Figure 2. The user interface of the present invention preferably uses a static cursor/selective area SEL. Data is scrolled and browsed under the cursor/selective area SEL. The cursor/selective area SEL is highlighted or otherwise pointed out. The user interface of the present invention operates in two modes:

1. Basic operating mode. Pages can be browsed and scrolled; operations can be selected from a page.

2. Edit operating mode. For specified selected operations which require modifying (number/letter feeding, editing, saving, etc.).
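The two operating modes can be sketched as a small state machine. This is an illustrative model only (the names `ModeController`, `bar_pos`, etc. are assumptions, not the patent's terminology): a tap enters the edit operating mode, in which horizontal motion is reinterpreted as scrolling the edit bar instead of browsing pages.

```python
BASIC, EDIT = "basic", "edit"

class ModeController:
    def __init__(self):
        self.mode = BASIC
        self.bar_pos = 0          # position in the edit-mode scroll bar

    def tap(self):
        # a tap toggles the mode: select an operation and enter edit
        # mode, or confirm and return to basic mode
        self.mode = EDIT if self.mode == BASIC else BASIC

    def horizontal_motion(self, step):
        """The same x-axial gesture means different things per mode."""
        if self.mode == BASIC:
            return "browse pages"            # x-axis browses the stack
        self.bar_pos += step                 # x-axis scrolls the edit bar
        return "scroll edit bar"

ctrl = ModeController()
print(ctrl.horizontal_motion(1))   # → browse pages
ctrl.tap()                         # select an operation: enter edit mode
print(ctrl.horizontal_motion(1))   # → scroll edit bar
```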

The page PN on the left is in the basic operating mode. The edit operating mode turns on when a specified operation is selected, e.g. by tapping the touch pad PD of Figure 2. The view on the display changes, e.g., to display view DW2 or DW3. The edit operating mode contains a narrow, horizontal (x-axial) scroll bar (e.g. scroll bar DIG or ALP) on the display view. The contents of the scroll bar are determined by the selected operation. The scroll bar can contain, for example:

• A menu of commands (edit, save, remove, dial, etc.).

• Numbers (1 2 3 4 5 6 7 8 9 0 * #).

• Characters for entering text.

At a specified static point of the scroll bar there is a cursor/selection point, e.g. SEL1 or SEL2. The menu/number/character bar is scrolled under the selection point along the x-axis. In the edit operating mode, the x-axial control motion (on the touch pad) is defined only for scrolling the scroll bar. In other words, pages cannot be browsed in the edit operating mode. The length of the information may exceed the area of the display DW3. In such a case, the user can scroll the information with vertical movements on the touch pad. The scroll bar (e.g. ALP), however, remains visible on the display view DW3 all the time.

Figure 6b represents a block diagram illustrating an approving mechanism. When a user wishes to go back to the basic operating mode, he/she selects OK or C. The OK and C commands can be located, e.g., at the corners of the display view DW4. Diagonal (e.g. 45 degrees angle) movements on the touch pad mean activating the C or OK command. Which command is activated depends on the direction of the movement on the touch pad. A tap on the touch pad confirms the desired action.
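The approving mechanism can be sketched as a direction test. In this hypothetical sketch (the corner assignment and the tolerance value are assumptions for illustration), a roughly diagonal motion activates OK or C depending on whether it points toward the down-right or down-left corner:

```python
import math

def diagonal_command(dx, dy, tolerance_deg=15):
    """Return 'OK', 'C', or None for a touch-pad motion (dx, dy).

    Screen convention: y grows downwards. A down-right diagonal
    activates OK, a down-left diagonal activates C (illustrative).
    """
    if dx == 0 and dy == 0:
        return None
    angle = math.degrees(math.atan2(dy, dx))   # -180..180, 0 = right
    if abs(angle - 45) <= tolerance_deg:       # toward down-right corner
        return "OK"
    if abs(angle - 135) <= tolerance_deg:      # toward down-left corner
        return "C"
    return None

print(diagonal_command(10, 10))    # → OK
print(diagonal_command(-10, 10))   # → C
print(diagonal_command(10, 0))     # → None (purely horizontal)
```

A separate tap would then confirm the highlighted command, as the text describes.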

Figure 6c is a block diagram illustrating a feeding mechanism with the input device. In Figure 6c, a virtual key grid is presented on the display view DW5. The key grid is formed in the same way as the normal keys of a mobile phone and contains the same operations (numbers, characters, OK, C, etc.) as in a mobile phone. The key grid is used and controlled with the input device PD, e.g. a touch pad. The key grid can be much smaller than normal keys, because the distance between virtual keys can be made larger programmatically (no direct location pointing), as is represented in the next example.

Figures 7a - 7d represent block diagrams illustrating the functionality of the input device of the invention. Each of the figures comprises a display view DW and a touch pad PD. Figures 7a - 7d illustrate the variable distances of selection points controlled by motion on the touch pad PD and the program-determined distance. Figures 7a and 7b clarify the term "program-determined distance". The finger motion length in both figures is marked as N. However, the distance between selection points in Figure 7a is long, whereas the distance between selection points in Figure 7b is short.

The control can also be supported with an acceleration factor. Figures 7c and 7d clarify this feature. The finger motion length in both figures is marked as N. However, because in Figure 7c the motion rate of the finger is slow, selection points move one by one (e.g. the phonebook scrolls name by name). In Figure 7d the motion rate of the finger is fast, so selection points move by steps or groups (e.g. the phonebook scrolls from one alphabetical group to another).

Figure 8 schematically shows the most important parts of a preferred embodiment of a radio frequency communication device, e.g. a portable phone. The preferred embodiment of the phone of the invention is adapted for use in connection with the Global System for Mobile Communication (GSM) network, but, of course, the invention may also be applied in connection with other phone networks, such as cellular networks and various forms of cordless phone systems. The microphone 7 records the user's speech, and the analog signals formed thereby are A/D converted in an A/D converter 8 before the speech is encoded in an audio codec unit 4. The encoded speech signal is transferred to a physical layer processor 3, which e.g. supports the GSM terminal software. The processor 3 also forms the interface to the peripheral units of the apparatus, including the memories 10 (RAM, ROM), the display 12 and the touch pad 11 (as well as SIM, data, power supply, etc.). The memories comprise the program of computer-readable instructions performing the method presented in this invention.
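The acceleration factor described with Figures 7c and 7d can be sketched as follows. The speed threshold and group size below are illustrative assumptions; the point is that the same finger travel N advances item by item when slow and by whole groups (e.g. alphabetical sections) when fast:

```python
def scroll_step(distance, duration, group_size=10, fast_speed=5.0):
    """Return how many list entries a touch-pad stroke should advance."""
    speed = abs(distance) / duration          # e.g. millimetres per second
    if speed >= fast_speed:
        # fast motion: jump by groups (e.g. alphabetical sections)
        return group_size * (1 if distance > 0 else -1)
    # slow motion: advance one entry per stroke
    return 1 if distance > 0 else -1

print(scroll_step(distance=20, duration=10))   # slow stroke → 1
print(scroll_step(distance=20, duration=1))    # fast stroke → 10
```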

The processor 3 communicates with the RF part 1 via a baseband converter 2 and a channel equalizer 13. The audio codec unit 4 speech-decodes the signal, which is transferred from the processor 3 to the earpiece 6 via a D/A converter 5. The units 2, 3, 5, 8, 9 and 13 are usually integrated in a chip set, either a commercially available one or a set of specially designed chips (ASICs). The processor 3, which serves as the controller unit in a manner known per se in the preferred embodiment, is connected to the user interface. Thus, it is the processor which monitors the activity in the phone and controls the display 12 in response thereto. Therefore, it is the processor 3 which detects the occurrence of a state change event and changes the state of the phone and thus the display text or graphics. A state change event may be caused by the user when he/she operates the touch pad 11.

The present invention has many advantages over the prior-art solutions. The motion on the display corresponds to the motion on the surface of the input device. Therefore, it is easy to handle. Motions resemble real-world operations and, above all, they are easy to learn. The linear 2-axial user interface is clear; it has a starting and an ending point. The combination of the 2-axial, 2-mode user interface suits best and is designed for extremely small mobile devices, such as miniature mobile phones and PCs. A same-sized device with mechanical keys cannot be created without a radical drop in usability.

The method of the present invention comprises the steps of moving from one collection to another when detecting first predefined orientation control instructions with said software means from said input device, and moving within one collection when detecting second predefined orientation control instructions with said software means from said input device. The first and second predefined orientation control instructions preferably refer to horizontal and vertical movements on the touch pad. However, it is clear that it is very hard to produce a precisely horizontal or vertical movement on the touch pad. Therefore, horizontal and vertical here actually mean that the movements are mostly horizontal or vertical. It can also be determined that the direction (horizontal or vertical) is decided by which component (x-axial component or y-axial component) is dominating. The term diagonal used in the previous example refers preferably to a direction in which the x-axial and the y-axial factors are close to each other.
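The direction rule above can be sketched as a three-way classifier: the motion is classed by whichever axis component dominates, and as diagonal when the two components are close to each other. The ratio threshold is an illustrative assumption, not a value given in the text:

```python
def classify_motion(dx, dy, diagonal_ratio=0.6):
    """Class a touch-pad motion as 'horizontal', 'vertical' or 'diagonal'."""
    ax, ay = abs(dx), abs(dy)
    if ax == 0 and ay == 0:
        return "none"
    # components close in magnitude: treat as diagonal (OK/C gestures)
    if min(ax, ay) >= diagonal_ratio * max(ax, ay):
        return "diagonal"
    return "horizontal" if ax > ay else "vertical"

print(classify_motion(10, 1))    # → horizontal
print(classify_motion(2, 9))     # → vertical
print(classify_motion(8, 7))     # → diagonal
```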

Figure 9 represents an example where a personal identification number (PIN) code is entered into a mobile phone. A linear number selector appears at the top of the display. The desired number is chosen by moving a finger right or left on the touch pad of the mobile phone. The desired number is accepted by tapping the touch pad.

Figure 10 represents an example of making a phone call. When the user moves his/her finger left on the touch pad, a call menu appears. The call menu is browsed by moving the finger down. The "Dial" text is placed under the static cursor. All the actions so far have been activated while being in the first operating mode. The second operating mode is entered by accepting the dialing by tapping the touch pad. A linear number selector appears at the top of the display. The wanted number is selected by moving the finger right or left on the touch pad of the mobile phone. The desired number is accepted by tapping the touch pad. An unwanted number can be removed by selecting the arrow symbol from the linear number selector. The dialed numbers are accepted by moving the finger to the down-right corner. An OK symbol appears. The dialed number is accepted by tapping the touch pad. A linear action selector appears at the top of the display with the dialed number. The calling is accepted by tapping the touch pad. Exit to the previous window can be done by moving the finger to the down-left corner. A C symbol activates. The exiting is accepted by tapping the touch pad.

Figure 11 represents an example of making a speed dial phone call. When the user moves his/her finger left on the touch pad, a call menu appears. The speed dialing is activated by tapping the touch pad. A linear number selector appears at the top of the display. A saved number is selected by moving the finger left or right on the touch pad. A number is dialed by tapping the touch pad. The dialed number is accepted by moving the finger to the down-right corner. An OK symbol appears. The dialed number is accepted by tapping the touch pad. The linear action selector appears at the top of the display with the dialed number. The calling is accepted by tapping the touch pad. Exit to the previous window can be done by moving the finger to the down-left corner. A C symbol activates. The exiting is accepted by tapping the touch pad.

Figure 12 represents another example of making a phone call. When the user moves his/her finger left on the touch pad, a call menu appears. The call menu is browsed by moving the finger down. The "Dialed numbers" text is placed under the static cursor. All the actions so far have been activated while being in the first operating mode. The second operating mode is entered by tapping the touch pad. A previous calls list appears on the display. The list is browsed by moving the finger up or down. A previous call is accepted by tapping the touch pad. A linear action selector appears at the top of the display with the dialed number. The calling is accepted by tapping the touch pad. Exit to the previous window can be done by moving the finger to the down-left corner. A C symbol activates. The exiting is accepted by tapping the touch pad.

Figure 13 represents another example of selecting a name from the phone book. When the user moves his/her finger down, the names/numbers can be found. A phone book menu appears. Phone book names and numbers can be browsed by moving a finger down or up. A desired name is accepted by tapping the touch pad. A linear action selector appears at the top of the display with the selected name. The calling is accepted by tapping the touch pad. Exit to the previous window can be done by moving the finger to the down-left corner. A C symbol activates. The exiting is accepted by tapping the touch pad.

Figure 14 represents an example of finding a name in the phone book. When the user moves his/her finger down, the phone book menu appears. The "Find" alternative is selected by tapping the touch pad. A phone book editor with a linear letter selector appears at the top of the display. A letter is accepted by tapping the touch pad (1-4 times). The first name in alphabetical order appears on the display. The names are browsed by moving the finger down or up. A desired name is accepted by tapping the touch pad. A linear action selector appears at the top of the display with the selected name. The calling is accepted by tapping the touch pad. Exit to the previous window can be done by moving the finger to the down-left corner. A C symbol activates. The exiting is accepted by tapping the touch pad.

Figure 15 represents an example of accepting a call. An OK symbol activates on the linear action selector at the top of the display. The call is accepted by pressing the touch pad.

Figure 16 represents an example of refusing a call. The finger is moved to the left. A C symbol activates on the linear action selector on the top of the display. The call is cancelled by pressing the touch pad.

Figures 17 and 18 represent an example of browsing the menu. When moving a finger to the left in the first operating mode, the call menu appears. The menu is browsed by moving the finger left or right on the touch pad.

Figures 19 and 20 represent an example of browsing the menu files. The menu files are browsed by moving a finger up or down on the touch pad. The desired function is selected by tapping the touch pad.

Figures 21 - 29 represent examples of sending a message. The message menu is browsed by moving a finger down. The "write message" is selected by tapping the touch pad. A message editor with a linear letter selector appears on the top of the display (figures 21 and 22). A wanted letter is selected from the linear letter selector by moving the finger left or right on the touch pad. A desired letter is accepted by tapping the touch pad. The T9 dictionary can be used in the message writing. A letter can be removed by tapping the arrow symbol in the linear selector. The M symbol represents the letter mode editor (figure 23). The written text can be browsed by moving the finger up or down on the touch pad. The OK and C symbols disappear during the operation (figure 24). When the write mode is to be changed, the M symbol is selected from the linear selector. The mode editor is activated by tapping the touch pad. Now, a linear selector with a new action description appears. A wanted action is selected by moving the finger left or right on the touch pad. An action is accepted by tapping the touch pad. A linear selector with the wanted action marks appears. A wanted mark is selected by moving the finger left or right on the touch pad. The mark is accepted by tapping the touch pad. Exit to the previous window is done by moving the finger to the down-left corner. A C symbol activates and the exiting is accepted by tapping the touch pad (figure 25).
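The linear letter selector described in these examples can be sketched as a small model. This is an illustrative sketch, not code from the patent; the names (`LinearSelector`, `move`, `tap`) are assumptions introduced here.

```python
class LinearSelector:
    """A one-row selector browsed by horizontal finger movement.

    The cursor is static: moving the finger left or right shifts the
    item row under it, and a tap on the touch pad accepts the
    highlighted item.
    """

    def __init__(self, items):
        self.items = list(items)
        self.index = 0  # item currently under the static cursor

    def move(self, direction):
        """direction: -1 for a leftward movement, +1 for rightward."""
        self.index = (self.index + direction) % len(self.items)

    def tap(self):
        """A tap on the touch pad accepts the highlighted item."""
        return self.items[self.index]

# Browsing the alphabet and accepting a letter:
selector = LinearSelector("ABCDEFGHIJKLMNOPQRSTUVWXYZ")
selector.move(+1)
selector.move(+1)
assert selector.tap() == "C"
```

The wrap-around modulo keeps browsing continuous, matching the impression of an endless row of letters under a fixed cursor.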

When a message is ready to be sent, a finger is moved to the down-right corner. An OK symbol then activates. The action is accepted by tapping the touch pad. A linear selector with a new action description appears. Sending the written message is a default value. The sending is accepted by tapping the touch pad (figure 26). A dialing editor with numbers appears. The wanted number is selected by moving the finger left or right on the touch pad. The numbers are dialed by tapping the touch pad. A number can be removed by tapping the arrow symbol in the linear selector. The dialed number is accepted by moving the finger to the down-right corner. An OK symbol appears. The message sending is accepted by tapping the touch pad. The exiting to the previous window is done by moving the finger to the down-left corner. A C symbol activates and the exiting is accepted by tapping the touch pad (figure 27).

When sending a message by name, a finger is moved to the down-right corner. An OK symbol activates. The action is accepted by tapping the touch pad. A linear selector with a new action description appears. A wanted action is selected by moving the finger left or right on the touch pad. Sending the written message is a default value. The sending is accepted by tapping the touch pad (figure 28). A dialing editor with numbers appears. The "find" action is selected by tapping the touch pad. The phone book name list with a linear letter selector appears on the display. An initial letter of the wanted name is selected by moving the finger left or right on the touch pad. The letter is accepted by tapping the touch pad (1-4 times). The first possibility in alphabetical order appears. The right name is chosen by moving the finger down or up on the touch pad. The sending is accepted by moving the finger to the down-right corner. An OK symbol appears. The message sending is accepted by tapping the touch pad. The exiting to the previous window can be done by moving the finger to the down-left corner. A C symbol activates and the exiting is accepted by tapping the touch pad (figure 29).

Figure 30 represents an example of typing a Personal Identification Number (PIN). A squared keyboard with numbers appears. The right number can be chosen by moving a finger on the touch pad. Numbers are dialed by tapping or pressing the touch pad. In figure 30, moving within the numbers can be done in two ways. The first is to move the finger on the touch pad only to the left or to the right. In this solution the numbers are run through only in horizontal rows. The second solution is that the finger can be moved either horizontally or vertically on the touch pad.
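The two movement solutions for the squared keyboard can be sketched as follows. This is an illustrative sketch under assumed names (`KeyGrid`, `move_horizontal`, `move_2d`) not taken from the patent.

```python
class KeyGrid:
    """A 3x4 phone keypad browsed by finger movement on the touch pad."""

    ROWS = [["1", "2", "3"],
            ["4", "5", "6"],
            ["7", "8", "9"],
            ["*", "0", "#"]]

    def __init__(self):
        self.row, self.col = 1, 1  # cursor starts under the number five

    def move_horizontal(self, dx):
        """First solution: left/right movement only; the cursor runs
        through the numbers row by row, wrapping to the next row."""
        cols = len(self.ROWS[0])
        pos = (self.row * cols + self.col + dx) % (cols * len(self.ROWS))
        self.row, self.col = divmod(pos, cols)

    def move_2d(self, dx, dy):
        """Second solution: the finger may move either horizontally or
        vertically; the cursor is clamped to the grid edges."""
        self.col = max(0, min(len(self.ROWS[0]) - 1, self.col + dx))
        self.row = max(0, min(len(self.ROWS) - 1, self.row + dy))

    def tap(self):
        """Tapping or pressing the touch pad dials the highlighted number."""
        return self.ROWS[self.row][self.col]

grid = KeyGrid()
grid.move_horizontal(+1)   # from "5" to "6"
assert grid.tap() == "6"
grid.move_2d(0, +1)        # down one row, to "9"
assert grid.tap() == "9"
```

The first solution needs only one axis of input but takes more steps; the second maps both touch-pad axes directly onto the grid.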

Figure 31 represents an example of making a call. When the user moves his/her finger left on the touch pad, a call menu appears. The call menu is browsed by moving the finger down. The "Dial" text is placed under the static cursor. All the actions so far have been activated while being in the first operating mode. The second operating mode is entered by accepting the dialing by tapping the touch pad. A squared keyboard with numbers appears. A wanted number is selected by moving the finger. In one embodiment, in the beginning of the number selection the cursor is always under the number five. When a number is selected, the cursor is taken back to the number five. Alternatively, when a number is selected the cursor remains under the latest selected number. A number is dialed by tapping the touch pad. The dialed numbers are accepted by moving the finger to the down-right corner. An OK symbol appears. The dialing is accepted by tapping the touch pad. The linear action selector appears on the top of the display with the dialed number. The calling is accepted by tapping the touch pad. Exit into the previous window can be done by moving the finger to the down-left corner. A C symbol activates. The exiting is accepted by tapping the touch pad.
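The two cursor embodiments mentioned above (returning under the number five after each selection versus remaining under the latest selected number) can be sketched as alternative policies; the function name and parameter below are illustrative assumptions.

```python
def dial(selections, reset_to_five=True):
    """Dial a sequence of keypad selections.

    reset_to_five=True models the embodiment where the cursor is
    taken back under the number five after each selection;
    False models the cursor remaining under the latest number.
    Returns the dialed string and the final cursor position.
    """
    cursor = "5"  # the cursor always starts under the number five
    dialed = []
    for number in selections:
        dialed.append(number)  # a tap on the touch pad dials the number
        cursor = "5" if reset_to_five else number
    return "".join(dialed), cursor

assert dial(["1", "2"]) == ("12", "5")
assert dial(["1", "2"], reset_to_five=False) == ("12", "2")
```

The first policy gives every selection the same starting point; the second favors digits adjacent to the last one dialed.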

Figures 32 - 38 represent examples of sending a message. The message menu is browsed by moving the finger down. The "write message" is selected by tapping the touch pad. A message editor with squared number and letter selectors appears (figure 32). The wanted letter is selected by moving the finger on the touch pad. The selection is accepted by tapping the touch pad. A letter or a number can be removed by tapping the arrow symbol (figure 33). To browse the written text, the finger is moved to the down-right corner. An OK symbol appears. The written text is accepted by tapping the touch pad. The text can be browsed by moving the finger up or down on the touch pad. The OK and C symbols disappear during the operation (figure 34).

When a message is to be sent, a finger is moved to the down-right corner. An OK symbol then activates. The written text is accepted by tapping the touch pad. A linear selector with a new action description appears. Sending the written message is a default value. The sending is accepted by tapping the touch pad (figure 35). A linear selector with new action descriptions appears. The "dial" text is selected from the linear selector by moving the finger right or left. The action is accepted by tapping the touch pad. A squared keyboard with numbers appears. A wanted number is selected by moving the finger. In one embodiment, in the beginning of the number selection the cursor is always under the number five. When a number is selected, the cursor is taken back to the number five. Alternatively, when a number is selected the cursor remains under the latest selected number. A number is dialed by tapping the touch pad. The dialed numbers are accepted by moving the finger to the down-right corner. An OK symbol appears. The dialing is accepted by tapping the touch pad. The linear action selector appears on the top of the display with the dialed number. Exit into the previous window can be done by moving the finger to the down-left corner. A C symbol activates. The exiting is accepted by tapping the touch pad (figure 36).

When sending a message by name, a finger is moved to the down-right corner. An OK symbol then activates. The action is accepted by tapping the touch pad. A linear selector with a new action description appears. A wanted action is selected by moving the finger left or right on the touch pad. Sending the written message is a default value. The sending is accepted by tapping the touch pad (figure 37). A linear selector with a new action description appears. The "Find" action is selected by tapping the touch pad. The phone book name list with a linear letter selector appears on the display. An initial letter of the wanted name is selected by moving the finger left or right on the touch pad. The letter is accepted by tapping the touch pad (1-4 times). The first possibility in alphabetical order appears. The right name is chosen by moving the finger down or up on the touch pad. The sending is accepted by moving the finger to the down-right corner. An OK symbol appears. The message sending is accepted by tapping the touch pad. The exiting to the previous window is done by moving the finger to the down-left corner. A C symbol activates and the exiting is accepted by tapping the touch pad (figure 38).

It is obvious to a person skilled in the art that with the advancement of technology, the basic idea of the invention may be implemented in various ways. The invention and its embodiments are thus not limited to the examples described above; instead, they may vary within the scope of the claims.
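The first and second operating modes running through the examples above can be summarized as a small state machine. This is a sketch under assumed names (`Navigator`, `horizontal`, `vertical`, `select`); none of these identifiers come from the patent.

```python
class Navigator:
    """Two-mode navigation over separate collections.

    In the first operating mode, horizontal movement (the first
    predefined orientation) moves from one collection to another and
    vertical movement (the second predefined orientation) moves within
    one collection. Selecting an item changes the operating mode to
    the second operating mode.
    """

    def __init__(self, collections):
        self.collections = collections  # name -> list of items
        self.names = list(collections)
        self.current = 0   # index of the visible collection
        self.item = 0      # position within the visible collection
        self.mode = 1      # first operating mode

    def horizontal(self, dx):
        """First predefined orientation: switch collections."""
        if self.mode == 1:
            self.current = (self.current + dx) % len(self.names)
            self.item = 0

    def vertical(self, dy):
        """Second predefined orientation: move within one collection."""
        items = self.collections[self.names[self.current]]
        self.item = max(0, min(len(items) - 1, self.item + dy))

    def select(self):
        """A tap selects the item and enters the second operating mode."""
        self.mode = 2
        return self.collections[self.names[self.current]][self.item]

nav = Navigator({"calls": ["Dial", "Dialed numbers"],
                 "messages": ["Write message", "Inbox"]})
nav.horizontal(+1)   # move to the "messages" collection
nav.vertical(+1)     # move down within it
assert nav.select() == "Inbox"
assert nav.mode == 2
```

The sketch mirrors claim 1: the two orientations are kept orthogonal, and only a selection, never mere movement, changes the operating mode.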

Claims

1. A method for managing separate collections of information and/or functions with a device comprising at least a display, an input device and software means for receiving controlling instructions from said input device for controlling said display, characterized in that while being in a first operating mode, the method comprises the steps of: displaying said collections on said display; moving from one collection to another when detecting first predefined orientation control instructions with said software means from said input device; moving within one collection when detecting second predefined orientation control instructions with said software means from said input device; and while moving within one collection, changing the operating mode to a second operating mode when one collection and/or function is selected.
2. The method according to claim 1, characterized in that said collections are organized into a linear stack, the collection order remaining the same all the time.
3. The method according to claim 1 or 2, characterized in that only one collection is visible at a time on said display.
4. The method according to any of the claims 1, 2 or 3, characterized in that scrolling one collection vertically on said display when detecting second predefined orientation control instructions with said software means from said input device.
5. The method according to any of the claims 1, 2, 3 or 4, characterized in that when being in said second operating mode, the method further comprises the steps of: forming a menu on said display, said menu being possibly wider than said display area; moving within said menu when detecting first predefined orientation control instructions with said software means from said input device; and selecting a menu item of said menu with said input device.
6. The method according to claim 5, characterized in that when being in said second operating mode, the method further comprises the steps of: scrolling one collection vertically on said display when detecting second predefined orientation control instructions with said software means from said input device, said menu being visible all the time on said display; and selecting a menu item of said menu with said input device, the selection becoming visible on said display.
7. The method according to any of the claims 1, 2, 3 or 4, characterized in that when being in said first operating mode, the method further comprises the step of: while moving vertically within one collection, moving from one collection to another when detecting first predefined orientation control instructions with said software means from said input device.
8. The method according to claim 5 or 6, characterized in that when being in said second operating mode, the method further comprises the steps of: displaying a key grid on said display; setting one of the keys as the active key; moving among key grid items based on the orientation control instructions of said input device; and selecting a key grid item by tapping said input device.
9. The method according to any of the claims 1, 2, 3, 4, 5, 6, 7 or 8, characterized in that the width of one collection is determined by the width of said display.
10. The method according to any of the claims 1, 2, 3, 4, 5, 6, 7, 8 or 9, characterized in that the length of one collection is not limited.
11. The method according to any of the claims 1, 4, 5, 6, or 7, characterized in that: causing a horizontal movement with said input device corresponds to first predefined orientation control instructions; and causing a vertical movement with said input device corresponds to second predefined orientation control instructions.
12. The method according to any of the claims 1, 5, 6, 7, 8 or 11, characterized in that adjusting the speed of said horizontal or vertical movements on said display in said first or said second operating mode on the basis of the acceleration factor of the movement on said input device.
13. The method according to any of the claims 1, 5 or 6, characterized in that said selection of an item in said first or said second operating mode is indicated by tapping said input device.
14. The method according to claim 5 or 6, characterized in that in said second operating mode: rejecting or approving information on said display by a third or fourth predefined orientation movement; and confirming the action by tapping said input device.
15. The method according to any of the claims 1 - 14, characterized in that said input device is a touch pad.
16. A program product arranged in a device that executes the program steps recorded in a computer-readable medium to manage separate collections of information and/or functions with said device, said device comprising at least a display and an input device for controlling said device, characterized in that the program product comprises: a recordable medium; and a program of computer-readable instructions to perform the method comprising the steps of: displaying said collections on said display; moving from one collection to another when detecting first predefined orientation control instructions from said input device; moving within one collection when detecting second predefined orientation control instructions from said input device; and while moving within one collection, changing the operating mode to a second operating mode when one collection and/or function is selected.
17. The program product according to claim 16, characterized in that said collections are organized into a linear stack, the collection order remaining the same all the time.
18. The program product according to claim 16 or 17, characterized in that only one collection is visible at a time on said display.
19. The program product according to any of the claims 16, 17 or 18, characterized in that scrolling one collection vertically on said display when detecting second predefined orientation control instructions with said software means from said input device.
20. The program product according to any of the claims 16, 17, 18 or 19, characterized in that when being in said second operating mode: forming a menu within said display, said menu being possibly wider than said display area; moving within said menu when detecting first predefined orientation control instructions from said input device; and selecting a menu item of said menu with said input device.
21. The program product according to claim 20, characterized in that when being in said second operating mode: scrolling one collection vertically when detecting second predefined orientation control instructions from said input device, said menu being visible all the time on said display; and selecting a menu item of said menu with said input device, the selection becoming visible on said display.
22. The program product according to any of the claims 16, 17, 18 or 19, characterized in that when being in said first operating mode: while moving vertically within one collection, moving from one collection to another when detecting first predefined orientation control instructions from said input device.
23. The program product according to claim 20 or 21, characterized in that when being in said second operating mode, the method further comprises the steps of: displaying a key grid on said display; setting one of the keys as the active key; moving among key grid items based on the orientation control instructions of said input device; and selecting a key grid item by tapping said input device.
24. The program product according to any of the claims 16, 17, 18, 19, 20, 21, 22 or 23, characterized in that the width of one collection is determined by the width of said display.
25. The program product according to any of the claims 16, 17, 18, 19, 20, 21, 22, 23 or 24, characterized in that the length of one collection is not limited.
26. The program product according to any of the claims 16, 19, 20, 21 or 22, characterized in that: causing a horizontal movement with said input device corresponds to first predefined orientation control instructions; and causing a vertical movement with said input device corresponds to second predefined orientation control instructions.
27. The program product according to any of the claims 16, 19, 20, 21, 23 or 26, characterized in that adjusting the speed of said horizontal or vertical movements on said display in said first or said second operating mode on the basis of the acceleration factor of the movement on said input device.
28. The program product according to any of the claims 16, 19 or 20, characterized in that said selection of an item in said first or said second operating mode is indicated by tapping said input device.
29. The program product according to claim 20 or 21, characterized in that in said second operating mode: rejecting or approving information on said display by a third or fourth predefined orientation movement; and confirming the action by tapping said input device.
30. The program product according to any of the claims 16 - 29, characterized in that said input device is a touch pad.
31. A hand-held device comprising at least a housing, electronic circuitry located in said housing, a program product for executing user actions, said program product comprising separate collections of information and/or functions, and a display presenting said actions, characterized in that said device comprises: the user interface comprising a combined navigation and selection input device, wherein displaying said collections on said display; wherein when moving towards first predefined orientation on said surface, moving from one collection to another in said program product, wherein when moving towards second predefined orientation on said surface, moving within one collection of said program product, wherein while moving within one collection of said program product, changing the operating mode to a second operating mode when one collection and/or function is selected in said program product.
32. The hand-held device according to claim 31, characterized in that said collections are organized into a linear stack, the collection order remaining the same all the time.
33. The hand-held device according to claim 31 or 32, characterized in that only one collection is visible at a time on said display.
34. The hand-held device according to any of the claims 31, 32 or 33, characterized in that scrolling one collection vertically on said display when detecting second predefined orientation control instructions with said software means from said input device.
35. The hand-held device according to any of the claims 31, 32, 33 or 34, characterized in that when being in said second operating mode: forming a menu on the display, said menu being possibly wider than said display area; moving within said menu when moving towards first predefined orientation on said surface; and selecting a menu item of said menu with said input device.
36. The hand-held device according to claim 35, characterized in that when being in said second operating mode: scrolling one collection vertically on said display when moving towards second predefined orientation on said surface, said menu being visible all the time; and selecting a menu item with said input device, the selection becoming visible on the display.
37. The hand-held device according to any of the claims 31, 32, 33 or 34, characterized in that when being in said first operating mode: while moving vertically within one collection, moving from one collection to another when detecting first predefined orientation control instructions from said input device.
38. The hand-held device according to claim 35 or 36, characterized in that when being in said second operating mode, the method further comprises the steps of: displaying a key grid on said display; setting one of the keys as the active key; moving among key grid items based on the orientation control instructions of said input device; and selecting a key grid item by tapping said input device.
39. The hand-held device according to any of the claims 31, 32, 33, 34, 35, 36, 37 or 38, characterized in that the width of one collection is determined by the width of said display.
40. The hand-held device according to any of the claims 31, 32, 33, 34, 35, 36, 37, 38 or 39, characterized in that the length of one collection is not limited.
41. The hand-held device according to any of the claims 31, 34, 35, 36 or 37, characterized in that: causing a horizontal movement with said input device corresponds to first predefined orientation control instructions; and causing a vertical movement with said input device corresponds to second predefined orientation control instructions.
42. The hand-held device according to any of the claims 31, 35, 36, 37, 38 or 41, characterized in that adjusting the speed of said horizontal or vertical movements on said display in said first or said second operating mode on the basis of the acceleration factor of the movement on said input device.
43. The hand-held device according to any of the claims 31, 35 or 36, characterized in that said selection of an item in said first or said second operating mode is indicated by tapping said surface.
44. The hand-held device according to claim 35 or 36, characterized in that in said second operating mode: rejecting or approving information on said display by a third or fourth predefined orientation movement on said surface; and confirming the action by tapping said surface.
45. The hand-held device according to any of the claims 31 - 44, characterized in that said input device is a touch pad.
46. The hand-held device according to any of the claims 31 - 45, characterized in that said device is a mobile phone.
47. The hand-held device according to any of the claims 31 - 46, characterized in that said device is a personal digital assistant.
PCT/FI2002/000285 2001-09-10 2002-04-03 Navigation method, program product and device for presenting information in a user interface WO2003023593A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
FI20011788A FI114175B (en) 2001-09-10 2001-09-10 Navigation Method, program product and apparatus for presenting information in the user interface
FI20011788 2001-09-10

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
EP20020712986 EP1425651A1 (en) 2001-09-10 2002-04-03 Navigation method, program product and device for presenting information in a user interface

Publications (1)

Publication Number Publication Date
WO2003023593A1 true true WO2003023593A1 (en) 2003-03-20

Family

ID=8561870

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/FI2002/000285 WO2003023593A1 (en) 2001-09-10 2002-04-03 Navigation method, program product and device for presenting information in a user interface

Country Status (3)

Country Link
EP (1) EP1425651A1 (en)
FI (1) FI114175B (en)
WO (1) WO2003023593A1 (en)

Cited By (38)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2005103867A1 (en) * 2004-04-22 2005-11-03 Denis Fompeyrine Multi-terminal control interface to manage information within user groups connected to each other in peer-to-peer networks
WO2006066742A1 (en) * 2004-12-21 2006-06-29 Daimlerchrysler Ag Control system for a vehicle
WO2007085689A1 (en) * 2006-01-20 2007-08-02 Head Inhimillinen Tekijä Oy User interface and a computer program product and a method for its implementation
WO2008030779A2 (en) * 2006-09-06 2008-03-13 Apple Inc. Portable electronic device for photo management
WO2009085779A1 (en) * 2007-12-27 2009-07-09 Apple Inc. Insertion marker placement on touch sensitive display
US7984384B2 (en) 2004-06-25 2011-07-19 Apple Inc. Web view layer for accessing user interface elements
US8042042B2 (en) * 2006-02-09 2011-10-18 Republic Of Korea Touch screen-based document editing device and method
US8120586B2 (en) 2007-05-15 2012-02-21 Htc Corporation Electronic devices with touch-sensitive navigational mechanisms, and associated methods
US8201109B2 (en) 2008-03-04 2012-06-12 Apple Inc. Methods and graphical user interfaces for editing on a portable multifunction device
US8255830B2 (en) 2009-03-16 2012-08-28 Apple Inc. Methods and graphical user interfaces for editing on a multifunction device with a touch screen display
US8302020B2 (en) 2004-06-25 2012-10-30 Apple Inc. Widget authoring and editing environment
CN102819398A (en) * 2012-08-08 2012-12-12 许继集团有限公司 Method for slidingly controlling camera via touch screen device
US8427445B2 (en) 2004-07-30 2013-04-23 Apple Inc. Visual expander
US8570278B2 (en) 2006-10-26 2013-10-29 Apple Inc. Portable multifunction device, method, and graphical user interface for adjusting an insertion point marker
US8661339B2 (en) 2011-05-31 2014-02-25 Apple Inc. Devices, methods, and graphical user interfaces for document manipulation
US8694026B2 (en) 2007-06-28 2014-04-08 Apple Inc. Location based services
US8698762B2 (en) 2010-01-06 2014-04-15 Apple Inc. Device, method, and graphical user interface for navigating and displaying content in context
US8869027B2 (en) 2006-08-04 2014-10-21 Apple Inc. Management and generation of dashboards
US8924144B2 (en) 2007-06-28 2014-12-30 Apple Inc. Location based tracking
US8930233B2 (en) 2000-06-07 2015-01-06 Apple Inc. System and method for anonymous location based services
US8954871B2 (en) 2007-07-18 2015-02-10 Apple Inc. User-centric widgets and dashboards
US8977294B2 (en) 2007-10-10 2015-03-10 Apple Inc. Securely locating a device
CN104408172A (en) * 2014-12-14 2015-03-11 王湘龙 Method for achieving quick intelligent navigation of mobile device list view based on word frequency statistic screening
US9032318B2 (en) 2005-10-27 2015-05-12 Apple Inc. Widget security
US9031581B1 (en) 2005-04-04 2015-05-12 X One, Inc. Apparatus and method for obtaining content on a cellular wireless device based on proximity to other wireless devices
US9066199B2 (en) 2007-06-28 2015-06-23 Apple Inc. Location-aware mobile device
US9104294B2 (en) 2005-10-27 2015-08-11 Apple Inc. Linked widgets
US9109904B2 (en) 2007-06-28 2015-08-18 Apple Inc. Integration of map services and user applications in a mobile device
US9131342B2 (en) 2007-06-28 2015-09-08 Apple Inc. Location-based categorical information services
US9250092B2 (en) 2008-05-12 2016-02-02 Apple Inc. Map service with network-based query for search
US9348511B2 (en) 2006-10-26 2016-05-24 Apple Inc. Method, system, and graphical user interface for positioning an insertion marker in a touch screen display
US9360993B2 (en) 2002-03-19 2016-06-07 Facebook, Inc. Display navigation
US9417888B2 (en) 2005-11-18 2016-08-16 Apple Inc. Management of user interface elements in a display environment
US9513930B2 (en) 2005-10-27 2016-12-06 Apple Inc. Workflow widgets
US9619132B2 (en) 2007-01-07 2017-04-11 Apple Inc. Device, method and graphical user interface for zooming in on a touch-screen display
US9702709B2 (en) 2007-06-28 2017-07-11 Apple Inc. Disfavored route progressions or locations
US9979776B2 (en) 2009-05-01 2018-05-22 Apple Inc. Remotely locating and commanding a mobile device
US10073584B2 (en) 2016-09-23 2018-09-11 Apple Inc. User interfaces for retrieving contextually relevant media content

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5835091A (en) * 1996-08-21 1998-11-10 International Business Machines Corporation Manipulating and displaying a plurality of views in a graphical user interface
US5995084A (en) 1997-01-17 1999-11-30 Tritech Microelectronics, Ltd. Touchpad pen-input and mouse controller
US6037937A (en) * 1997-12-04 2000-03-14 Nortel Networks Corporation Navigation tool for graphical user interface
GB2355142A (en) 1999-10-08 2001-04-11 Nokia Mobile Phones Ltd Portable device having menu driven input
EP1111878A2 (en) 1999-12-22 2001-06-27 Nokia Mobile Phones Ltd. Handheld devices

Cited By (98)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8930233B2 (en) 2000-06-07 2015-01-06 Apple Inc. System and method for anonymous location based services
US9753606B2 (en) 2002-03-19 2017-09-05 Facebook, Inc. Animated display navigation
US9886163B2 (en) 2002-03-19 2018-02-06 Facebook, Inc. Constrained display navigation
US9360993B2 (en) 2002-03-19 2016-06-07 Facebook, Inc. Display navigation
US9851864B2 (en) 2002-03-19 2017-12-26 Facebook, Inc. Constraining display in display navigation
US10055090B2 (en) 2002-03-19 2018-08-21 Facebook, Inc. Constraining display motion in display navigation
US9626073B2 (en) 2002-03-19 2017-04-18 Facebook, Inc. Display navigation
US9678621B2 (en) 2002-03-19 2017-06-13 Facebook, Inc. Constraining display motion in display navigation
WO2005103867A1 (en) * 2004-04-22 2005-11-03 Denis Fompeyrine Multi-terminal control interface to manage information within user groups connected to each other in peer-to-peer networks
US8291332B2 (en) 2004-06-25 2012-10-16 Apple Inc. Layer for accessing user interface elements
US8266538B2 (en) 2004-06-25 2012-09-11 Apple Inc. Remote access to layer and user interface elements
US7984384B2 (en) 2004-06-25 2011-07-19 Apple Inc. Web view layer for accessing user interface elements
US9753627B2 (en) 2004-06-25 2017-09-05 Apple Inc. Visual characteristics of user interface elements in a unified interest layer
US8464172B2 (en) 2004-06-25 2013-06-11 Apple Inc. Configuration bar for launching layer for accessing user interface elements
US9507503B2 (en) 2004-06-25 2016-11-29 Apple Inc. Remote access to layer and user interface elements
US8302020B2 (en) 2004-06-25 2012-10-30 Apple Inc. Widget authoring and editing environment
US8427445B2 (en) 2004-07-30 2013-04-23 Apple Inc. Visual expander
WO2006066742A1 (en) * 2004-12-21 2006-06-29 Daimlerchrysler Ag Control system for a vehicle
US9253616B1 (en) 2005-04-04 2016-02-02 X One, Inc. Apparatus and method for obtaining content on a cellular wireless device based on proximity
US9942705B1 (en) 2005-04-04 2018-04-10 X One, Inc. Location sharing group for services provision
US9031581B1 (en) 2005-04-04 2015-05-12 X One, Inc. Apparatus and method for obtaining content on a cellular wireless device based on proximity to other wireless devices
US9955298B1 (en) 2005-04-04 2018-04-24 X One, Inc. Methods, systems and apparatuses for the formation and tracking of location sharing groups
US9967704B1 (en) 2005-04-04 2018-05-08 X One, Inc. Location sharing group map management
US9854402B1 (en) 2005-04-04 2017-12-26 X One, Inc. Formation of wireless device location sharing group
US9736618B1 (en) 2005-04-04 2017-08-15 X One, Inc. Techniques for sharing relative position between mobile devices
US9854394B1 (en) 2005-04-04 2017-12-26 X One, Inc. Ad hoc location sharing group between first and second cellular wireless devices
US9584960B1 (en) 2005-04-04 2017-02-28 X One, Inc. Rendez vous management using mobile phones or other mobile devices
US9167558B2 (en) 2005-04-04 2015-10-20 X One, Inc. Methods and systems for sharing position data between subscribers involving multiple wireless providers
US9467832B2 (en) 2005-04-04 2016-10-11 X One, Inc. Methods and systems for temporarily sharing position data between mobile-device users
US9749790B1 (en) 2005-04-04 2017-08-29 X One, Inc. Rendez vous management using mobile phones or other mobile devices
US9615204B1 (en) 2005-04-04 2017-04-04 X One, Inc. Techniques for communication within closed groups of mobile devices
US9185522B1 (en) 2005-04-04 2015-11-10 X One, Inc. Apparatus and method to transmit content to a cellular wireless device based on proximity to other wireless devices
US9654921B1 (en) 2005-04-04 2017-05-16 X One, Inc. Techniques for sharing position data between first and second devices
US9032318B2 (en) 2005-10-27 2015-05-12 Apple Inc. Widget security
US9104294B2 (en) 2005-10-27 2015-08-11 Apple Inc. Linked widgets
US9513930B2 (en) 2005-10-27 2016-12-06 Apple Inc. Workflow widgets
US9417888B2 (en) 2005-11-18 2016-08-16 Apple Inc. Management of user interface elements in a display environment
WO2007085689A1 (en) * 2006-01-20 2007-08-02 Head Inhimillinen Tekijä Oy User interface and a computer program product and a method for its implementation
US8042042B2 (en) * 2006-02-09 2011-10-18 Republic Of Korea Touch screen-based document editing device and method
US8869027B2 (en) 2006-08-04 2014-10-21 Apple Inc. Management and generation of dashboards
EP2282275A1 (en) * 2006-09-06 2011-02-09 Apple Inc. Portable electronic device for photo management
US8106856B2 (en) 2006-09-06 2012-01-31 Apple Inc. Portable electronic device for photo management
EP2390799A1 (en) * 2006-09-06 2011-11-30 Apple Inc. Portable electronic device for photo management
WO2008030779A3 (en) * 2006-09-06 2008-06-26 Apple Inc Portable electronic device for photo management
WO2008030779A2 (en) * 2006-09-06 2008-03-13 Apple Inc. Portable electronic device for photo management
EP2390779A1 (en) * 2006-09-06 2011-11-30 Apple Inc. Portable electronic device for photo management
JP2013239193A (en) * 2006-09-06 2013-11-28 Apple Inc Image acquisition and management method, portable electronic devices, and storage medium
JP2015215913A (en) * 2006-09-06 2015-12-03 アップル インコーポレイテッド Portable electronic device for photo management
US8305355B2 (en) 2006-09-06 2012-11-06 Apple Inc. Portable electronic device for photo management
US9459792B2 (en) 2006-09-06 2016-10-04 Apple Inc. Portable electronic device for photo management
US9348511B2 (en) 2006-10-26 2016-05-24 Apple Inc. Method, system, and graphical user interface for positioning an insertion marker in a touch screen display
US9632695B2 (en) 2006-10-26 2017-04-25 Apple Inc. Portable multifunction device, method, and graphical user interface for adjusting an insertion point marker
US9207855B2 (en) 2006-10-26 2015-12-08 Apple Inc. Portable multifunction device, method, and graphical user interface for adjusting an insertion point marker
US8570278B2 (en) 2006-10-26 2013-10-29 Apple Inc. Portable multifunction device, method, and graphical user interface for adjusting an insertion point marker
US9619132B2 (en) 2007-01-07 2017-04-11 Apple Inc. Device, method and graphical user interface for zooming in on a touch-screen display
US8120586B2 (en) 2007-05-15 2012-02-21 Htc Corporation Electronic devices with touch-sensitive navigational mechanisms, and associated methods
US9310206B2 (en) 2007-06-28 2016-04-12 Apple Inc. Location based tracking
US9702709B2 (en) 2007-06-28 2017-07-11 Apple Inc. Disfavored route progressions or locations
US9066199B2 (en) 2007-06-28 2015-06-23 Apple Inc. Location-aware mobile device
US9414198B2 (en) 2007-06-28 2016-08-09 Apple Inc. Location-aware mobile device
US10064158B2 (en) 2007-06-28 2018-08-28 Apple Inc. Location aware mobile device
US9109904B2 (en) 2007-06-28 2015-08-18 Apple Inc. Integration of map services and user applications in a mobile device
US8694026B2 (en) 2007-06-28 2014-04-08 Apple Inc. Location based services
US9578621B2 (en) 2007-06-28 2017-02-21 Apple Inc. Location aware mobile device
US9891055B2 (en) 2007-06-28 2018-02-13 Apple Inc. Location based tracking
US8924144B2 (en) 2007-06-28 2014-12-30 Apple Inc. Location based tracking
US9131342B2 (en) 2007-06-28 2015-09-08 Apple Inc. Location-based categorical information services
US9483164B2 (en) 2007-07-18 2016-11-01 Apple Inc. User-centric widgets and dashboards
US8954871B2 (en) 2007-07-18 2015-02-10 Apple Inc. User-centric widgets and dashboards
US8977294B2 (en) 2007-10-10 2015-03-10 Apple Inc. Securely locating a device
USRE46864E1 (en) 2007-12-27 2018-05-22 Apple Inc. Insertion marker placement on touch sensitive display
WO2009085779A1 (en) * 2007-12-27 2009-07-09 Apple Inc. Insertion marker placement on touch sensitive display
US8698773B2 (en) 2007-12-27 2014-04-15 Apple Inc. Insertion marker placement on touch sensitive display
US8610671B2 (en) 2007-12-27 2013-12-17 Apple Inc. Insertion marker placement on touch sensitive display
US8201109B2 (en) 2008-03-04 2012-06-12 Apple Inc. Methods and graphical user interfaces for editing on a portable multifunction device
US9529524B2 (en) 2008-03-04 2016-12-27 Apple Inc. Methods and graphical user interfaces for editing on a portable multifunction device
US9702721B2 (en) 2008-05-12 2017-07-11 Apple Inc. Map service with network-based query for search
US9250092B2 (en) 2008-05-12 2016-02-02 Apple Inc. Map service with network-based query for search
US8370736B2 (en) 2009-03-16 2013-02-05 Apple Inc. Methods and graphical user interfaces for editing on a multifunction device with a touch screen display
US8661362B2 (en) 2009-03-16 2014-02-25 Apple Inc. Methods and graphical user interfaces for editing on a multifunction device with a touch screen display
US8510665B2 (en) 2009-03-16 2013-08-13 Apple Inc. Methods and graphical user interfaces for editing on a multifunction device with a touch screen display
US8756534B2 (en) 2009-03-16 2014-06-17 Apple Inc. Methods and graphical user interfaces for editing on a multifunction device with a touch screen display
US9846533B2 (en) 2009-03-16 2017-12-19 Apple Inc. Methods and graphical user interfaces for editing on a multifunction device with a touch screen display
US8584050B2 (en) 2009-03-16 2013-11-12 Apple Inc. Methods and graphical user interfaces for editing on a multifunction device with a touch screen display
US8255830B2 (en) 2009-03-16 2012-08-28 Apple Inc. Methods and graphical user interfaces for editing on a multifunction device with a touch screen display
US9875013B2 (en) 2009-03-16 2018-01-23 Apple Inc. Methods and graphical user interfaces for editing on a multifunction device with a touch screen display
US9979776B2 (en) 2009-05-01 2018-05-22 Apple Inc. Remotely locating and commanding a mobile device
US8698762B2 (en) 2010-01-06 2014-04-15 Apple Inc. Device, method, and graphical user interface for navigating and displaying content in context
US9857941B2 (en) 2010-01-06 2018-01-02 Apple Inc. Device, method, and graphical user interface for navigating and displaying content in context
US8677232B2 (en) 2011-05-31 2014-03-18 Apple Inc. Devices, methods, and graphical user interfaces for document manipulation
US9244605B2 (en) 2011-05-31 2016-01-26 Apple Inc. Devices, methods, and graphical user interfaces for document manipulation
US8719695B2 (en) 2011-05-31 2014-05-06 Apple Inc. Devices, methods, and graphical user interfaces for document manipulation
US8661339B2 (en) 2011-05-31 2014-02-25 Apple Inc. Devices, methods, and graphical user interfaces for document manipulation
US9092130B2 (en) 2011-05-31 2015-07-28 Apple Inc. Devices, methods, and graphical user interfaces for document manipulation
CN102819398A (en) * 2012-08-08 2012-12-12 许继集团有限公司 Method for slidingly controlling camera via touch screen device
CN104408172A (en) * 2014-12-14 2015-03-11 王湘龙 Method for achieving quick intelligent navigation of mobile device list view based on word frequency statistic screening
CN104408172B (en) * 2014-12-14 2017-10-24 王湘龙 Method for achieving quick intelligent navigation of mobile device list view based on word frequency statistic screening
US10073584B2 (en) 2016-09-23 2018-09-11 Apple Inc. User interfaces for retrieving contextually relevant media content

Also Published As

Publication number Publication date Type
EP1425651A1 (en) 2004-06-09 application
FI20011788A0 (en) 2001-09-10 application
FI20011788A (en) 2003-03-11 application
FI114175B1 (en) grant
FI114175B (en) 2004-08-31 application
FI20011788D0 (en) grant

Similar Documents

Publication Publication Date Title
US7190351B1 (en) System and method for data input
US6496182B1 (en) Method and system for providing touch-sensitive screens for the visually impaired
US6944472B1 (en) Cellular phone allowing a hand-written character to be entered on the back
US20060143574A1 (en) Display method, portable terminal device, and display program
EP1818786A1 (en) Navigation tool with audible feedback on a handheld communication device
US20050071761A1 (en) User interface on a portable electronic device
US20080042983A1 (en) User input device and method using fingerprint recognition sensor
US20070024646A1 (en) Portable electronic apparatus and associated method
US20070192027A1 (en) Navigation tool with audible feedback on a wireless handheld communication device
US7856605B2 (en) Method, system, and graphical user interface for positioning an insertion marker in a touch screen display
US20070120832A1 (en) Portable electronic apparatus and associated method
US20050129199A1 (en) Input device, mobile telephone, and mobile information device
US20080303795A1 (en) Haptic display for a handheld electronic device
US7602378B2 (en) Method, system, and graphical user interface for selecting a soft keyboard
US20040263479A1 (en) Active keyboard system for handheld electronic devices
EP0715441A1 (en) Roller bar menu access system and method for cellular telephones
US20040141011A1 (en) Graphical user interface features of a browser in a hand-held wireless communication device
US20070229476A1 (en) Apparatus and method for inputting character using touch screen in portable terminal
US20060033723A1 (en) Virtual keypad input device
US20080098331A1 (en) Portable Multifunction Device with Soft Keyboards
US8171417B2 (en) Method for switching user interface, electronic device and recording medium using the same
US7886233B2 (en) Electronic text input involving word completion functionality for predicting word candidates for partial word inputs
US6957397B1 (en) Navigating through a menu of a handheld computer using a keyboard
US20060007129A1 (en) Scroll wheel with character input
US20100214218A1 (en) Virtual mouse

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY CA CH CN CO CR CU CZ DE DK DM DZ EC EE ES FI GB GD GE GM HR HU ID IL IN IS JP KE KG KP KZ LC LK LR LS LT LU LV MA MD MK MN MW MX MZ NO NZ OM PH PT RO RU SD SE SG SI SK SL TJ TM TN TR TZ UA UG US UZ VN YU ZA ZM

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): GH GM KE LS MW MZ SD SL SZ UG ZM ZW AM AZ BY KG KZ RU TJ TM AT BE CH CY DE DK FI FR GB GR IE IT LU MC NL PT SE TR BF BJ CF CG CI CM GA GN GQ ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)
WWE Wipo information: entry into national phase

Ref document number: 2002712986

Country of ref document: EP

WWP Wipo information: published in national office

Ref document number: 2002712986

Country of ref document: EP

NENP Non-entry into the national phase in:

Ref country code: JP

WWW Wipo information: withdrawn in national office

Country of ref document: JP

DPE2 Request for preliminary examination filed before expiration of 19th month from priority date (pct application filed from 20040101)