AU7792600A - Operation method of user interface of hand-held device - Google Patents

Operation method of user interface of hand-held device

Info

Publication number
AU7792600A
Authority
AU
Australia
Prior art keywords
display
data
user interface
operation method
hand
Prior art date
Legal status
Abandoned
Application number
AU77926/00A
Inventor
Jukka-Pekka Metsavainio
Current Assignee
Myorigo Oy
Original Assignee
Myorigo Oy
Priority date
Filing date
Publication date
Priority claimed from FI992191A external-priority patent/FI19992191A/en
Application filed by Myorigo Oy filed Critical Myorigo Oy
Publication of AU7792600A publication Critical patent/AU7792600A/en

Links

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04842 Selection of displayed objects or displayed text elements
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00 Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16 Constructional details or arrangements
    • G06F1/1613 Constructional details or arrangements for portable computers
    • G06F1/1626 Constructional details or arrangements for portable computers with a single-body enclosure integrating a flat display, e.g. Personal Digital Assistants [PDAs]
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00 Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16 Constructional details or arrangements
    • G06F1/1613 Constructional details or arrangements for portable computers
    • G06F1/1633 Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
    • G06F1/1684 Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675
    • G06F1/1694 Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675 the I/O peripheral being a single or a set of motion sensors for pointer control or gesture input obtained by sensing movements of the portable computer
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0346 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F2200/00 Indexing scheme relating to G06F1/04 - G06F1/32
    • G06F2200/16 Indexing scheme relating to G06F1/16 - G06F1/18
    • G06F2200/163 Indexing scheme relating to constructional details of the computer
    • G06F2200/1637 Sensing arrangement for detection of housing movement or orientation, e.g. for controlling scrolling or cursor movement on the display of an handheld computer

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • Position Input By Displaying (AREA)
  • Calculators And Similar Devices (AREA)

Description

WO 01/27735 PCT/FI00/00871

Operation method of user interface of hand-held device

The invention is related to an operation method of the user interface of a hand-held data-processing device. A hand-held device means here a palm-top or pocket computer, mobile phone, communicator or like device, essential features of which include displaying data on the display of the device and in which data on the display of the device is changed for finding and selecting a desired data.

User interfaces of the above-mentioned devices are presently mostly of the "point and click" type. A general operation method of a user interface is browsing and selecting data by means of keys. Palm-top computers are usually managed with a touch-sensitive display, a virtual keyboard, a pen, and often some auxiliary hard keys or buttons. Firstly, user interfaces of this kind are difficult to use because of the small size of the devices and, correspondingly, of the keys or virtual keys. On the other hand, use becomes more difficult as the devices are continuously provided with new applications and services which are to be used with the same small quantity of small-sized keys. Characteristic of these user interfaces, like the user interfaces of computers and other data-processing devices, is generally also that they have their own artificial logics and rules which are fully adopted and known only by a few, normally technically orientated users.

An object of the invention is to present such an operation method of the user interface of hand-held devices which to a large extent removes many of the above-mentioned problems. To reach this object, the operation method of the user interface of a hand-held device, like a palm-top or pocket computer, mobile phone, communicator or the like, according to the invention, in which operation method data on the display of the device is changed for finding and selecting a desired data, is characterized by what is defined in claim 1.
Other claims define various embodiments of the invention. An advantage of the invention is that a user is able to easily adopt the operating rules of the user interface because they are "natural". The use of the device is also ergonomically easier because there is no need to use small-sized keys or auxiliary devices, for example.

The invention and some embodiments thereof are described in further detail in the following with reference to the accompanying drawings, in which:

Figs. 1 to 10 are schematic perspective views presenting examples of the operation method of the user interface of the invention;

Figs. 11 to 13 are schematic front views of a device and data presented on the display thereof and present some further examples of the operation method of the user interface of the invention;

Fig. 14 is a schematic presentation of an arrangement of data and operation of an application of the user interface of the invention;

Fig. 15 is a schematic perspective view of operation of a further application of the user interface of the invention; and

Fig. 16 is a flow chart presenting a possible realization of the operation method of the user interface according to the invention.

In Figs. 1 to 5, a user is holding a hand-held device 2 in his or her hand 3 and is using a device having a display 2'. The display 2' is a touch-sensitive display, and the user is able to give a signal to the device by pressing the display with his or her thumb 4. Movements of the device 2 are detected by an acceleration measurement circuit installed in the device itself, for example, and the data obtained from the circuit is utilized in the operation of the user interface. Some possible ways of realizing the operation method of the user interface according to the invention are considered in further detail below. Figs. 1 to 5 present examples of the operation of the user interface.

With reference to Fig. 1, pages 5a, 5b, 5c, 5d and 5e, which may be e.g. weather maps obtained as a result of a web search, are browsed on the display 2' in response to moving the device 2 essentially perpendicularly to the display 2' thereof, as is indicated by arrow F. For the user, it is easy to think that he or she has a pile of pictures 5a to 5e ahead of him or her and that, by moving in the pile, he or she is able to see the picture at which he or she is at each time. In the same way, successive (or piled) pages of a book or other document may be thought to be browsed. The speed of browsing may be made dependent on the quickness or intensity of the movement, i.e. on the magnitude of acceleration in the movement. The operation may also be arranged in such a way that a suitable small movement forward or backward and an immediate stop bring the next page in the corresponding direction onto the display. Furthermore, it may be arranged that a push of the touch-sensitive display 2' gives a signal to the display control means, in response to which signal changing of the data on the display is stopped. This kind of virtual use of depth significantly increases the capacity of a display having a restricted size.

With reference to Fig. 2, an image on the display 2' is zoomed larger, image 6b, or smaller, image 6a, in response to moving the device 2 in the same way as above, essentially perpendicularly to the display 2' thereof, as is indicated by arrow Z. This operation method of the user interface is fully analogous to human action when he or she wants to examine details of an object, e.g. an image, more closely. Having found a desired magnification, changing of the data on the display may be stopped again by a push of the display 2', for example. The functions of Figs. 1 and 2 may be used alternatively in a user interface according to the application, whereby there is no indistinctness of what is happening in response to moving the device in this way.
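The browsing behaviour of Fig. 1, where the magnitude of the acceleration sets the browsing speed, can be illustrated with a small sketch. The threshold, scale factor and function names below are illustrative assumptions, not values from the specification:

```python
def browse_step(accel: float, threshold: float = 0.5, scale: float = 2.0) -> int:
    """Map one acceleration sample along the display normal to a signed
    number of pages to browse: below the threshold nothing happens, and
    a quicker movement browses more pages, as described for Fig. 1."""
    if abs(accel) < threshold:
        return 0
    step = int((abs(accel) - threshold) * scale) + 1
    return step if accel > 0 else -step


def clamp_page(page: int, n_pages: int) -> int:
    """Keep the current page inside the pile 5a to 5e."""
    return max(0, min(n_pages - 1, page))
```

A gentle push leaves the display unchanged, while a quick one advances several pages; a push of the touch-sensitive display would then stop the browsing, as the text describes.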
The functions of Figs. 1 and 2 may also be combined in the operation of a user interface so that, for example, data is first browsed and then zoomed by means of the same movement. Then, it is advantageous to arrange a suitable selector for selecting one or the other of the functions. A selector of zooming may be a push of a certain area on the display, for example, whereby changing over from the zooming mode to the browsing mode may happen automatically on selection of a desired magnification ratio.

In Fig. 3, reference sign 7a indicates image data which is significantly larger than the display. The capacity of the display of a hand-held device is not large enough for large entities of factual or image data. In response to lateral movement of the device essentially in the plane of the display 2', as is indicated by arrows N, E, S and W, data 7b on the display is changed as if a window were moved above a larger image formed by the image data 7a. Having found a desired place, changing of data 7b on the display may again be stopped by a push of the display. It is logical to accomplish the functions of Figs. 2 and 3 together, whereby the image data may be examined more thoroughly after zooming by selecting the data in the way described by Fig. 3. Moreover, e.g. number selection may be realized in this way without keys by moving the device above a large virtual keyboard and selecting the desired numbers one by one. This function makes it possible to read full-length web pages with a palm-top computer, for example.

With reference to Fig. 4, in response to a quick rotational movement of the device 2 as if around an axis formed essentially by the edge 9 thereof, i.e. in response to swinging the device like turning the page of a book, as is indicated by arrow P, data 8a, 8b, 8c on the display 2' is changed correspondingly, e.g. a page of an electronic book is turned. The direction of the swing selects the direction of page turning.
The function may be arranged so that a small swing with a quick stop turns one page in the direction of the swing, and that, in response to a more intensive swing, pages are browsed more quickly until the browsing is stopped by a push of the display.

In Fig. 5, an application is illustrated in which any part 9b of image data 9a forming a panorama picture covering the whole sphere of the perceptual space (or, respectively, of image data forming a cylindrical picture) may be examined at a time. In the operation method of a user interface according to the invention, in response to moving the device 2 as if moving a picture on a corresponding spherical surface, or in response to changing the orientation of the device, that part 9b of the image data 9a which corresponds to the orientation of the display is obtained on the display 2'. At the same time, stopping and zooming of the image may be applied in the way described above. It may be contemplated also that a panorama picture is rotated by giving a push with a movement of the device in a desired direction, whereby the panorama picture is correspondingly kept moving until the moving is stopped by a push of the display, for example.

Figs. 6 to 10 present embodiments of the user interface of the invention in which data on the display is changed by tilting the device. In Fig. 6, an example is presented in which data, illustrated by data D1, on the display 2' of the device 2 is scrolled in the way indicated by arrow R1 in the direction to which the device is tilted in the way indicated by arrow T1. The initial position of the device is indicated by a broken line and the tilted position by a solid line. In other words, in response to tilting the device, the data on the display is rolling or running in the direction to which the device is tilted, which is fully analogous to cause-consequence relationships of the real world.
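One way to realize this tilt-to-scroll rule of Fig. 6, sketched here with an assumed dead zone and gain (the specification fixes neither), is to make the scrolling velocity grow with the tilt angle:

```python
def scroll_velocity(tilt_deg: float, dead_zone: float = 3.0,
                    gain: float = 10.0) -> float:
    """Map a tilt angle in degrees to a scrolling velocity (pixels/s).
    Inside the dead zone the data stays still; beyond it, data rolls
    toward the tilt direction, faster the further the device is tilted."""
    if abs(tilt_deg) <= dead_zone:
        return 0.0
    magnitude = (abs(tilt_deg) - dead_zone) * gain
    return magnitude if tilt_deg > 0 else -magnitude
```

Returning the device to the initial position brings the angle back inside the dead zone, which stops the rolling.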
This may be realized in such a way, for example, that the changing velocity is the greater the more the device is or has been tilted from the initial position. Rolling or running of the data may be stopped by returning the device to the initial position or by means of a key on the device, for example.

Fig. 7 presents an embodiment suitable for moving and searching objects on a map, for example. On the display 2', a cursor C is placed which keeps its place thereon, and tilting the device in the way indicated by arrow T2 causes data, illustrated by data D2, to start moving from the tilting direction towards the cursor C in the way indicated by arrow M2. In other words, the cursor is moving on the data forming a map, for example, in the direction to which the device is tilted.

In the example of Fig. 8, data on the display 2' of the device 2 is illustrated by data D3. As the user is tilting the device 2 from the position indicated by the broken line away from himself or herself, like going closer to the data on the display, the data on the display is enlarged, which is illustrated by a graph indicating the enlarging of data D3 and by arrow Z2. In the example of Fig. 9, data D4 on the display is correspondingly reduced, as is indicated by a graph illustrating the reduction and drawing away and by arrow Z3, by tilting the device towards the user in the way indicated by arrow T4.

The embodiments of Figs. 6, 7, 8 and 9 may be combined in the user interface of a device in such a way, for example, that there is a key on the device for selecting the operation mode (scrolling or zooming), whereby, first, an object is searched by tilting in the scrolling mode, for example, and, having found the object, a change over to the zooming mode is made, whereby tilting results in zooming the data on the display. Scrolling by tilting may, of course, be combined with zooming in the way described with reference to Fig. 2.

Fig. 10 presents schematically an embodiment of the user interface of the invention for selecting data objects on the display. Data objects D5 to D10 may be alternatives in a menu or selection buttons on a page, for example. In the initial position of the device 2, indicated by the broken line, data object D5 is selected, which is indicated by hatching in the figure. On tilting the device in the way indicated by arrow T5, the selection is moved in the direction of tilting, as is indicated by arrow M3, and is here moved to data object D9. Having selected the desired data object, it may be locked by a key on the device or by a return movement of the device, for example. For the sake of simplicity, tilting the device in only one direction and selection from data objects located one below the other are presented here, but in the same way it is, of course, possible to move between selectable objects also laterally.

Fig. 11 presents a further example of the embodiment described with reference to Fig. 3. Wide data D11 consists of objects which are described by letters arranged to form a matrix. By moving the device in the way indicated by arrows N, E, S and W, one is as if moving above the data and looking at it through a window formed by the display 2'. When the desired data, here letter q, is found, it may be selected by means of a key on the device, for example. A comparable function may be realized also so that, instead of moving the device laterally, it is tilted in the direction to which one desires to move on the data.

Fig. 12 presents a further embodiment of the user interface of the invention for selecting an object on the display. On the display 2' of the device 2 there are objects O1 to O9 from which the selection is made. In the initial state, at left, the cursor is close to the upper right corner of the display. In this embodiment, cursor C is as if anchored in place in the real world.
When one desires to select object O7, the device 2 is moved laterally, as if moving a picture under the cursor, so that the desired object O7 comes under the cursor C, at right in the figure. There are keys 21, 22 and 23 on the device, the operation of which may correspond to the operation of the buttons of a mouse. In fact, the embodiments of Figs. 10 to 12 are solutions which in hand-held devices replace solutions, like a mouse, which in conventional computers are needed for moving, moving a cursor or making selections on the display.

Fig. 13 presents an embodiment in which data objects, pages, cards or the like, C1, ..., Ci, ..., Cn, are arranged in a stack or one above the other on the display 2' of the device, like e.g. pages of a book in the real world. The data objects may be cards presenting articles of commerce or web pages, for example. The objects are browsed in the way indicated by arrow M4, forward and backward, by tilting the device correspondingly either forward in the way indicated by arrow T6 or backward in the way indicated by arrow T7. By means of keys 24 to 26, objects may be selected and the files, programs, etc. behind them be opened. There may be several modes in the user interface of a device, like the browsing, scrolling, selecting and zooming described above, and e.g. key 24 may be for selection of mode and the other keys 25, 26 and 27 may operate like the buttons of a conventional mouse.

Fig. 14 presents an application of the user interface of the invention in which data objects are arranged in radial stacks S11 to S18 extending outwards from the circle in the centre of which a device 2 and a user are. The first page of each stack, e.g. P12 or P16, defines the data therebehind. Front pages may be browsed by turning the device laterally in the way indicated by arrow T8, for example, so that direction A12 and front page P12 are selected.
The stack of data objects behind it may be browsed either by moving or by tilting the device 2 forward and backward in the way indicated by arrow F1. In this way, e.g. a user interface for electronic shopping may be arranged, making perceiving the places of the products and orientation in a virtual shop easier.

Fig. 15 presents a further developed application of the embodiment described with reference to Fig. 5. Circles R1 and R2 now present image data forming a panorama picture or, in general, a 3D picture covering the whole sphere of the perception space. The image data may also be image data obtained from a video camera connected to the device. Three-dimensional image data D12 describing a thing, for example, may be brought into this image data and set in a desired place and a desired position in "the real surroundings". In response to moving the device 2 in the way described with reference to Fig. 5, e.g. as indicated by arrow T9, both the background data and the 3D data D12', D12'' describing the thing are changed in accordance with the positions of the device to correspond to viewing directions A1 and A2, so that the thing may be examined from different directions in "the real surroundings" thereof. This may be applied e.g. by searching for the 3D model of a sofa from a web shop and by placing it in a desired place in image data presenting one's house, whereby it may be examined from different directions to see how the sofa looks in the real surroundings thereof.

A solution according to the invention may be realized by providing a hand-held device with a multi-axial accelerometer, for example, and with suitable circuits and programs co-operating with the operating system of the device, and possibly application programs, for processing and interpreting measurement results so that a change of data on the display corresponding to a movement detected by the accelerometer is carried out. E.g. the realization with measurement of accelerations is based on the application of technical solutions known as such, and a person skilled in the art, having been provided with instructions and specifications, is able to realize the operation of the user interface according to the invention with reasonable effort.

In small-sized data-processing devices, like palm-top computers, with prior art hardware technology it is not possible to realize operating systems and applications with a capacity and usability which were even close to the level reached by larger data-processing devices, like desktop or laptop computers. It is also impossible to include conventional storage means, like hard disk, floppy disk or CD-ROM drives, in hand-held devices. A solution to these problems may be a two-part data-processing device wherein in a hand-held part there is only a part of the necessary circuits and programs in addition to a display. Most of the circuits and programs are in another portable part held by the user, a wireless link connecting this part to the hand-held part. The wireless link may operate with IR or radio frequencies. In this kind of device, it is easy to realize at the same time a system which detects the movement or position of the hand-held part in relation to the portable part. The operation method of the user interface according to the invention may then be realized by means of this system.

In the following, a further embodiment of the invention based on detecting accelerations is considered with reference to Fig. 16. The device includes a multi-axial accelerometer and the necessary circuits and programs for measurements. The device being switched on, it is monitoring accelerations and in phase 11 is interpreting that an acceleration above a certain threshold is possibly an initial stage of a movement defined in the operation method of the user interface. In response to this, timers TD1 and TD2 are started in phase 12.
TD1 sets a very short, experimentally determined time of the order of milliseconds from the detected start of the movement, at which the actual direction and magnitude of the acceleration specify the movement which the user gives to the device. TD2 sets a longer time, the expiring of which stops changing data on the display if no other cause for stopping has appeared. At the time of expiration of time TD1, the prevailing acceleration vector is measured in phase 13. In phase 14, it is examined whether the detected vector is a defined vector, i.e. whether it corresponds to any movement defined in the operation method of the user interface. If not, the operation returns back to the beginning 10 and phase 11 to monitor accelerations of the device. On the other hand, if the vector is a defined vector, data on the display is changed in phase 15, e.g. is browsed, zoomed, etc., according to a corresponding algorithm. As is described above, e.g. determination of the browsing speed of data on the display on the basis of the magnitude of an acceleration vector may be related to this. In phase 16, it is also monitored at the same time whether time TD2 has expired. If so, the operation is forwarded to phase 19, in which changing of data on the display is stopped, and the procedure is finished in phase 20. If time TD2 has not expired, it is monitored in phase 17 whether a vector opposite to the detected defined vector, or any other stopping signal, like a push of the display, appears. If an opposite vector or a stopping signal is detected in phase 18, changing data on the display is stopped in phase 19 and the procedure is finished. If no opposite vector or other stopping signal is detected in phase 18, the operation returns back to monitor the above-mentioned issues.

The operation method of the user interface of the invention may also be realized in other ways than by utilizing acceleration measurements.
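Before turning to those other ways, the acceleration-based procedure of Fig. 16 described above can be sketched as a small offline simulation. The threshold, the timer values and the table of defined vectors are illustrative assumptions; the specification only requires that TD1 be very short and TD2 longer. A push of the display, a further stop signal of phase 17, is omitted here for brevity:

```python
THRESHOLD = 0.8   # phase 11: smaller accelerations are ignored (assumed)
TD1 = 0.015       # seconds: moment at which the defining vector is sampled
TD2 = 1.0         # seconds: automatic stop if no other stop cause appears

DEFINED = {(0, 0, 1): "browse_forward", (0, 0, -1): "browse_back"}


def dominant_axis(v):
    """Reduce a raw acceleration vector to its dominant signed axis, a
    simple stand-in for the patent's 'defined vector' comparison."""
    i = max(range(3), key=lambda k: abs(v[k]))
    unit = [0, 0, 0]
    unit[i] = 1 if v[i] > 0 else -1
    return tuple(unit)


def run_gesture(samples):
    """samples: list of (t_seconds, (ax, ay, az)) pairs in time order.
    Returns the list of display-change actions taken (phase 15), empty
    if no defined movement was recognized (phase 14)."""
    actions = []
    # Phase 11: the first acceleration above the threshold starts the timers.
    start = next((t for t, v in samples if max(map(abs, v)) > THRESHOLD), None)
    if start is None:
        return actions
    # Phase 13: the vector prevailing at start + TD1 defines the movement.
    at_td1 = next((v for t, v in samples if t >= start + TD1), None)
    if at_td1 is None:
        return actions
    move = DEFINED.get(dominant_axis(at_td1))
    if move is None:
        return actions  # phase 14: not a defined vector, back to monitoring
    opposite = tuple(-c for c in dominant_axis(at_td1))
    # Phases 15 to 18: change the display until a stop condition appears.
    for t, v in samples:
        if t <= start + TD1:
            continue
        if t - start >= TD2:  # phase 16: time TD2 expired
            break
        if max(map(abs, v)) > THRESHOLD and dominant_axis(v) == opposite:
            break             # phases 17 and 18: opposite vector detected
        actions.append(move)
    return actions
```

An on-device realization would run the same logic incrementally on live sensor samples instead of a recorded list.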
Any technique by which changes of the place and position of the device may be measured may be applied. A solution which may be contemplated is a technique in which the place and orientation of the device are detected by sending a pulsed DC magnetic field and measuring it with a detector in the device. With this technique, both the place and position of the device in three-dimensional space may be fully detected.

A possible solution is also the technique on which the so-called optical mouse is based, in which a movement and the speed thereof are detected by observing and analyzing any surface in the vicinity, which in this case could be a surface on the user, for example. By combining this with distance measurement, which is easy to realize, and by analyzing changes of distance and relative movement, changes of the place, movement and position of a device may be detected.

The invention may, of course, be realized in various ways. It may be contemplated, for example, that for making detection of a defined movement more reliable, the acceleration is measured many times within a short time window, and a defined movement is detected if any one of the measured vectors meets the requirements for detection. Naturally, a more complicated analysis of a movement based on successive measured acceleration vectors may also be contemplated. A movement of a device may be detected also with an arrangement, for example, in which distances are measured between transmitters and receivers placed in the device, on one hand, and on the user, on the other hand.

A user interface according to the invention may be realized, according to the use and necessary features of a hand-held device, as a suitable combination of the embodiments presented above, the user interface including several operation modes in which the same movement of the device may correspond to different changes of data on the display.
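The multiple-measurement refinement mentioned above (a defined movement is detected if any one of several vectors sampled within a short window meets the requirements) can be sketched as follows; the tolerance-based matching rule is an assumption for the sketch:

```python
def matches(vector, target, tol=0.3):
    """True if each component of the measured vector is within the
    tolerance of the corresponding component of the defined vector."""
    return all(abs(a - b) <= tol for a, b in zip(vector, target))


def detected_in_window(window, target, tol=0.3):
    """Detect a defined movement if ANY vector measured inside the
    short time window meets the matching requirement."""
    return any(matches(v, target, tol) for v in window)
```

A single noisy sample then no longer prevents detection, since one clean sample inside the window suffices.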
As is stated above in relation to some embodiments, there may be a key, for example, on the device for selecting different operation modes.

The invention may be varied within the scope of the accompanying claims.
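As a final illustration, the world-anchored cursor of Fig. 12 described above amounts to shifting the content offset opposite to the device's lateral movement, so that the picture slides under a display-fixed cursor. Coordinate units and the hit radius are assumptions for the sketch:

```python
class AnchoredCursor:
    """Fig. 12 sketch: the cursor is 'anchored in the real world', so a
    lateral device movement shifts the displayed content the opposite
    way and the content point under the cursor follows the device."""

    def __init__(self) -> None:
        self.offset_x = 0.0  # content offset on the display
        self.offset_y = 0.0

    def device_moved(self, dx: float, dy: float) -> None:
        # The content appears to slide opposite to the device movement.
        self.offset_x -= dx
        self.offset_y -= dy

    def object_under_cursor(self, objects, cursor_x=0.0, cursor_y=0.0,
                            radius=10.0):
        """Return the name of the object nearest the display-fixed
        cursor in content coordinates, or None if nothing is close."""
        cx = cursor_x - self.offset_x
        cy = cursor_y - self.offset_y
        dist, name = min((abs(ox - cx) + abs(oy - cy), name)
                         for name, (ox, oy) in objects.items())
        return name if dist <= radius else None
```

Moving the device toward object O7 brings O7 under the cursor, after which a key press (keys 21 to 23) could make the selection.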

Claims (19)

1. An operation method of the user interface of a hand-held device, like a palm-top or pocket computer, mobile phone, communicator or the like, in which operation method data on the display of the device is changed for finding and selecting a desired data, characterized in that: data (5a - 5e; 6a, 6b; 7b; 8a - 8c; 9b; D1; D2; D3; D4; D5 - D10; D11; C1 - Cn; S11 - S18) on the display (2') of the device is changed in response to defined movements of the device (F; Z; N, E, S, W; P; T1, T2, T3, T4, T5; T6, T7; T8, F1; Fig. 5) in the three-dimensional space of use thereof; and said movements of the device and the corresponding changes of data on the display are defined so that a movement and a corresponding change of data on the display have a cause-consequence relationship which is analogous to cause-consequence relationships of the three-dimensional space of perception and action of a user (1).
2. An operation method of the user interface of a hand-held device according to claim 1, characterized in that in response to a movement (F; T6, T7) of the device perpendicular to the display (2') thereof, data objects (5a, 5b, 5c, 5d, 5e; S1, C1, ..., Ci, ..., Cn) are browsed on the display, dependent on the direction of the movement, either forward or backward, respectively.
3. An operation method of the user interface of a hand-held device according to claim 1, characterized in that in response to moving (F) the device forward or backward perpendicularly to the display (2') thereof, data objects (5a, 5b, 5c, 5d, 5e) are browsed on the display, dependent on the direction of the movement, either forward or backward, respectively.
4. An operation method of the user interface of a hand-held device according to claim 1, characterized in that in response to tilting (T6, T7) the device (2), data objects (S1, C1, ..., Ci, ..., Cn) placed one above the other are browsed (M4) on the display (2') in a direction corresponding to the direction of tilting.
5. An operation method of the user interface of a hand-held device according to claim 1, characterized in that in response to moving (Z) the device (2) perpendicularly to the display (2') thereof towards the front of the display, data (6a) on the display is zoomed larger (6b).
6. An operation method of the user interface of a hand-held device according to claim 1, characterized in that in response to moving (Z) the device (2) perpendicularly to the display (2') thereof towards the back of the display, data on the display is zoomed smaller.
7. An operation method of the user interface of a hand-held device according to claim 1, characterized in that in response to moving (N, E, S, W) the device (2) essentially in the direction of the plane of the display (2') thereof, it is moved correspondingly (7b) in displayed data (7a) which forms an image larger than the display.
8. An operation method of the user interface of a hand-held device according to claim 1, characterized in that in response to moving (N, E, S, W) the device (2) essentially in the direction of the plane of the display (2') thereof data (q) is selected from data (D11) larger than the display.
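Claims 7-8 move a viewport correspondingly over data larger than the display when the device is moved in the plane of the display. A sketch under the assumption of pixel coordinates with the origin at the top-left; all names are illustrative:

```python
def pan_viewport(viewport_xy, device_dxdy, image_size, viewport_size):
    """Slide the visible viewport over an image larger than the display.

    The viewport moves in the same direction as the device and is clamped
    so it never leaves the image.
    """
    vx, vy = viewport_xy
    dx, dy = device_dxdy
    max_x = image_size[0] - viewport_size[0]
    max_y = image_size[1] - viewport_size[1]
    return (max(0, min(max_x, vx + dx)),
            max(0, min(max_y, vy + dy)))
```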
9. An operation method of the user interface of a hand-held device according to claim 1, characterized in that a cursor (C) is placed on the display (2') and in response to moving (N, E, S, W) the device (2) essentially in the direction of the plane of the display (2') thereof, the cursor (C) is moved against the movement for selecting data (O1 - O9) on the display, as if by moving data on the display below a cursor keeping its place in the real world (Fig. 12).
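Claim 9 moves the cursor against the device movement, so the cursor appears to keep its place in the real world while the display window slides over the data. A one-line sketch under a hypothetical coordinate convention:

```python
def cursor_after_device_move(cursor_xy, device_dxdy):
    """On-display cursor position after the device moves in the display plane.

    The cursor moves opposite to the device, as if anchored in real space.
    """
    cx, cy = cursor_xy
    dx, dy = device_dxdy
    return (cx - dx, cy - dy)
```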
10. An operation method of the user interface of a hand-held device according to claim 1, characterized in that in response to swinging (P) the device (2) like turning a page, data (8a, 8b, 8c) on the display (2') is also changed in a way corresponding to turning a page.
11. An operation method of the user interface of a hand-held device according to claim 1, characterized in that in response to tilting (T1) the device (2) data (D1) on the display (2') is rolled (M1) in the direction of tilting.
12. An operation method of the user interface of a hand-held device according to claim 1, characterized in that in response to tilting (T2) the device (2) data (D2) is moved on the display (2') from the direction of tilting towards a cursor (C).
13. An operation method of the user interface of a hand-held device according to claim 1, characterized in that in response to tilting (T3) the device (2) away from a user, data (D3) on the display (2') is enlarged (Z2), and/or in response to tilting (T4) the device (2) towards a user, data (D4) on the display (2') is reduced (Z3).
14. An operation method of the user interface of a hand-held device according to claim 1, characterized in that in response to tilting (T4) the device (2) selection of data (D5) on the display (2') is moved (M3, D9) in the direction of tilting.
15. An operation method of the user interface of a hand-held device according to claim 1, characterized in that data is arranged virtually in stacks (S11 - S18) on a circle surrounding a user (a device (2)), whereby selection of a stack (S12) is carried out by turning (T8) the device, and the stack (S12) is browsed by moving or tilting (F1) the device in the direction of the stack (S12) and away from it, respectively.
16. An operation method of the user interface of a hand-held device according to claim 1, characterized in that in an application (Fig. 5), in which the displayed data is a desired part of a panorama picture (9a), in response to turning the device (2) the displayed data (9b) is changed so that the change of the viewing direction and the displayed part of the panorama picture (9a) correspond to the turning.
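Claim 16 changes the displayed part of a panorama picture to match the direction in which the device is turned. A minimal sketch mapping a heading in degrees to the corresponding pixel column of a 360-degree panorama, assuming a simple linear projection; the names are illustrative:

```python
def heading_to_column(heading_deg, panorama_width_px):
    """Pixel column of a 360-degree panorama facing the given heading.

    Turning the device changes the heading, which scrolls the visible part
    of the panorama so the viewing direction and image stay in agreement.
    """
    return int((heading_deg % 360) / 360.0 * panorama_width_px)
```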
17. An operation method of the user interface of a hand-held device according to claim 1, characterized in that in an application (Fig. 15), in which the displayed data is a desired part of a panorama picture or 3D image data (R1, R2), 3D data (D12) describing a thing is placed in the image data, and in response to turning the device (2) both the panorama picture or 3D image data (R1, R2) and the 3D data (D12', D12") are changed on the display (2') to correspond to the viewing direction (A1, A2) corresponding to the orientation of the device (2).
18. An operation method of the user interface of a hand-held device according to claim 1, characterized in that in response to stopping or significant slowing (18) of a movement, changing of data on the display is stopped.
19. An operation method of the user interface of a hand-held device according to claim 1, characterized in that changing of data on the display is stopped by giving a signal to the device in a way other than by means of a movement of the device.
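Claims 18-19 freeze the changing of data on the display when the movement stops or slows significantly, or when the user gives a separate stop signal. A threshold-based sketch; the threshold value and units are arbitrary illustrations:

```python
def display_frozen(speed_mm_s, stop_signal=False, threshold_mm_s=5.0):
    """Decide whether changing of data on the display should be stopped.

    True when the movement has stopped or slowed below the threshold
    (claim 18), or when a separate non-movement signal is given (claim 19).
    """
    return stop_signal or speed_mm_s < threshold_mm_s
```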
AU77926/00A 1999-10-12 2000-10-11 Operation method of user interface of hand-held device Abandoned AU7792600A (en)

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
FI19992191 1999-10-12
FI992191A FI19992191A (en) 1999-10-12 1999-10-12 Functional procedure for operating systems of handheld devices
FI20001506A FI20001506A (en) 1999-10-12 2000-06-22 Method of operation of the handheld device
FI20001506 2000-06-22
PCT/FI2000/000871 WO2001027735A1 (en) 1999-10-12 2000-10-11 Operation method of user interface of hand-held device

Publications (1)

Publication Number Publication Date
AU7792600A true AU7792600A (en) 2001-04-23

Family

ID=26160785

Family Applications (1)

Application Number Title Priority Date Filing Date
AU77926/00A Abandoned AU7792600A (en) 1999-10-12 2000-10-11 Operation method of user interface of hand-held device

Country Status (8)

Country Link
EP (1) EP1228422A1 (en)
JP (1) JP2003511786A (en)
CN (1) CN1167995C (en)
AU (1) AU7792600A (en)
FI (1) FI20001506A (en)
NO (1) NO20021647L (en)
RU (1) RU2242043C2 (en)
WO (1) WO2001027735A1 (en)

Families Citing this family (86)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002268622A (en) * 2001-03-09 2002-09-20 Denso Corp User interface device of portable terminal device
JP4530593B2 (en) * 2001-09-20 2010-08-25 シャープ株式会社 Image display system
FI115861B (en) 2001-11-12 2005-07-29 Myorigo Oy Method and apparatus for generating a response
FI20012209A (en) * 2001-11-14 2003-06-24 Nokia Corp Method for controlling display of information in an electronic device and electronic device
US7714880B2 (en) * 2001-11-16 2010-05-11 Honeywell International Inc. Method and apparatus for displaying images on a display
FI20020582A (en) 2002-03-26 2003-09-27 Nokia Oyj User interface of a portable telecommunications device
FI20021037A (en) 2002-05-31 2003-12-01 Nokia Corp Calendar system and procedure for providing a calendar view
TW200407025A (en) * 2002-08-27 2004-05-01 Vitec Co Ltd Pocket terminal device
AU2003292472A1 (en) * 2003-01-15 2004-08-10 Koninklijke Philips Electronics N.V. Handheld device with a display screen
US20120084693A1 (en) 2010-10-01 2012-04-05 Imerj LLC Modals in dual display communication devices
KR100415161B1 (en) * 2003-07-01 2004-01-13 (주)두모션 Hand held device with three dimensional viewing function by using tilting sensor and system for three-dimensionally displaying image using the same
US7246912B2 (en) 2003-10-03 2007-07-24 Nokia Corporation Electroluminescent lighting system
JP3791848B2 (en) * 2003-10-28 2006-06-28 松下電器産業株式会社 Image display apparatus, image display system, photographing apparatus, image display method, and program
US20050212760A1 (en) 2004-03-23 2005-09-29 Marvit David L Gesture based user interface supporting preexisting symbols
JP2005292893A (en) * 2004-03-31 2005-10-20 Nec Access Technica Ltd Portable information terminal device
CN1704871B (en) * 2004-05-31 2010-04-28 深圳市朗科科技股份有限公司 Portable digital devices and input method therefor
WO2005119431A1 (en) * 2004-06-04 2005-12-15 Philips Intellectual Property & Standards Gmbh A hand-held device for content navigation by a user
JP2006079312A (en) * 2004-09-09 2006-03-23 Matsushita Electric Ind Co Ltd Portable viewer
GB0422090D0 (en) * 2004-10-05 2004-11-03 Symbian Software Ltd An interactive computing device with a configurable user interface
JP2006113859A (en) * 2004-10-15 2006-04-27 Nec Corp Portable information terminal and display control method therefor
CN100361051C (en) * 2004-12-30 2008-01-09 集嘉通讯股份有限公司 Device and method for analyzing operation of movable product
KR101002807B1 (en) * 2005-02-23 2010-12-21 삼성전자주식회사 Apparatus and method for controlling menu navigation in a terminal capable of displaying menu screen
KR100811160B1 (en) 2005-06-02 2008-03-07 삼성전자주식회사 Electronic device for inputting command 3-dimensionally
WO2007007682A1 (en) * 2005-07-08 2007-01-18 Mitsubishi Electric Corporation Touch panel display device and portable apparatus
KR100651368B1 (en) * 2005-09-15 2006-11-29 삼성전자주식회사 Method for controlling image according to movement of wireless terminal
US7431216B2 (en) * 2005-11-16 2008-10-07 Sony Ericsson Mobile Communications Ab Methods for presenting parameter status information and related portable electronic devices and parameters
CN100429610C (en) * 2006-01-19 2008-10-29 宏达国际电子股份有限公司 Intuition type screen controller
US20070174416A1 (en) * 2006-01-20 2007-07-26 France Telecom Spatially articulable interface and associated method of controlling an application framework
KR100877829B1 (en) 2006-03-21 2009-01-12 엘지전자 주식회사 Terminal with scrolling function and scrolling method thereof
KR20090077755A (en) * 2006-09-09 2009-07-15 에프-오리진, 인크. Integrated pressure sensitive lens assembly
JP5158902B2 (en) * 2006-09-14 2013-03-06 シャープ株式会社 Electronic device, effective function selection method and program
JP4801623B2 (en) * 2006-09-14 2011-10-26 シャープ株式会社 Electronic device and method for selecting effective functions
US7889173B2 (en) * 2006-09-14 2011-02-15 Microsoft Corporation Defining user input fields on a portable media device
CN101330811B (en) * 2007-06-22 2010-12-08 鸿富锦精密工业(深圳)有限公司 Portable electronic device and operation method thereof
JP5088017B2 (en) * 2007-06-28 2012-12-05 ソニー株式会社 Image display apparatus, imaging apparatus, image display method, and program
JP5412812B2 (en) * 2007-12-07 2014-02-12 ソニー株式会社 Input device, control device, control system, and handheld device
DE102007059273A1 (en) * 2007-12-08 2009-06-18 T-Mobile Internationale Ag Virtual keyboard of a mobile device
US8217964B2 (en) * 2008-02-14 2012-07-10 Nokia Corporation Information presentation based on display screen orientation
JP2009265757A (en) * 2008-04-22 2009-11-12 Toshiba Corp Foldable portable terminal
WO2009141497A1 (en) * 2008-05-22 2009-11-26 Nokia Corporation Device and method for displaying and updating graphical objects according to movement of a device
JP2010092086A (en) * 2008-10-03 2010-04-22 Just Syst Corp User input apparatus, digital camera, input control method, and input control program
FI20080591A0 (en) * 2008-10-24 2008-10-24 Teknillinen Korkeakoulu Gesture-driven interface
US8645871B2 (en) * 2008-11-21 2014-02-04 Microsoft Corporation Tiltable user interface
US20100146460A1 (en) * 2008-12-10 2010-06-10 Sony Ericsson Mobile Communications Ab System and method for modifying a plurality of key input regions based on detected tilt and/or rate of tilt of an electronic device
JP5618486B2 (en) * 2009-01-30 2014-11-05 株式会社東芝 Portable information terminal
JP5357800B2 (en) * 2009-02-12 2013-12-04 キヤノン株式会社 Electronic device and control method thereof
JP5223784B2 (en) * 2009-06-05 2013-06-26 船井電機株式会社 Mobile terminal device
CN101943988B (en) * 2009-07-09 2013-04-24 深圳富泰宏精密工业有限公司 System and method for automatically adjusting user interface of electronic device
US9383916B2 (en) * 2009-09-30 2016-07-05 Microsoft Technology Licensing, Llc Dynamic image presentation
JP2011097441A (en) * 2009-10-30 2011-05-12 Sony Corp Information processing apparatus, image display method and computer program
WO2011073557A1 (en) * 2009-12-18 2011-06-23 France Telecom Method for restoring information on a screen of a terminal, and corresponding device, terminal, and computer program
US20110161889A1 (en) * 2009-12-30 2011-06-30 Motorola, Inc. User Interface for Electronic Devices
CN102129337A (en) * 2010-01-19 2011-07-20 腾讯科技(北京)有限公司 Method and device for controlling mobile terminal browser
US9977472B2 (en) * 2010-03-19 2018-05-22 Nokia Technologies Oy Method and apparatus for displaying relative motion of objects on graphical user interface
KR101680113B1 (en) * 2010-04-22 2016-11-29 삼성전자 주식회사 Method and apparatus for providing graphic user interface in mobile terminal
JP2011233064A (en) * 2010-04-30 2011-11-17 Sony Corp Information processor and display screen operation method
US9046992B2 (en) 2010-10-01 2015-06-02 Z124 Gesture controls for multi-screen user interface
JP2012123451A (en) * 2010-12-06 2012-06-28 Sony Corp Information processor, information processing system and information processing method
KR101740439B1 (en) * 2010-12-23 2017-05-26 엘지전자 주식회사 Mobile terminal and method for controlling thereof
KR101830962B1 (en) 2010-12-29 2018-02-22 삼성전자주식회사 Apparatus and method for controlling screen display in portable terminal
KR101766332B1 (en) * 2011-01-27 2017-08-08 삼성전자주식회사 3d mobile apparatus displaying a plurality of contents layers and display method thereof
KR101864333B1 (en) * 2011-03-21 2018-07-05 삼성전자 주식회사 Supporting Method For Icon Change Function And Portable Device thereof
JP5684621B2 (en) 2011-03-28 2015-03-18 京セラ株式会社 Electronic device, display control method, and display control program
US9182935B2 (en) 2011-09-27 2015-11-10 Z124 Secondary single screen mode activation through menu option
WO2013048288A2 (en) * 2011-09-30 2013-04-04 Miroshnichenko Vladimir Vitalievich Touch-sensitive panel
KR101969931B1 (en) * 2012-01-10 2019-04-17 삼성전자주식회사 Apparatus and method for controlling rotation of display image
JP6019601B2 (en) 2012-02-10 2016-11-02 ソニー株式会社 Information processing apparatus, information processing method, and program
JP2014029522A (en) 2012-07-06 2014-02-13 Funai Electric Co Ltd Electronic information terminal and display method therefor
US9791896B2 (en) * 2012-07-13 2017-10-17 Symbol Technologies, Llc Device and method for performing a functionality
JP5435110B2 (en) * 2012-11-29 2014-03-05 日本電気株式会社 Terminal device, display method, and program
CN103179273A (en) * 2013-03-11 2013-06-26 广东欧珀移动通信有限公司 Mobile phone operation method
JP5516794B2 (en) * 2013-05-13 2014-06-11 日本電気株式会社 Portable information terminal, display control method and program
KR102131358B1 (en) 2013-06-17 2020-07-07 삼성전자주식회사 User interface device and method of operation of user interface device
CN104407785B (en) * 2013-12-10 2018-06-15 贵阳朗玛信息技术股份有限公司 A kind of information displaying method and device
JP5639295B2 (en) * 2014-03-31 2014-12-10 グリー株式会社 User operation control program, portable device, and user operation control method
JP5640165B2 (en) * 2014-03-31 2014-12-10 グリー株式会社 User operation control program, portable device, and user operation control method
CN105288997B (en) * 2014-06-24 2019-08-06 腾讯科技(深圳)有限公司 Interactive method and apparatus are realized in chessboard interface
CN104216634A (en) * 2014-08-27 2014-12-17 小米科技有限责任公司 Method and device for displaying manuscript
CN104394312B (en) 2014-10-23 2017-08-22 小米科技有限责任公司 Filming control method and device
JP5801005B2 (en) * 2015-01-14 2015-10-28 京セラ株式会社 Electronic device, display control method, and display control program
JP5826415B2 (en) * 2015-01-14 2015-12-02 京セラ株式会社 Display control method and display control program
JP2015099602A (en) * 2015-01-14 2015-05-28 京セラ株式会社 Display control program
JP5801006B2 (en) * 2015-01-14 2015-10-28 京セラ株式会社 Electronic device, display control method, and display control program
RU2608148C1 (en) * 2015-08-20 2017-01-16 Общество с ограниченной ответственностью "1С ВИАРАБЛ" (ООО "1С ВИАРАБЛ") Method, device and system for data input and display on touch screen display
US9697393B2 (en) 2015-11-20 2017-07-04 Symbol Technologies, Llc Methods and systems for adjusting mobile-device operating parameters based on housing-support type
CN105807952B (en) * 2016-03-07 2020-01-31 联想(北京)有限公司 information processing method and electronic equipment

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
IL92220A (en) * 1989-11-06 1993-02-21 Ibm Israel Three-dimensional computer input device
JPH1049290A (en) * 1996-08-05 1998-02-20 Sony Corp Device and method for processing information
JP4149574B2 (en) * 1997-08-29 2008-09-10 ゼロックス コーポレイション User interface support device and information input method
SE516552C2 (en) * 1997-10-02 2002-01-29 Ericsson Telefon Ab L M Handheld display unit and method for displaying screens

Also Published As

Publication number Publication date
WO2001027735A1 (en) 2001-04-19
FI20001506A (en) 2001-04-13
NO20021647D0 (en) 2002-04-08
CN1167995C (en) 2004-09-22
NO20021647L (en) 2002-05-22
FI20001506A0 (en) 2000-06-22
EP1228422A1 (en) 2002-08-07
CN1379871A (en) 2002-11-13
JP2003511786A (en) 2003-03-25
RU2242043C2 (en) 2004-12-10

Similar Documents

Publication Publication Date Title
AU7792600A (en) Operation method of user interface of hand-held device
Harrison et al. Abracadabra: wireless, high-precision, and unpowered finger input for very small mobile devices
US10318017B2 (en) Viewing images with tilt control on a hand-held device
US9880640B2 (en) Multi-dimensional interface
US7271795B2 (en) Intuitive mobile device interface to virtual spaces
US6567101B1 (en) System and method utilizing motion input for manipulating a display of data
EP2364470B1 (en) Movement recognition as input mechanism
KR100671585B1 (en) Method and device for browsing information on a display
JP4093823B2 (en) View movement operation method
CN114327096A (en) Device, method and graphical user interface for displaying objects in 3D scenarios
JP2018185853A (en) System and method for interpreting physical interaction with graphical user interface
US20060164382A1 (en) Image manipulation in response to a movement of a display
KR20170138869A (en) Portable apparatus having a plurality of touch screens and control method thereof
JP2012514786A (en) User interface for mobile devices
KR20070006477A (en) Method for arranging contents menu variably and display device using the same
JP2003501762A (en) Movement detection and tracking system for controlling target viewer navigation and display
EP2245525A2 (en) Selecting a layout
CN1156747C (en) Information processing device
JP4912377B2 (en) Display device, display method, and program
van Tonder et al. Is tilt interaction better than keypad interaction for mobile map-based applications?
EP1028366A2 (en) Motion driven access to object viewers
JPH05274421A (en) Cursor controller