WO2003077108A2 - Procede de commande d'un appareil de communication et appareil de communication commande selon ce procede - Google Patents

Procede de commande d'un appareil de communication et appareil de communication commande selon ce procede Download PDF

Info

Publication number
WO2003077108A2
WO2003077108A2 (PCT/EP2003/050037)
Authority
WO
WIPO (PCT)
Prior art keywords
control object
control
display
objects
communication device
Prior art date
Application number
PCT/EP2003/050037
Other languages
German (de)
English (en)
Other versions
WO2003077108A3 (fr)
Inventor
Ulrich Leiner
Alexander Jarczyk
Original Assignee
Siemens Aktiengesellschaft
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Siemens Aktiengesellschaft filed Critical Siemens Aktiengesellschaft
Priority to AU2003219141A priority Critical patent/AU2003219141A1/en
Publication of WO2003077108A2 publication Critical patent/WO2003077108A2/fr
Publication of WO2003077108A3 publication Critical patent/WO2003077108A3/fr

Links

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/046Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by electromagnetic means
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/02Input arrangements using manually operated switches, e.g. using keyboards or dials
    • G06F3/023Arrangements for converting discrete items of information into a coded form, e.g. arrangements for interpreting keyboard generated codes as alphanumeric codes, operand codes or instruction codes
    • G06F3/0233Character input methods
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0482Interaction with lists of selectable items, e.g. menus
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04886Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/048Indexing scheme relating to G06F3/048
    • G06F2203/04805Virtual magnifying lens, i.e. window or frame movable on top of displayed information to enlarge it for better reading or selection

Definitions

  • the present invention relates to a method for controlling a communication device via a display or user interface, a communication device with a display for displaying a user interface and a program product for a communication device.
  • Communication devices, such as mobile radio devices or mobile telephones, tend to become smaller and smaller, which means that the display and input means, such as a keyboard, are also reduced in size.
  • PDAs (Personal Digital Assistants)
  • a method for controlling a communication device via a display, or a user interface shown on it, comprises the following steps: first, a plurality of control objects is shown on the display or user interface, the actuation of which enables a respective predetermined control process to be carried out, each control object having an actuating area with a respective first size.
  • the control process carried out by operating this control object can be the execution of a specific application or a specific program that is stored in the communication device.
  • a character (number, letter, symbol, etc.) assigned to a respective control object can be entered into the communication device or a memory thereof.
  • one of the plurality of control objects shown is selected and then the actuating area of the selected control object is expanded to a respective second size, which is larger than the first size of the selected control object.
  • expanding the area of the selected object not only improves its visual perceptibility (the object advantageously carries a description of the control process it can carry out); it also increases the actuation surface for a user who selects a specific control object with a pointing device, such as a mouse or a touch pen, in order to then actuate it.
  • selection means marking a specific control object; it does not yet cause the control operation assigned to the control object to be carried out.
  • actuation means such control of the control object that the control operation assigned to the selected control object is carried out.
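The distinction between selecting (marking and enlarging) and actuating a control object can be illustrated with a small model. This is only a sketch in Python; the class, the method names, and the growth factor are assumptions for illustration, not part of the patent:

```python
class ControlObject:
    """Model of a softkey whose actuating area grows when selected."""

    def __init__(self, label, width, height):
        self.label = label
        self.base_size = (width, height)   # first (unselected) size
        self.size = (width, height)
        self.selected = False

    def select(self, scale=2.0):
        """Mark the object and expand its actuating area.

        Selection alone does not carry out any control operation.
        """
        self.selected = True
        w, h = self.base_size
        self.size = (w * scale, h * scale)  # second, larger size

    def deselect(self):
        """Return the object to its first size."""
        self.selected = False
        self.size = self.base_size

    def actuate(self):
        """Carry out the control operation assigned to the object."""
        if not self.selected:
            raise RuntimeError("object must be selected before actuation")
        return "executed:" + self.label
```

For example, selecting a key labeled "Y" doubles its actuating area, and only a subsequent `actuate()` call triggers the assigned control process.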
  • the respective control objects are arranged adjacent to one another. In this way it is possible to accommodate a large number of control objects even on a small display or user interface.
  • the control objects are arranged in a predetermined area of the display and fill it. This means that the control objects need not cover the entire display or user interface shown, but they do share at least part of it.
  • the actuation area of this newly selected control object is expanded (to a second size), while the actuation area of the previously selected control object is reduced, in particular back to its respective first size.
  • This method step, namely reducing the size of the previously selected control object once the selection passes to a new control object, ensures that the display of the control objects requires only a small amount of space.
  • the respective control objects are arranged next to one another in a row, with a selected control object expanding the width of its actuating area in the direction of the arrangement of the control objects and/or the height of its actuating area perpendicular to that direction.
  • the term "row” does not have to mean that the individual control objects are arranged in a strictly straight line, but that the row can also have a certain curvature.
  • the terms width and height are used independently of the position or orientation of the control elements on the display; they serve only to define the expansion of the actuating surface of the selected control element in the direction of, or perpendicular to, the arrangement of the control objects.
  • according to a third variant, the row height of the selected control object is increased in order to expand its actuating area, while the row height of at least one further control object located in the column of the selected control object is reduced. Either only the row height of this at least one further control object, or the row height of the entire row assigned to it, can be reduced. According to this third variant, the selected control object and its actuating area are shown particularly large in comparison with the other, non-selected control objects, which improves both the visual perceptibility and the ease of actuation for the user.
  • according to a fourth variant, the column width of the selected control object is increased in order to expand its actuating area, while the column width of at least one further control object located in the row of the selected control object is reduced. Either only the column width of this at least one further control object, or the column width of the entire column assigned to it, can be reduced. This fourth variant likewise greatly enlarges the selected control object and its actuating area in comparison with the other, non-selected control objects, improving both the perceptibility and the actuating area for the user.
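The fourth variant, widening the selected object's column while narrowing the other columns so that the total row width stays constant, can be sketched as follows. The proportional redistribution rule and the growth factor are assumptions for illustration, not taken from the patent:

```python
def redistribute_widths(widths, selected, factor=2.0):
    """Widen column `selected` by `factor` and shrink the remaining
    columns proportionally so the total width is unchanged."""
    total = sum(widths)
    new_sel = widths[selected] * factor
    remaining = total - new_sel            # space left for the others
    rest = total - widths[selected]        # their combined old width
    return [new_sel if i == selected else w * remaining / rest
            for i, w in enumerate(widths)]
```

With four equal columns of width 10 and column 1 selected, the selected column grows to 20 while the other three shrink to one third of the leftover space each, so the row still measures 40 units.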
  • the arrangement-related or positional relationship of the individual control objects advantageously continues to exist after the selection of a specific control object. This means that the position of a certain control object, for example in a matrix arrangement, does not change compared to the other control objects, even if it is selected. Then only the size of its operating area or the display attribute changes, as will be explained below.
  • the selected control object or its actuation area can have a different color;
  • the selected control object or its actuation area can be labeled differently (for example in font and representation, such as bold, italic, or color formatting);
  • the selected control object or its actuating surface can flash periodically at a predetermined time interval.
  • a specific control object can be selected by means of an input device assigned to the communication device, such as the cursor block of a computer keyboard, and the selected control object can then be actuated, for example by means of the enter key.
  • a specialized input device in the form of a pointing device, such as a mouse, a trackball, a joystick, a control roller, etc., can be used to move a pointing object (such as an arrow) shown on the display or user interface in order to select any of the control objects.
  • an inductive touch input module can serve as the pointing device which, after integration into the display of the communication device, forms an inductive touch screen.
  • for this, an inductive pen is necessary, which together with the touch screen ultimately forms the actual pointing device.
  • Such an inductive touch screen can consist, for example, of one excitation coil and one pair of sensor coils per position direction (horizontal and vertical). All six coils (two excitation coils, each with two sensor coils) can be embedded in a printed circuit board (PCB: Printed Circuit Board), also referred to as the "sensor board" or sensor circuit board.
  • When the inductive touch screen is operated as an input or pointing device, an excitation coil generates a magnetic field with a defined frequency of, for example, 100 to 130 kHz for a short period of time (for example 100 µs). This time-varying magnetic field excites a resonant circuit (LC resonant circuit) in the inductive pen for the touch screen. After the excitation field generated by the excitation coil is switched off, the LC resonant circuit continues to oscillate due to its high quality factor and radiates a magnetic field at the frequency characteristic of the LC resonant circuit.
  • This radiated magnetic field in turn induces a voltage in the sensor coils.
  • the position of the pen can be determined via the ratio of the voltages generated in the respective sensor coil pairs.
  • the position is evaluated via a control element, in particular in the form of a processor. Because its functional principle stimulates an inductive stylus located above the touch screen to emit its own magnetic field, the inductive touch screen allows position determination not only in the horizontal and vertical directions (the X-Y plane of the touch screen) but also in the Z direction. From the voltages induced in the sensor coils, the inductive touch screen can therefore also draw conclusions about the distance of the inductive pen from the touch screen.
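A simplified numeric illustration of the ratio-based position evaluation described above. A real controller would use calibrated measurements and lookup tables; the linear model and signal names here are assumptions for illustration only:

```python
def pen_position(v_left, v_right, v_bottom, v_top, width, height):
    """Estimate the pen's X/Y position from the voltages induced in
    the two sensor-coil pairs (one pair per position direction)."""
    x = width * v_right / (v_left + v_right)   # horizontal ratio
    y = height * v_top / (v_bottom + v_top)    # vertical ratio
    # The overall amplitude drops as the pen is lifted, so it can
    # serve as a crude proxy for the Z distance.
    z_signal = v_left + v_right + v_bottom + v_top
    return x, y, z_signal
```

Equal voltages in a pair place the pen at the middle of that axis; a stronger right-hand voltage shifts the estimate to the right.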
  • when selecting a specific control object, that control object is selected above which a pointing object moved by a pointing device is currently located on the display. This means that a user does not have to take any further action, such as clicking a specific button, to make a selection; it is sufficient to arrange the pointing object above the actuating surface of a desired control object in order to select it.
  • tracking a pointing object on the display also enables a modification of the above-mentioned variants 1 to 4 for expanding the actuating area of a selected control object. In general, when the pointing object located in the currently selected control object moves in the direction of an adjacent (not yet selected) control object, the actuating surface of that neighboring control object can be expanded as the distance between the pointing object and the neighboring control object decreases.
  • the row height and/or column width of the adjacent control object can be enlarged (with increasing distance of the pointing object from the central position of the selected control object).
  • This enlargement of the adjacent control object advantageously takes place at the expense of the actuating surface of the selected control object, i.e. the actuating area of the selected control object is reduced by the same amount by which the actuating area of the adjacent control object is increased.
  • the row height of this at least one further control object can be expanded continuously as the distance between the pointing object and the at least one further control object decreases (or as the distance from the vertical center of the selected control object increases).
  • the column width of this at least one further control object can be expanded continuously as the distance of the pointing object from the at least one further control object decreases (or as the distance from the horizontal center of the selected control object increases).
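The continuous hand-over of actuating area between the selected object and a neighbour as the pointing object moves towards the neighbour can be modelled as a linear interpolation; the concrete sizes and the linear law are illustrative assumptions, not specified by the patent:

```python
def split_heights(expanded, base, offset, max_offset):
    """Linearly hand over height from the selected object (starting
    at `expanded`) to its neighbour (starting at `base`) as the
    pointer moves by `offset` towards the neighbour.

    The combined height stays constant: the neighbour gains exactly
    what the selected object loses.
    """
    t = max(0.0, min(1.0, offset / max_offset))  # clamp to [0, 1]
    gain = (expanded - base) * t
    return expanded - gain, base + gain
```

At the central position (offset 0) the selected object keeps its full expanded height; when the pointer reaches the boundary (offset equal to `max_offset`) the two objects have swapped sizes.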
  • a pointing object can be moved over the display by means of a pointing device as described above in order to select certain control objects. To actuate the selected control object, the user must then enter a specific actuation instruction via the pointing device, for example a double mouse click or a touch of a touch screen. It should be noted that, according to the method of the invention, the pointing object can be located in any section of the actuating surface of the selected control object in order to actuate it.
  • a communication device is created, by means of which the method described above or preferred configurations thereof are carried out.
  • a communication device has a display for displaying a plurality of control objects, the actuation of which enables a respective predetermined control process to be carried out, the control objects having an actuating surface with a respective first size.
  • the communication device comprises a control device for controlling the display, the control device being designed such that one of the plurality of control objects is selected and the actuating area of the selected control object is expanded to a second size that is larger than the first size.
  • the communication device comprises an input device connected to the control device for inputting control instructions in order to select the control objects on the display via the control device.
  • the communication device can be designed as a computer, in particular a portable computer, such as a PDA, organizer, etc.
  • the communication device can be designed as a mobile radio device or a cell phone, which operates in particular according to the GSM (Global System for Mobile Communication) standard or the UMTS (Universal Mobile Telecommunications System) standard.
  • a user interface for an operating system of a communication device, or else a user interface for an application program, through which control instructions, in particular characters (numbers, letters, symbols, etc.), can be entered.
  • the method according to the invention can thus serve as the basis for an application program which, for example, provides an input option in the form of a software-generated keyboard, by which a user of the communication device can easily enter characters by selecting and actuating the software-generated keys (softkeys).
  • a program product for a communication device which contains software code sections with which the method described above or advantageous refinements thereof can be executed on the communication device.
  • the program product can be executed by suitable implementation of the method in a programming language and translation into code that can be executed by the communication device.
  • the software code sections are stored for this.
  • a program product is understood to mean the program as a tradable product. It can be in any form, e.g. on paper, on a computer-readable disk, or distributed over a network.
  • Figure 1 is a schematic representation of a mobile phone as a, in this case portable, communication device to explain the present invention
  • FIG. 2 shows a schematic illustration of the essential components of a touch screen which can be used, for example, in the mobile telephone shown in FIG. 1;
  • Figure 3 is a view of a user interface for a communication device for entering characters according to a first embodiment
  • Figure 4 is a view of a portion of a user interface for a communication device according to a second embodiment
  • Figures 5 and 6 are respective views of a portion of a user interface for a communication device to show a modification of the second embodiment shown in Figure 4.
  • FIG. 1 shows a mobile telephone MFG which is intended to serve as a representation of a communication device, here a portable communication device, to explain the present invention.
  • the mobile phone MFG has a control device STE, which in particular has a microprocessor and a program memory.
  • program codes can be contained in the program memory, the processing of which by means of the control device STE realizes the present invention.
  • LCD (Liquid Crystal Display)
  • This display, which is represented by a frame with a thickened line, is in this case part of a so-called touch screen, through which software-generated control objects (softkeys) can be selected and actuated by touching the display with a stylus ST provided for this purpose.
  • software-generated control objects (softkeys)
  • FIG. 2 A detailed description of the operation of the DSP display will be given below with reference to FIG. 2.
  • a user interface is currently shown on the DSP display, on which an application program presents a softkey keyboard having softkey keys for the respective digits "0 to 9" and for the "*" and "#" symbols.
  • other "keyboards" can also be shown on the display DSP, as shown, for example, in FIGS. 3 to 6.
  • an arrow P is provided on the displayed user interface, which is moved by means of a pointing device (in this case consisting of the touch screen and the stylus ST provided for it). If any of the numbers or symbols shown is selected and actuated, it is written into a buffer ZWS connected to the control device STE and advantageously displayed, for checking, in the part of the display DSP not occupied by the control objects.
  • the mobile telephone MFG shown in FIG. 1 is depicted such that it has only a single physical button
  • control device STE and the intermediate memory ZWS are shown in dashed lines, since they are normally provided in the interior of the mobile telephone MFG and are not visible to the user.
  • the mobile telephone MFG also has a microphone MIK and a loudspeaker LS, which are used in conventional telephone calls (in the operating state in which voice signals are transmitted). Furthermore, the mobile telephone MFG has an antenna ANT connected to a radio module (not shown), which serves to transmit and receive voice or data via an air interface to and from a base station that is part of a communication network.
  • the mobile phone MFG and the communication network, including base station(s), can operate, for example, in accordance with the GSM standard or the UMTS standard. However, the invention is not limited to such standards.
  • a display DSP in the form of a touch screen is provided in this mobile phone shown in FIG. 1.
  • This touch screen is based on an inductive effect and is now to be shown schematically with reference to FIG. 2. It should be noted that for reasons of clarity, the mobile phone MFG and the actual display element (for example the LCD) have been omitted.
  • In assembled form of the touch screen DSP, the display element would be arranged above the sensor circuit board SLP shown in FIG. 2. The input device or pointing device shown in FIG. 2 essentially comprises three components: a pen ST with an inductance-capacitance circuit or LC element LCG, a sensor circuit board SLP with at least one excitation coil ES and at least one sensor coil SS1/SS2, and a control element PR for determining the position of the pen above the sensor circuit board SLP.
  • the SLP and PR components are part of the DSP touch screen.
  • the principle for detecting the pin position above the sensor circuit board SLP provided with the excitation and sensor coils essentially works as follows.
  • the control element PR causes a current pulse through the excitation coil ES.
  • This current pulse causes a change in the magnetic field through the coil ES, which in turn can be perceived by the coil of the LC element LCG in the pin.
  • This magnetic coupling between the excitation coil ES and the coil of the LC element LCG produces an inductive interaction between the two coils in accordance with the principle of a voltage transformer.
  • a magnetic alternating field with a defined frequency, for example in the range from 100 to 130 kHz, is generated in the excitation coil ES for a predetermined short period of time (for example on the order of 100 µs), by which the LC element provided in the pen ST is excited to a resonance oscillation (magnetic resonance).
  • the alternating magnetic field emitted by the pen or its LC element is detected by the sensor coils SS1 and SS2, and the position of the pen can be deduced, for example, from the different field strengths detected.
  • the inductive position determination makes it possible to determine not only the position of the pen ST in the plane of the sensor circuit board SLP (the X-Y plane), but also the height of the pen ST above the sensor circuit board SLP or the display element above it (the Z direction). In this way, the arrow P shown in FIG. 1 can be moved on the display by moving the pen ST in the X or Y direction, for example to select a specific one of the control objects shown (digits "0-9", "*" symbol, "#" symbol), as long as the stylus is at a certain (minimum) distance above the sensor circuit board SLP or the display element arranged above it.
  • if the pen ST is brought to a distance smaller than the predetermined minimum distance from the printed circuit board or the display element located above it, in particular such that it touches the display element, the control element registers an actuation as soon as the distance falls below the minimum distance or there is contact with the display element.
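The two-stage interpretation of the pen height (hovering within detection range moves the pointer and selects, dropping below the minimum distance or touching actuates) might be modelled as a small state function; the threshold values are invented for illustration:

```python
def interpret_pen(z_mm, hover_limit_mm=10.0, touch_limit_mm=0.5):
    """Map the pen's height above the sensor board to an input event."""
    if z_mm <= touch_limit_mm:
        return "actuate"   # contact (or nearly so): trigger the softkey
    if z_mm <= hover_limit_mm:
        return "select"    # within range: move pointer, mark object
    return "idle"          # pen too far away to be detected
```

A pen held a few millimetres above the board thus only selects; the assigned control process runs only once the pen touches down.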
  • Reference is now made to FIG. 3, in which a section of a user interface is shown, on the basis of which the selection and actuation of a control object according to a first embodiment is explained.
  • such a section of a user interface can be arranged in the display DSP shown in FIG. 1 instead of the 12 control objects shown there (digits "0-9", "*" symbol, "#" symbol).
  • the control objects (softkey keys) are arranged in a matrix arrangement consisting of 11 columns and 4 rows. More precisely, the control objects are arranged in a predetermined section of the display or user interface DSP, namely a so-called software-generated key section or softkey section SKA.
  • the dashed line at the upper section of the frame representing the display DSP is intended to illustrate that the display DSP is not limited to the section shown, but can, for example, still be continued upwards.
  • the individual control objects (all of which, including the selected control element, are designated with the reference symbol SO) have a rectangular shape, so that they fit optimally, i.e. without wasting display space, into the section SKA.
  • the arrow P can be controlled by means of the inductive pen ST (FIG. 2) such that it can be moved over the entire display DSP, in particular the area SKA. If the arrow P is moved over a control object SO (in the following, the terms "control object" and "actuating surface" refer to the same surface sections on the display), then the control object SO over which the arrow P is located is automatically marked as the selected control object ASO.
  • the column width of the entire column SP1 has been expanded (to approximately twice its original width) in order to give the user of the pen ST a larger contact or actuating surface for actuating the character "Y".
  • in addition, the display attribute of the control object "Y" has been changed compared to the non-selected control objects: in this case, the character "Y" is displayed in bold to distinguish it from the non-selected control elements.
  • other possibilities are conceivable to differentiate the selected object ASO from the non-selected control objects.
  • the control object "X” is selected automatically when the arrow head of the arrow P is swept over the actuating surface of the control object "X".
  • the column width of the column SP1 is then reduced to the amount (first size) shown for the column SP2 in FIG. 3, while the column SP2 now assumes the width (second size) shown in FIG. 3 for the column SP1.
  • the control object "X" is displayed in a bold image format.
  • a corresponding change in the column widths occurs when the user selects, for example, the control object "Y" again or another character from another column. This means that in addition to the widening of the column in which the selected control object is located, the selected control object is additionally emphasized by a changed display attribute compared to the other control objects.
  • if the control objects are arranged in a softkey section whose height is greater than its width, it is also conceivable to transfer the method of widening the column width shown in FIG. 3 to an extension of the row height of the row in which the selected object is located.
  • FIG. 4 shows a section of a user interface, here only the softkey section SKA in which the software-generated control objects (keys) SO are provided, to explain a method for selecting the control objects according to a second embodiment.
  • the control object "G" arranged in the sixth column SP6 and the third row ZE3 is marked as the selected control object ASO.
  • the column width of the column in which the selected control object ASO is located is enlarged compared to the other columns for better visualization and actuation.
  • the row height of the selected control object ASO is also increased. More precisely, as can be seen in FIG. 4, only the row height of the selected object ASO itself is increased, while the row heights of the control objects located in the same column (SP6) are reduced. In this way, the selected control object ASO is substantially enlarged compared to the non-selected control objects, providing clear visual highlighting of the selected object ASO and thus easy and comfortable operation of this object by a user, for example with the above-mentioned pen ST.
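The row-height redistribution of this second embodiment (the selected cell's row grows while the other rows in its column shrink, the total column height staying constant) can be sketched like this; the share-based rule and the expansion factor are assumptions for illustration:

```python
def cell_sizes(n_rows, total_h, sel_row, expand=2.0):
    """Row heights within the selected column: the selected cell
    takes `expand` shares of the column height, the other cells one
    share each, so the column height is preserved."""
    shares = [expand if r == sel_row else 1.0 for r in range(n_rows)]
    s = sum(shares)
    return [total_h * sh / s for sh in shares]
```

With four rows and the third row selected, that row takes twice the height of each other row while the column still sums to its original height.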
  • the pointing object P, more precisely the tip of the pointing object P, is located approximately at the vertical center of the selected control object ASO; in this position of the pointing object, the selected control object ASO still has its maximum row height. If, on the other hand, the pointing object P is moved from this central position by a certain amount vertically upwards in the direction of the control object "Z", the row height of the selected control object "H" is reduced and the row height of the control object "Z" is increased at the expense of the row height of the object "H". As shown in FIG. 6, the pointing object P is already located on an upper section of the control object "H" and is therefore relatively close to the control object "Z" that is to be newly selected.

Abstract

The invention relates to a method for controlling a communication device, in which a plurality of control objects (SO) are displayed on a display (DSP) of the communication device. These objects, each of which triggers a predefined control process when actuated, have an actuation surface of a respective first size. When the user selects (ASO) one of these control objects, the size of the actuation surface of the selected control object is increased relative to said first size. The user thus obtains an improved visual representation of the selected object and, in addition, has a larger actuation surface with which to actuate the desired selected control object.
PCT/EP2003/050037 2002-03-11 2003-02-28 Procede de commande d'un appareil de communication et appareil de communication commande selon ce procede WO2003077108A2 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
AU2003219141A AU2003219141A1 (en) 2002-03-11 2003-02-28 Method for controlling a communication device and communication device controlled thus

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE10210637A DE10210637A1 (de) 2002-03-11 2002-03-11 Verfahren zum Steuern eines Kommunikationsgeräts und dadurch gesteuertes Kommunikationsgerät
DE10210637.1 2002-03-11

Publications (2)

Publication Number Publication Date
WO2003077108A2 true WO2003077108A2 (fr) 2003-09-18
WO2003077108A3 WO2003077108A3 (fr) 2004-06-24

Family

ID=27797656

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2003/050037 WO2003077108A2 (fr) 2002-03-11 2003-02-28 Procede de commande d'un appareil de communication et appareil de communication commande selon ce procede

Country Status (3)

Country Link
AU (1) AU2003219141A1 (fr)
DE (1) DE10210637A1 (fr)
WO (1) WO2003077108A2 (fr)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9092134B2 (en) * 2008-02-04 2015-07-28 Nokia Technologies Oy User touch display interface providing an expanded selection area for a user selectable object
US8381118B2 (en) 2009-10-05 2013-02-19 Sony Ericsson Mobile Communications Ab Methods and devices that resize touch selection zones while selected on a touch sensitive display

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0880090A2 (fr) * 1997-04-28 1998-11-25 Nokia Mobile Phones Ltd. Fonction de d'agrandissement automatique de symboles dans une station mobile avec entrée de données tactile
WO2001046790A2 (fr) * 1999-12-20 2001-06-28 Apple Computer, Inc. Interface utilisateur offrant une consolidation et un acces ameliores

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
"Expandable Targets for Efficient Selection via a Screen Cursor", IBM Technical Disclosure Bulletin, IBM Corp., New York, US, Vol. 35, No. 3, 1 August 1992 (1992-08-01), pp. 438-439, XP000326335, ISSN 0018-8689 *

Cited By (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2005024620A3 (fr) * 2003-09-11 2005-05-26 Siemens Ag Dispositif d'affichage comportant des dispositifs pour generer et detecter un champ magnetique, et appareil utilisant un tel dispositif
WO2005024620A2 (fr) * 2003-09-11 2005-03-17 Siemens Aktiengesellschaft Dispositif d'affichage comportant des dispositifs pour generer et detecter un champ magnetique, et appareil utilisant un tel dispositif
EP1912114A3 (fr) * 2006-10-09 2012-12-12 Robert Bosch Gmbh Unité de commande et procédé de représentation d'un clavier
EP1933229A1 (fr) * 2006-12-14 2008-06-18 BrainLAB AG Procédé et dispositif destinés à la représentation et l'utilisation d'une surface d'utilisation sur une signalisation
WO2009074278A1 (fr) * 2007-12-11 2009-06-18 Nokia Corporation Dispositif et procédé pour la saisie de caractères combinés
EP2112583A1 (fr) * 2008-04-22 2009-10-28 HTC Corporation Procédé et appareil pour parcourir des informations et support d'enregistrement les utilisant
CN103294397A (zh) * 2008-04-30 2013-09-11 宏达国际电子股份有限公司 使用者界面显示区域的调整方法及装置
CN103294397B (zh) * 2008-04-30 2016-03-23 宏达国际电子股份有限公司 使用者界面显示区域的调整方法及装置
EP2163975A3 (fr) * 2008-09-12 2010-10-27 Sony Corporation Traitement d'informations
US8471825B2 (en) 2008-09-12 2013-06-25 Sony Corporation Information processing apparatus, information processing method and computer program
US8860680B2 (en) 2008-09-12 2014-10-14 Sony Corporation Information processing apparatus, information processing method and computer program
US9569106B2 (en) 2008-09-12 2017-02-14 Sony Corporation Information processing apparatus, information processing method and computer program
WO2010032190A1 (fr) * 2008-09-18 2010-03-25 Koninklijke Philips Electronics N.V. Procédé et appareil d'affichage d'éléments d'interface utilisateur sélectionnables
EP2199898A1 (fr) 2008-12-22 2010-06-23 Research In Motion Limited Dispositif électronique portable doté d'un écran tactile et son procédé de commande
US8121652B2 (en) 2008-12-22 2012-02-21 Research In Motion Limited Portable electronic device including touchscreen and method of controlling the portable electronic device

Also Published As

Publication number Publication date
DE10210637A1 (de) 2003-10-09
AU2003219141A1 (en) 2003-09-22
WO2003077108A3 (fr) 2004-06-24

Similar Documents

Publication Publication Date Title
DE69937592T2 (de) Verfahren und Vorrichtung zur Zeicheneingabe mit virtueller Tastatur
DE69838052T2 (de) Tragbares Kommunikationsgerät
DE60212976T2 (de) Verfahren und Benutzerschnittstelle zur Zeicheneingabe
DE19849515C1 (de) Verfahren zur Übergabe von Zeichen insbesondere an einen Computer und Eingabevorrichtung unter Einsatz dieses Verfahrens
DE60022030T2 (de) Kommunikationssystem und -verfahren
EP1262740B1 (fr) Système d'ordinateur véhiculaire et procédé de commande d'un curseur pour système d'ordinateur véhiculaire
EP2169522B1 (fr) Procédé et dispositif destinés à la saisie de textes
DE112013004437B4 (de) Verfahren zum Definieren einer Eingebetaste auf einer Tastatur und Verfahren zur Interpretation von Tastenanschlägen
WO2003077108A2 (fr) Procede de commande d'un appareil de communication et appareil de communication commande selon ce procede
DE102006017486A1 (de) Elektronische Vorrichtung und Verfahren zum Vereinfachen einer Texteingabe unter Verwendung einer Soft-Tastatur
DE102008007243A1 (de) Endgerät und Verfahren zur Anzeige eines Menüs
DE10357475A1 (de) Kommunikationsvorrichtung und Verfahren zum Eingeben und Vorhersagen von Text
DE10209797A1 (de) Optische Pseudo-Rollkugel zur Steuerung des Betriebs einer Vorrichtung oder Maschine
EP2652584A1 (fr) Système pourvu d'une unité de reconnaissance de gestes
DE10201195B4 (de) Verfahren zur Texteingabe durch Auswahl von Buchstaben mittels eines Cursors und Einrichtung zur Durchführung des Verfahrens
DE10140874A1 (de) Graphische Benutzeroberfläche
EP1912114A2 (fr) Unité de commande et procédé de représentation d'un clavier
WO2006003087A2 (fr) Procede pour entrer des caracteres dans un appareil de communication mobile et appareil de communication mobile concu a cette fin
DE19713027A1 (de) Mischeinrichtung mit einem Mischer für Videosignale
DE102006020568A1 (de) Anzeigevorrichtung zur taktil erfassbaren Darstellung von Anzeigeelementen sowie Anzeigesystem mit einer solchen Anzeigevorrichtung
EP0262548B1 (fr) Méthode et dispositif d'affichage de textes munis de codes de commande y appartenant
DE102006010229B4 (de) Verfahren und Vorrichtung zur Eingabe von Zeichen
WO2000079371A1 (fr) Dispositif de saisie et procede permettant de faire fonctionner ledit dispositif
DE10234203B4 (de) Tastatur als erste Baueinheit für einen eine zweite Baueinheit bildenden Rechner sowie Verwendung eines Telefons als Dateneingabevorrichtung
EP0831391B1 (fr) Système et methode à afficer des charactères alphanumeriques et/ou graphiques

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A2

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CO CR CU CZ DK DM DZ EC EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NO NZ OM PH PL PT RO RU SC SD SE SG SK SL TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW

AL Designated countries for regional patents

Kind code of ref document: A2

Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LU MC NL PT SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)
122 Ep: pct application non-entry in european phase
NENP Non-entry into the national phase

Ref country code: JP

WWW Wipo information: withdrawn in national office

Country of ref document: JP