EP2486472A1 - Modifying the value of a field by touch gesture or virtual keyboard - Google Patents

Modifying the value of a field by touch gesture or virtual keyboard

Info

Publication number
EP2486472A1
Authority
EP
European Patent Office
Prior art keywords
field
touch
sensitive display
sequential list
values
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP10821510A
Other languages
German (de)
English (en)
Other versions
EP2486472A4 (fr)
Inventor
Earl John Wikkerink
Michael George Langlois
Michael Thomas Hardy
Yoojin Hong
Rohit Rocky Jain
Raymond Emmanuel Mendoza
Oriin Stoev
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
BlackBerry Ltd
Original Assignee
Research in Motion Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Research in Motion Ltd
Publication of EP2486472A1
Publication of EP2486472A4

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04847Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/0485Scrolling or panning
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04886Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus

Definitions

  • The present disclosure relates to computing devices, and in particular to portable electronic devices having touch screen displays and their control.
  • Portable electronic devices include, for example, several types of mobile stations such as simple cellular telephones, smart telephones, wireless personal digital assistants (PDAs), and laptop computers with wireless 802.11 or Bluetooth™ capabilities.
  • Portable electronic devices such as PDAs or smart telephones are generally intended for handheld use and ease of portability. Smaller devices are generally desirable for portability.
  • A touch-sensitive display, also known as a touchscreen display, is particularly useful on handheld devices, which are small and have limited space for user input and output.
  • The information displayed on the touch-sensitive displays may be modified depending on the functions and operations being performed. Performing repetitive actions on touch-sensitive displays while maintaining an efficient graphical user interface is a challenge for portable electronic devices having touch-sensitive displays.
  • Figure 1 is a simplified block diagram of components including internal components of a portable electronic device according to one aspect;
  • Figure 2 is a front view of an example of a portable electronic device in a portrait orientation;
  • Figure 3A is a sectional side view of portions of the portable electronic device of Figure 2;
  • Figure 3B is a side view of a portion of the portable electronic device shown in Figure 3A;
  • Figure 4 is a front view of an example of a portable electronic device in a portrait orientation, showing hidden detail in ghost outline;
  • Figure 5 is a block diagram of a circuit for controlling the actuators of the portable electronic device in accordance with one example embodiment of the present disclosure;
  • Figures 6A and 6B are schematic diagrams of a user interface screen in accordance with one example embodiment of the present disclosure;
  • Figure 7 is a schematic diagram of a user interface screen in accordance with one example embodiment of the present disclosure;
  • Figure 8 is a schematic diagram of a user interface screen in accordance with one example embodiment of the present disclosure;
  • Figure 9 is a screen capture of a user interface screen in accordance with one example embodiment of the present disclosure;
  • Figure 10 is a flowchart illustrating an example of a method of controlling touch input on a touch-sensitive display when a display element is active in accordance with one example embodiment of the present disclosure;
  • Figures 11A and 11B are screen captures of a user interface screen in accordance with other example embodiments of the present disclosure;
  • Figures 12A to 12C are screen captures of a widget for the user interface screen of Figure 11A or 11B; and
  • Figures 13A to 13F are screen captures of a time widget in accordance with one example embodiment of the present disclosure.
  • The present disclosure provides a method of controlling touch input on a touch-sensitive display when a display element is active, and a portable electronic device configured for the same. Precise targeting is difficult when using a touch-sensitive display, particularly when swiping on or over small onscreen targets.
  • The present disclosure provides a mechanism for gross targeting rather than precise targeting when interacting with an active display element such as a selected field.
  • The present disclosure describes, in at least some embodiments, a method and portable electronic device in which a swipe gesture anywhere on the touch-sensitive display changes the value of an active display element (e.g., incrementing or decrementing the value of a field which has been selected).
  • The present disclosure may be particularly useful when swiping on or over a "spin dial" or "spin box" to change its value.
  • The method and portable electronic device taught by the present disclosure seek to reduce the targeting which is required before swiping. This can reduce the number of erroneous inputs generated when interacting with the touch-sensitive display, which are inefficient in terms of processing resources, use unnecessary power which reduces battery life, and may result in an unresponsive user interface. Accordingly, the method and portable electronic device taught by the present disclosure seek to provide improvements in these areas.
  • The ability to interact with the selected field using other parts of the touch-sensitive display provides a larger area for interaction in which touch gestures can be performed, and provides a method of interacting with the selected field which does not obscure that field.
  • A method of controlling touch input on a touch-sensitive display of a portable electronic device, comprising: displaying a widget having at least one field on a user interface screen displayed on the touch-sensitive display; selecting a field in the widget in response to predetermined interaction with the touch-sensitive display; changing the value of the selected field in accordance with a predetermined touch gesture at any location on the touch-sensitive display; and re-displaying the widget on the user interface screen with the changed value of the selected field.
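The claimed flow above can be sketched in code. This is a minimal illustration only: the class names (`SpinField`, `Widget`), the method names, and the wrap-around behaviour are assumptions for the sketch, not part of the patent text.

```python
# Sketch of the claimed flow: select a field in a widget, then let a swipe
# anywhere on the display change that field's value. All names here are
# illustrative assumptions.

class SpinField:
    """A field whose value cycles through a sequential list of values."""
    def __init__(self, values, index=0):
        self.values = list(values)
        self.index = index

    @property
    def value(self):
        return self.values[self.index]

    def step(self, delta):
        # Wrap around the sequential list (e.g. hours 0-23).
        self.index = (self.index + delta) % len(self.values)

class Widget:
    def __init__(self, fields):
        self.fields = fields          # e.g. {"hour": SpinField(range(24))}
        self.selected = None

    def select(self, name):
        # "Selecting a field in response to predetermined interaction".
        self.selected = self.fields[name]

    def on_swipe(self, direction):
        # "Changing the value ... at any location on the touch-sensitive
        # display": the swipe location is ignored; only its direction
        # matters once a field is selected.
        if self.selected is not None:
            self.selected.step(+1 if direction == "up" else -1)

w = Widget({"hour": SpinField(range(24), index=23)})
w.select("hour")
w.on_swipe("up")                 # 23 wraps around to 0
print(w.fields["hour"].value)    # -> 0
```

The key point the sketch captures is that once a field is selected, the gesture handler never tests whether the swipe started on the field itself — any swipe on the display drives the selected field.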
  • A portable electronic device comprising: a processor; and a touch-sensitive display having a touch-sensitive overlay connected to the processor;
  • wherein the processor is configured for: causing a widget having at least one field to be displayed on a user interface screen displayed on the touch-sensitive display; selecting a field in the widget in response to predetermined interaction with the touch-sensitive display; changing the value of the selected field in accordance with a predetermined touch gesture at any location on the touch-sensitive display; and causing the widget to be re-displayed on the user interface screen with the changed value of the selected field.
  • A computer program product comprising a computer readable medium having stored thereon computer program instructions for implementing a method on a portable electronic device for controlling its operation, the computer executable instructions comprising instructions for performing the method(s) set forth herein.
  • The disclosure generally relates to an electronic device, which is a portable electronic device in the embodiments described herein.
  • Portable electronic devices include mobile, or handheld, wireless communication devices such as pagers, cellular phones, cellular smart-phones, wireless organizers, personal digital assistants, wirelessly enabled notebook computers, and so forth.
  • The portable electronic device may also be a portable electronic device without wireless communication capabilities, such as a handheld electronic game device, digital photograph album, digital camera, or other device.
  • A block diagram of an example of a portable electronic device 100 is shown in Figure 1.
  • The portable electronic device 100 includes multiple components, including a processor 102 that controls the overall operation of the portable electronic device 100.
  • The communication subsystem 104 receives messages from and sends messages to a wireless network 150.
  • The wireless network 150 may be any type of wireless network, including, but not limited to, data wireless networks, voice wireless networks, and networks that support both voice and data communications.
  • A power source 142, such as one or more rechargeable batteries or a port to an external power supply, powers the portable electronic device 100.
  • The processor 102 interacts with other components, such as Random Access Memory (RAM) 116, memory 110, a display screen 112 (such as a liquid crystal display (LCD)) with a touch-sensitive overlay 114 operably connected to an electronic controller 116 that together comprise a touch-sensitive display 118, one or more actuators 120, one or more force sensors 122, one or more auxiliary input/output (I/O) subsystems 124, a data port 126, a speaker 128, a microphone 130, a short-range communications subsystem 132, and other device subsystems 134.
  • User-interaction with a graphical user interface (GUI) is performed through the touch-sensitive overlay 114.
  • The processor 102 interacts with the touch-sensitive overlay 114 via the electronic controller 116.
  • Information such as text, characters, symbols, images, icons, and other items that may be displayed or rendered on a portable electronic device is displayed on the touch-sensitive display 118 via the processor 102.
  • The processor 102 may interact with an accelerometer 136 that may be utilized to detect direction of gravitational forces or gravity-induced reaction forces.
  • The portable electronic device 100 uses a Subscriber Identity Module or a Removable User Identity Module (SIM/RUIM) card 138 for communication with a network, such as the wireless network 150.
  • User identification information may be programmed into memory 110.
  • The portable electronic device 100 includes an operating system 146 and software applications or programs 148 that are executed by the processor 102 and are typically stored in a persistent, updatable store such as the memory 110. Additional applications or programs 148 may be loaded onto the portable electronic device 100 through the wireless network 150, the auxiliary I/O subsystem 124, the data port 126, the short-range communications subsystem 132, or any other suitable subsystem 134.
  • A received signal such as a text message, an e-mail message, or web page download is processed by the communication subsystem 104 and input to the processor 102.
  • The processor 102 processes the received signal for output to the display screen 112 and/or to the auxiliary I/O subsystem 124.
  • A subscriber may generate data items, for example e-mail messages, which may be transmitted over the wireless network 150 through the communication subsystem 104.
  • The speaker 128 outputs audible information converted from electrical signals, and the microphone 130 converts audible information into electrical signals for processing.
  • FIG. 2 shows a front view of an example of a portable electronic device 100 in portrait orientation.
  • The portable electronic device 100 includes a housing 200 that houses the internal components shown in Figure 1 and frames the touch-sensitive display 118 such that the touch-sensitive display 118 is exposed for user-interaction therewith when the portable electronic device 100 is in use.
  • the touch-sensitive display 118 may include any suitable number of user-selectable features rendered thereon, for example, in the form of virtual buttons for user-selection of, for example, applications, options, or keys of a keyboard for user entry of data during operation of the portable electronic device 100.
  • the touch-sensitive display 118 may be any suitable touch-sensitive display, such as a capacitive, resistive, infrared, surface acoustic wave (SAW) touch-sensitive display, strain gauge, optical imaging, dispersive signal technology, acoustic pulse recognition, and so forth, as known in the art.
  • A capacitive touch-sensitive display includes a capacitive touch-sensitive overlay 114.
  • The overlay 114 may be an assembly of multiple layers in a stack including, for example, a substrate, a ground shield layer, a barrier layer, one or more capacitive touch sensor layers separated by a substrate or other barrier, and a cover.
  • The capacitive touch sensor layers may be any suitable material, such as patterned indium tin oxide (ITO).
  • One or more touches may be detected by the touch-sensitive display 118.
  • The processor 102 may determine attributes of the touch, including a location of a touch.
  • Touch location data may include an area of contact or a single point of contact, such as a point at or near a centre of the area of contact.
  • The location of a detected touch may include x and y components, e.g., horizontal and vertical components, respectively, with respect to one's view of the touch-sensitive display 118.
  • The x location component may be determined by a signal generated from one touch sensor, and the y location component may be determined by a signal generated from another touch sensor.
  • A signal is provided to the controller 116 in response to detection of a touch.
  • A touch may be detected from any suitable object, such as a finger, thumb, appendage, or other items, for example, a stylus, pen, or other pointer, depending on the nature of the touch-sensitive display 118. Multiple simultaneous touches may be detected.
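One common way to reduce an area of contact to a single reported point, as described above, is to take the centroid of the activated sensor cells. The grid-of-booleans model below is an illustrative assumption; real capacitive controllers typically report per-node signal strengths rather than binary contact.

```python
# Reduce an area of contact to a single touch point by taking the centroid
# of the activated sensor cells (a point at or near the centre of the
# contact area). The binary grid model is an assumption for illustration.

def touch_centroid(grid):
    """grid[y][x] is True where the overlay senses contact.
    Returns (x, y) at or near the centre of the contact area, or None."""
    points = [(x, y)
              for y, row in enumerate(grid)
              for x, cell in enumerate(row) if cell]
    if not points:
        return None
    n = len(points)
    return (sum(x for x, _ in points) / n,
            sum(y for _, y in points) / n)

# A 3x3 blob of contact whose centre is at x=2, y=1:
grid = [
    [False, True,  True,  True],
    [False, True,  True,  True],
    [False, True,  True,  True],
]
print(touch_centroid(grid))   # -> (2.0, 1.0)
```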
  • The actuator(s) 120 may be depressed by applying sufficient force to the touch-sensitive display 118 to overcome the actuation force of the actuator 120.
  • The actuator 120 may be actuated by pressing anywhere on the touch-sensitive display 118.
  • The actuator 120 may provide input to the processor 102 when actuated. Actuation of the actuator 120 may result in provision of tactile feedback.
  • The actuators 120 may comprise one or more piezoelectric devices that provide tactile feedback for the touch-sensitive display 118. Contraction of the piezoelectric actuators applies a spring-like force, for example, opposing a force externally applied to the touch-sensitive display 118.
  • Each piezoelectric actuator includes a piezoelectric device, such as a piezoelectric (PZT) ceramic disk adhered to a metal substrate.
  • The metal substrate bends when the PZT disk contracts due to build-up of charge at the PZT disk or in response to a force, such as an external force applied to the touch-sensitive display 118.
  • The charge may be adjusted by varying the applied voltage or current, thereby controlling the force applied by the piezoelectric disks.
  • The charge on the piezoelectric actuator may be removed by a controlled discharge current that causes the PZT disk to expand, releasing the force and thereby decreasing the force applied by the piezoelectric disks.
  • The charge may advantageously be removed over a relatively short period of time to provide tactile feedback to the user.
  • The piezoelectric disk may be slightly bent due to a mechanical preload.
  • The housing 200 can be any suitable housing for the internal components shown in Figure 1.
  • Figure 3A shows a sectional side view of portions of the portable electronic device 100 and
  • Figure 3B shows a side view of a portion of the actuators 120.
  • The housing 200 in the present example includes a back 302, a frame 304, which frames the touch-sensitive display 118, and sidewalls 306 that extend between and generally perpendicular to the back 302 and the frame 304.
  • A base 308 is spaced from and is generally parallel to the back 302.
  • The base 308 can be any suitable base and can include, for example, a printed circuit board or flexible circuit board supported by a stiff support between the base 308 and the back 302.
  • The back 302 may include a plate (not shown) that is releasably attached for insertion and removal of, for example, the power source 142 and the SIM/RUIM card 138 referred to above. It will be appreciated that the back 302, the sidewalls 306 and the frame 304 may be injection molded, for example. In the example of the portable electronic device 100 shown in Figure 2, the frame 304 is generally rectangular with rounded corners, although other shapes are possible.
  • The display screen 112 and the touch-sensitive overlay 114 are supported on a support tray 310 of suitable material, such as magnesium, for providing mechanical support to the display screen 112 and touch-sensitive overlay 114.
  • A compliant spacer, such as a compliant gasket 312, is located around the perimeter of the frame 304, between an upper portion of the support tray 310 and the frame 304, to provide a gasket for protecting the components housed in the housing 200 of the portable electronic device 100.
  • A suitable material for the compliant gasket 312 includes, for example, a cellular urethane foam for providing shock absorption, vibration damping and a suitable fatigue life. In some embodiments, a number of compliant spacers may be provided to provide the function of the compliant gasket 312.
  • The actuators 120 include four piezoelectric disk actuators 314, as shown in Figure 4, with each piezoelectric disk actuator 314 located near a respective corner of the touch-sensitive display 118.
  • Each piezoelectric disk actuator 314 is supported on a respective support ring 316 that extends from the base 308 toward the touch-sensitive display 118 for supporting the respective piezoelectric disk actuator 314 while permitting flexing of the piezoelectric disk actuator 314.
  • Each piezoelectric disk actuator 314 includes a piezoelectric disk 318, such as a PZT ceramic disk, adhered to a metal substrate 320 of larger diameter than the piezoelectric disk 318 for bending when the piezoelectric disk 318 contracts as a result of build-up of charge at the piezoelectric disk 318.
  • Each piezoelectric disk actuator 314 is supported on the respective support ring 316 on one side of the base 308, near respective corners of the metal substrate 320, base 308 and housing 200.
  • The support ring 316 is sized such that the edge of the metal substrate 320 contacts the support ring 316 for supporting the piezoelectric disk actuator 314 and permitting flexing of the piezoelectric disk actuator 314.
  • A shock-absorbing element 322, which in the present example is in the form of a cylindrical shock-absorber of suitable material such as a hard rubber, is located between the piezoelectric disk actuator 314 and the support tray 310.
  • A respective force sensor 122 is located between each shock-absorbing element 322 and the respective piezoelectric disk actuator 314.
  • A suitable force sensor 122 includes, for example, a puck-shaped force sensing resistor for measuring applied force (or pressure). It will be appreciated that a force can be determined using a force sensing resistor because an increase in pressure on the force sensing resistor results in a decrease in resistance (or increase in conductance).
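Since the conductance of a force sensing resistor rises roughly in proportion to applied force over its working range, firmware can estimate the force from a resistance measurement. A minimal sketch follows; the calibration constant is a hypothetical value for illustration, not one from the patent.

```python
# Estimate applied force from a force sensing resistor (FSR) reading.
# As described above, increasing pressure decreases resistance, i.e.
# increases conductance, and conductance is roughly proportional to force.
# The calibration factor below is an assumed value for illustration.

K_NEWTONS_PER_SIEMENS = 5_000.0   # hypothetical calibration factor

def force_from_resistance(r_ohms):
    """Return an estimated force in newtons from FSR resistance in ohms."""
    if r_ohms <= 0:
        raise ValueError("resistance must be positive")
    conductance = 1.0 / r_ohms            # siemens
    return K_NEWTONS_PER_SIEMENS * conductance

# Pressing harder halves the resistance and so doubles the estimated force:
print(force_from_resistance(10_000))   # -> 0.5
print(force_from_resistance(5_000))    # -> 1.0
```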
  • Each piezoelectric disk actuator 314 is located between the base 308 and the support tray 310, and force is applied on each piezoelectric disk actuator 314 by the touch-sensitive display 118, in the direction of the base 308, causing bending of the piezoelectric disk actuator 314.
  • The piezoelectric disk actuator 314 undergoes slight bending.
  • The piezoelectric disk 318 shrinks and causes the metal substrate 320 and piezoelectric disk 318 to apply a further force, opposing the externally applied force, on the touch-sensitive display 118 as the piezoelectric disk actuator 314 straightens.
  • Each of the piezoelectric disk actuators 314, shock absorbing elements 322 and force sensors 122 are supported on a respective one of the support rings 316 on one side of the base 308.
  • The support rings 316 can be part of the base 308 or can be supported on the base 308.
  • One side of the base 308 can be a printed circuit board while the opposing side of the base 308 provides mechanical support and electrical connection for other components (not shown) of the portable electronic device 100.
  • Each piezoelectric disk actuator 314 is located between the base 308 and the support tray 310 such that an external applied force on the touch-sensitive display 118 resulting from a user pressing the touch-sensitive display 118 can be measured by the force sensors 122 and such that the charging of the piezoelectric disk actuator 314 causes a force on the touch-sensitive display 118, away from the base 308.
  • Each piezoelectric disk actuator 314 is in contact with the support tray 310.
  • Depression of the touch-sensitive display 118 by user application of a force thereto is determined by a change in resistance at the force sensors 122 and causes further bending of the piezoelectric disk actuators 314, as shown in Figure 3A.
  • The charge on the piezoelectric disk actuator 314 can be modulated to control the force applied by the piezoelectric disk actuator 314 on the support tray 310 and the resulting movement of the touch-sensitive display 118.
  • The charge can be modulated by modulating the applied voltage or current.
  • A current can be applied to increase the charge on the piezoelectric disk actuator 314 to cause the piezoelectric disk 318 to contract and to thereby cause the metal substrate 320 and the piezoelectric disk 318 to straighten, as referred to above.
  • This charge therefore results in a force on the touch-sensitive display 118 opposing the externally applied force, and in movement of the touch-sensitive display 118 away from the base 308.
  • The charge on the piezoelectric disk actuator 314 can also be removed via a controlled discharge current, causing the piezoelectric disk 318 to expand again, releasing the force caused by the electric charge and thereby decreasing the force on the touch-sensitive display 118, permitting the touch-sensitive display 118 to return to a rest position.
  • FIG. 5 shows a circuit for controlling the actuators 120 of the portable electronic device 100 according to one embodiment. As shown, each of the piezoelectric disks 318 is connected to a controller 500, such as a microprocessor, including a piezoelectric driver 502 and an amplifier and analog-to-digital converter (ADC) 504 that is connected to each of the force sensors 122 and to each of the piezoelectric disks 318.
  • The ADC 504 is a 9-channel ADC.
  • the controller 500 is also in communication with the main processor 102 of the portable electronic device 100. The controller 500 can provide signals to the main processor 102 of the portable electronic device 100.
  • The piezoelectric driver 502 may be embodied in drive circuitry between the controller 500 and the piezoelectric disks 318.
  • The mechanical work performed by the piezoelectric disk actuator 314 can be controlled to provide generally consistent force and movement of the touch-sensitive display 118 in response to detection of an applied force on the touch-sensitive display 118 in the form of a touch, for example. Fluctuations in mechanical work performed as a result of, for example, temperature can be reduced by modulating the current to control the charge.
  • The controller 500 controls the piezoelectric driver 502 for controlling the current to the piezoelectric disks 318, thereby controlling the charge.
  • The charge is increased to increase the force on the touch-sensitive display 118 away from the base 308, and decreased to decrease the force on the touch-sensitive display 118, facilitating movement of the touch-sensitive display 118 toward the base 308.
  • Each of the piezoelectric disk actuators 314 is connected to the controller 500 through the piezoelectric driver 502, and all are controlled equally and concurrently. Alternatively, the piezoelectric disk actuators 314 can be controlled separately.
  • The portable electronic device 100 is controlled generally by the processor 102.
  • The force is applied by at least one of the piezoelectric disk actuators 314, in a single direction, on the touch-sensitive input surface of the touch-sensitive display 118.
  • The charge at each of the piezoelectric disks 318 is modulated to modulate the force applied by the piezoelectric disk actuators 314 on the touch-sensitive display 118 and to thereby cause movement of the touch-sensitive display 118 for simulating the collapse of a dome-type switch.
  • The charge at each of the piezoelectric disks 318 is likewise modulated to modulate the force applied by the piezoelectric disk actuators 314 to the touch-sensitive display 118 to cause movement of the touch-sensitive display 118 for simulating release of a dome-type switch.
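The press/release behaviour described above amounts to modulating the piezo charge when the measured force crosses thresholds. The sketch below illustrates that control loop; the threshold values, charge levels, and driver callback are all assumptions for illustration, not details from the patent.

```python
# Sketch of simulating a dome-type switch with a piezoelectric actuator:
# when measured force crosses a "press" threshold, the charge is modulated
# so the display collapses slightly under the finger (simulated click);
# when force drops below a "release" threshold, the charge is modulated
# again to simulate the dome springing back. All numeric values and the
# driver interface are illustrative assumptions.

PRESS_THRESHOLD = 1.5     # newtons (hypothetical)
RELEASE_THRESHOLD = 0.5   # newtons (hypothetical)

class ClickSimulator:
    def __init__(self, set_charge):
        self.set_charge = set_charge  # callback into the piezo driver
        self.pressed = False
        self.events = []

    def on_force_sample(self, force):
        if not self.pressed and force >= PRESS_THRESHOLD:
            self.pressed = True
            self.set_charge(0.2)          # reduce opposing force: "collapse"
            self.events.append("press")
        elif self.pressed and force <= RELEASE_THRESHOLD:
            self.pressed = False
            self.set_charge(1.0)          # restore force: "release"
            self.events.append("release")

sim = ClickSimulator(set_charge=lambda level: None)
for f in [0.0, 0.8, 1.6, 2.0, 1.0, 0.3]:
    sim.on_force_sample(f)
print(sim.events)   # -> ['press', 'release']
```

The two distinct thresholds give the simulated switch hysteresis, so a force hovering near a single threshold does not produce a stream of spurious clicks.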
  • the touch-sensitive display 118 is moveable within the housing 200 as the touch-sensitive display 118 can be moved away from the base 308, thereby compressing the compliant gasket 312, for example. Further, the touch-sensitive display 118 can be moved toward the base 308, thereby applying a force to the piezoelectric disk actuators 314. By this arrangement, the touch-sensitive display 118 is mechanically constrained by the housing 200 and resiliently biased by the compliant gasket compliant 312. In at least some embodiments, the touch- sensitive display 118 is resiliently biased and moveable between at least a first position and a second position in response to externally applied forces wherein the touch-sensitive display 118 applies a greater force to the force sensors 122 in the second position than in the first position.
  • the movement of the touch-sensitive display 118 in response to externally applied forces is detected by the force sensors 122.
  • the analog-to-digital converter 504 is connected to the piezoelectric disks 318.
  • an output, such as a voltage output, from a charge created at each piezoelectric disk 318 may be measured based on signals received at the analog-to-digital converter 504.
  • a voltage signal, which is proportional to the charge, is measured to determine the extent of the mechanical deformation.
  • the piezoelectric disks 318 also act as sensors for determining mechanical deformation.
  • the actuator 120 is a mechanical dome-type switch or a plurality of mechanical dome-type switches, which can be located in any suitable position such that displacement of the touch-sensitive display 118 resulting from a user pressing the touch-sensitive display 118 with sufficient force to overcome the bias and to overcome the actuation force for the switch, depresses and actuates the switch.
  • Figures 6A and 6B are schematic diagrams of a user interface screen 601 in accordance with one example embodiment of the present disclosure.
  • the screen 601 may be for any application 148 on the device 100 including, but not limited to, a clock application or calendar application.
  • a control interface in the form of a widget 606 is displayed on the display 112 in response to predetermined interaction with the screen 601 via the touch-sensitive overlay 114.
  • the widget 606 overlays a portion of the screen 601.
  • the widget 606 may be embedded or provided inline within the content of screen 601.
  • the widget 606 may be a date selection widget, time selection widget or date and time selection widget for managing the date and/or time of the operating system 146 or managing the date and/or time of an object in an application 148 such as, but not limited to, the clock application or calendar application.
  • the widget 606 is an element of the GUI which provides management of user configurable
  • a widget displays information which is manageable or changeable by the user in a window or box presented by the GUI.
  • the widget provides a single interaction point for the manipulation of a particular type of data. All applications 148 on the device 100 which allow input or manipulation of the particular type of data invoke the same widget. For example, each application 148 which allows the user to manipulate date and time for data objects or items may utilize the same date and time selection widget. Widgets are building blocks which, when called by an application 148, process and manage available interactions with the particular type of data.
  • the widget 606 is displayed in response to a predetermined interaction with the touch-sensitive display 118.
  • Such a predetermined interaction can be, but is not limited to, a user input for invoking or displaying the widget 606, a user input received in response to a prompt, and a user input directed to launching an application 148.
  • the widget 606 occupies only a portion of the screen 601 in the shown embodiment.
  • the widget 606 has a number of selectable fields each having a predefined user interface area indicated individually by references 608a, 608b and 608c.
  • the fields define a date and comprise a month field, day field and year field having values of "4", "24" and "2009” respectively (i.e., April 24, 2009). While the month field is numeric in the shown embodiment, in other embodiments the month field may be the month name.
  • the day of week (e.g., "Wed") may be included in addition to or instead of the numeric day field.
  • the fields may define a date and a time.
  • the fields may comprise a month field, day field, year field, hour field and minute field.
  • the fields may further comprise a day of week field, for example as the leading or first field, an AM/PM indicator, for example as the terminal or last field, or both.
  • an AM/PM indicator is not required and so may be eliminated.
  • the fields may define a time.
  • the fields may comprise an hour field and minute field.
  • the predefined user interface areas 608a-c of the selectable fields are shown using ghost outline to indicate that the field boundaries are hidden.
  • the boundaries of the predefined user interface areas 608a-c are typically not displayed in practice, but are shown in Figures 6A and 6B for the purpose of explanation.
  • Figure 6A shows the widget 606 when none of the fields are selected; however, in some embodiments one of the fields is always selected.
  • a default field may be selected automatically.
  • Fields in the widget 606 can be selected by corresponding interaction with the touch-sensitive display 118. For example, touching the predefined user interface area 608a, 608b or 608c associated with a respective field will select that field.
  • an onscreen position indicator also known as the "caret” or "focus" 620 is moved to the selected field.
  • the onscreen position indicator changes the appearance of the selected field to provide a visual indication of which field is currently selected.
  • the onscreen position indicator 620 may change the background colour of the selected field, text colour of the selected field or both.
  • the onscreen position indicator 620 causes the background colour of the selected field to be blue and the text colour of the selected field to be white.
  • the background colour of an unselected field may be black and the text colour of an unselected field may be white.
  • the background colour may be white and the text colour may be black when a field is unselected. It will be understood that the present disclosure is not limited to any colour scheme used for fields of the widget 606 to show its status as selected or unselected.
  • a touchscreen gesture is a predetermined touch gesture performed by touching the touch-sensitive display 118 in a predetermined manner, typically using a finger.
  • the predetermined touch gesture can be performed at any location on the touch-sensitive display 118.
  • the initial contact point of the predetermined touch gesture must not be at a location of a selectable field other than the currently selected field, or the touch event may select that other field and the predetermined touch gesture will be associated with that other field.
  • two distinct touch events may be required: an initial selection event in which a field of the widget 606 is selected, and a predetermined touch gesture performed while a field in the widget 606 is selected.
  • Two distinct touch events assist in resolving ambiguity between touch events on the touch-sensitive display 118.
  • the predetermined touch gesture may be a movement in a predetermined direction, i.e., a touch event having a centroid which moves during the touch event by an amount which exceeds a predetermined distance (typically measured in displayed pixels).
  • the vertical movement relative to the screen orientation of the GUI causes the value of the selected field to be changed when the distance of that movement exceeds the predetermined distance.
  • the predetermined distance is used to debounce touch events to prevent small inadvertent movements of the centroid of the touch event from causing the value of the selected field to be changed.
  • the predetermined distance may be quite small (e.g., a few pixels) and could be a user configurable parameter. In other embodiments, the predetermined distance could be omitted.
  • an upward movement of the centroid of the touch event moves or advances the value of the selected field forward through a sequential list of values for the field
  • a downward movement of the centroid of the touch event moves the value of the selected field backward through the sequential list of values for the field.
  • the effect of upward and downward movement may be switched in other embodiments.
  • the predetermined touch gesture may comprise a horizontal movement as well as a vertical movement provided the amount of vertical movement exceeds the predetermined distance. Accordingly, the predetermined movement could be vertical movement (i.e., an up or down movement) or a diagonal movement (i.e., an up-right, down-right, up-left or down- left movement).
  • the predetermined movement may be strictly a vertical movement, i.e., an up or down movement.
  • Touch data reported by the touch-sensitive display 118 may be analyzed to determine whether the horizontal component of the movement is less than a predetermined threshold. When the horizontal component is less than the predetermined threshold, the movement is considered vertical. When the horizontal component is more than the predetermined threshold, the movement is not considered vertical.
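The debouncing and vertical-classification rules above can be sketched as follows. This is a minimal illustration in Python: the function name `classify_movement` and both threshold values are assumptions for the sketch, not prescribed by the disclosure.

```python
# Illustrative constants; the disclosure leaves both values open.
MIN_DISTANCE_PX = 4    # "predetermined distance" used to debounce
H_THRESHOLD_PX = 12    # max horizontal component for a "vertical" move

def classify_movement(start, end,
                      min_distance=MIN_DISTANCE_PX,
                      h_threshold=H_THRESHOLD_PX):
    """Return 'up', 'down', or None for a centroid movement.

    The movement is ignored (debounced) unless its vertical extent
    exceeds min_distance, and it is only treated as vertical when the
    horizontal component stays under h_threshold.
    """
    dx = end[0] - start[0]
    dy = end[1] - start[1]
    if abs(dy) <= min_distance:   # too small: inadvertent jitter
        return None
    if abs(dx) >= h_threshold:    # too much sideways drift
        return None
    # Screen coordinates grow downward, so negative dy is an upward move.
    return 'up' if dy < 0 else 'down'
```

A diagonal movement passes as long as its horizontal drift stays under the threshold, matching the up-right/down-left cases described above.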
  • the horizontal movement relative to the screen orientation of the GUI causes the value of the selected field to be changed when the distance of that movement exceeds the predetermined distance.
  • a leftward movement may move or advance the value of the selected field forward through the sequential list of values for the field
  • a rightward movement may move the value of the selected field backward through the sequential list of values for the field.
  • the touch gesture may comprise a vertical movement as well as a horizontal movement provided the amount of the horizontal movement exceeds the predetermined distance.
  • the predetermined movement may be strictly a horizontal movement, i.e., a left or right movement.
  • the predetermined touch gesture may comprise a number of movements and the movement of the touch event is evaluated during the event and is evaluated with respect to an initial contact point (e.g., centroid) of the touch event.
  • when a first movement of the centroid of the touch event relative to the initial contact point exceeds the predetermined distance, the value of the selected field is changed accordingly. If a second movement of the centroid relative to the initial contact point which exceeds the predetermined distance is detected during the same touch event, the value is again changed accordingly. This may occur regardless of whether the second movement is in the same direction as or a different direction from the first movement.
  • the amount by which the value of the selected field is moved through the sequential list is proportional to the distance that the centroid of the touch event has moved relative to the initial contact point.
  • the number of positions in the sequential list that the value is moved may be
  • the predetermined touch gesture may also be a swipe gesture. Unlike the movements described above, swipe gestures are evaluated after the event has ended. Swipe gestures have a single direction and do not comprise a number of movements. The direction of the swipe gesture is evaluated with respect to an initial contact point of the touch event at which the finger makes contact with the touch-sensitive display 118 and a terminal or ending contact point at which the finger is lifted from the touch-sensitive display 118.
  • swipe gestures include a horizontal swipe gesture and vertical swipe gesture.
  • a horizontal swipe gesture typically comprises an initial contact with the touch-sensitive display 118 towards its left or right edge to initialize the gesture, followed by a horizontal movement of the point of contact from the location of the initial contact to the opposite edge while maintaining continuous contact with the touch-sensitive display 118, and a breaking of the contact at the opposite edge of the touch-sensitive display 118 to complete the horizontal swipe gesture.
  • a vertical swipe gesture typically comprises an initial contact with the touch-sensitive display 118 towards its top or bottom edge to initialize the gesture, followed by a vertical movement of the point of contact from the location of the initial contact to the opposite edge while maintaining continuous contact with the touch-sensitive display 118, and a breaking of the contact at the opposite edge of the touch-sensitive display 118 to complete the vertical swipe gesture.
  • swipe gestures can be of various lengths, can be initiated in various places on the touch-sensitive display 118, and need not span the full dimension of the touch-sensitive display 118.
  • breaking contact of a swipe can be gradual, in that contact pressure on the touch-sensitive display 118 is gradually reduced while the swipe gesture is still underway.
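Because a swipe is evaluated only after the touch event ends, its direction reduces to a comparison of the initial and terminal contact points. The sketch below is an illustration only; the dominant-axis rule it uses is one plausible choice, not mandated by the disclosure.

```python
def swipe_direction(initial, terminal):
    """Return the dominant direction of a completed swipe gesture.

    Evaluated from the initial contact point to the point where the
    finger (or stylus) was lifted. Returns one of 'up', 'down',
    'left', 'right'.
    """
    dx = terminal[0] - initial[0]
    dy = terminal[1] - initial[1]
    if abs(dy) >= abs(dx):
        # Vertical swipe; screen y grows downward.
        return 'up' if dy < 0 else 'down'
    return 'left' if dx < 0 else 'right'
```

Swipes of any length and starting position are handled the same way, consistent with the point above that a swipe need not span the full dimension of the display.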
  • While interaction with the touch-sensitive display 118 is described in the context of fingers of a device user, this is for purposes of convenience only. It will be appreciated that a stylus or other object may be used for interacting with the touch-sensitive display 118 depending on the type of touchscreen display 210.
  • the value of a selected field is advanced or moved forwards through an ordered or sequential list of values of the field in response to an upward swipe gesture at any location on the touch-sensitive display 118.
  • An upward swipe gesture starts at a point on the touch-sensitive display 118 (e.g., near the bottom edge) and moves upwards from the point of view of the person conducting the swipe.
  • the value of a selected field is reversed or moved backwards through the sequential list of predetermined values of the field in response to a downward swipe gesture at any location on the touch-sensitive display 118.
  • a downward swipe gesture starts at a point on the touch-sensitive display 118 (e.g., near the top edge) and moves downwards from the point of view of the person conducting the swipe.
  • the sequential list may be configured such that scrolling past the end of the sequential list wraps around to its beginning, and vice versa. Wrapping may provide more efficient navigation and interaction with the fields for changing their values. In other embodiments, the fields may not wrap; instead, scrolling stops at the beginning or end of the sequential list. In some embodiments, whether a field wraps may be a configurable parameter.
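Wrapping versus stopping at the ends of the sequential list comes down to simple index arithmetic. The helper below is a hypothetical sketch (the function name and the month list are illustrative):

```python
def move_through_list(values, current_index, steps, wrap=True):
    """Move a number of positions through a sequential list of values.

    Positive steps move forward, negative steps move backward. When
    wrap is True the index wraps around either end of the list;
    otherwise scrolling stops at the first or last value.
    """
    n = len(values)
    if wrap:
        return values[(current_index + steps) % n]
    return values[max(0, min(n - 1, current_index + steps))]

# Example sequential list for a numeric month field (1..12).
MONTHS = list(range(1, 13))
```

With wrapping enabled, advancing past December (index 11) yields January; with wrapping disabled, the same gesture leaves the value clamped at December.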
  • the amount of scrolling is proportional to the distance of the swipe gesture. For example, a long swipe gesture may move through several values in the sequential list, whereas a shorter swipe gesture may move through fewer values, possibly only one.
  • proportionality is controlled by a multiplier which may be user configurable allowing the user to control the effect of finger movement on scrolling.
  • different multipliers may be used in different embodiments.
  • the ratio of scrolling to the number of swipe gestures is 1:1. That is, the value of the selected field is moved through the sequential list or changed by one for each swipe gesture.
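The proportional scrolling with a user-configurable multiplier described above might look like the following sketch. The multiplier value and the at-least-one-position rule are assumptions made for illustration.

```python
DEFAULT_MULTIPLIER = 0.1   # positions per pixel; assumed value

def steps_for_swipe(distance_px, multiplier=DEFAULT_MULTIPLIER):
    """Number of positions to scroll for a swipe of the given length.

    The amount of scrolling is proportional to the swipe distance,
    scaled by a (possibly user-configurable) multiplier. At least one
    position is moved for any recognized swipe, so a short swipe still
    has an effect.
    """
    return max(1, int(distance_px * multiplier))
```

Raising the multiplier makes each pixel of finger movement scroll further, which is the user-facing effect the configurable multiplier is meant to control.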
  • a touch to select the desired user interface area 608a, 608b, or 608c is also the initial contact of the swipe gesture, such that the swipe gesture begins within the desired user interface area 608a, 608b, or 608c and ends outside the desired user interface area 608a, 608b, or 608c. This can be contrasted with conventional precision targeting which requires a gesture to be performed over the display element to be changed.
  • the sequential list of predetermined values for a field is context- dependent. That is, the sequential list of predetermined values for a field depends on the definition of the field. For example, when the field is a month field, the sequential list of predetermined values is defined by the months of the year. When the field is the day of week field, the sequential list of predetermined values is defined by the days of the week. When the field is the day field, the sequential list of predetermined values is defined by the days of the month (which will depend on the value of the month field).
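A context-dependent sequential list can be derived from the field's definition, as the point above describes. The sketch below uses Python's standard `calendar` module; the field names and function name are illustrative assumptions.

```python
import calendar

def sequential_list_for(field, month=None, year=2009):
    """Return the sequential list of predetermined values for a field.

    The list is context-dependent: month and day-of-week fields have
    fixed lists, while the day field depends on the value of the month
    field (and the year, for February in leap years).
    """
    if field == 'month':
        return list(range(1, 13))
    if field == 'day_of_week':
        return list(calendar.day_abbr)        # ['Mon', ..., 'Sun']
    if field == 'day':
        days_in_month = calendar.monthrange(year, month)[1]
        return list(range(1, days_in_month + 1))
    raise ValueError(f'unknown field: {field}')
```

For the April 24, 2009 example in Figures 6A and 6B, the day field's list would run 1 through 30; changing the month field to February would shrink it to 1 through 28.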
  • the current value is shown in bold or large font or type.
  • the values before and after the current value within the sequential list of predetermined values for the field are also shown.
  • the value after the current value of the field is shown below the current value, whereas the value before the current value of the field is shown above the current value.
  • This provides a visual indication of the type of interaction that is required to change the value of a selected field, for example a direction of a touch gesture required to move forward or backward through the sequential list of values.
  • the current value is "4" and the value before it is "3" and the value after it is "5".
  • horizontal swipe gestures may be used to move between fields in the widget 606 thereby changing the selected field.
  • a leftward swipe gesture may be used to move leftward through the fields of the widget 606.
  • a leftward swipe gesture starts at a point on the touch-sensitive display 118 (e.g., near the right edge) and moves leftwards.
  • a rightward swipe gesture may be used to move rightwards through the fields of the widget 606.
  • a rightward swipe gesture starts at a point on the touch-sensitive display 118 (e.g., near the left edge) and moves rightwards.
  • referring to Figure 7, an alternate embodiment of a user interface screen 603 is shown.
  • directional arrows 622 and 624 are provided as part of the GUI above and below the selected field.
  • An up-arrow 622 is provided above the selected field and a down-arrow 624 is provided below the selected field in this embodiment.
  • the directional arrows 622 and 624 are not part of the predefined user interface areas 608.
  • Figure 8 shows an alternate embodiment of a user interface screen 605 in which the directional arrows 622 and 624 are part of the predefined user interface areas 608. In this embodiment, the values before and after the current value of the selected field are not shown.
  • pressing the touch-sensitive display 118 at the location of the up-arrow 622 actuates the actuator 120 and moves the value of the field forward through the sequential list of values for the field
  • pressing the touch-sensitive display 118 at the location of the down-arrow 624 actuates the actuator 120 and moves the value of the field backwards through the set of predetermined values for the field.
  • pressing or "clicking" the touch-sensitive display 118 at the location of the up-arrow 622 moves the value of the field forward through the sequential list by one value (e.g., increments the current value of the selected field by one), and pressing or "clicking" the touch- sensitive display 118 at the location of the down-arrow 624 moves the value of the field backward through the sequential list by one value (e.g., decrements the current value of the selected field by one).
  • touching the up-arrow 622 or down-arrow 624 without pressing the touch-sensitive display 118 changes the value of the selected field by scrolling forwards or backwards as described above.
  • the touch event at the up-arrow 622 or down-arrow 624 must exceed a predetermined duration to change the value of the selected field. This requires a user to "hover" over the up-arrow 622 or down-arrow 624 to cause a corresponding change in the value of the selected field. The duration requirement may reduce erroneous inputs that change the value of the selected field.
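The hover requirement above amounts to a dwell-time check. A minimal sketch, assuming a 0.5-second predetermined duration (the disclosure does not specify one):

```python
HOVER_DURATION_S = 0.5   # assumed "predetermined duration"

def hover_changes_value(touch_start_s, touch_now_s,
                        duration=HOVER_DURATION_S):
    """True once a touch on an arrow has lasted long enough to count.

    Requiring the touch event to exceed a predetermined duration means
    a brief accidental brush over the arrow does not change the value
    of the selected field.
    """
    return (touch_now_s - touch_start_s) > duration
```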
  • the user interface solution for the fields described above is sometimes referred to as a "spin dial" or "spin box".
  • the widget 606 of Figures 6A to 8 has three spin boxes: the month field, the day field, and the year field.
  • the teachings above can be applied to any number of spin boxes which can be provided in a widget or elsewhere in the GUI.
  • the spin boxes may be managed by a spin box manager (not shown) which is part of a user interface (UI) manager (not shown) for the device 100.
  • the user interface manager renders and displays the GUI of the device 100 in accordance with instructions of the operating system 146 and programs 148.
  • the spin box manager enforces a common appearance across the controlled spin boxes, e.g., height, visible rows, and padding.
  • Figure 9 shows a screen capture of a new appointment user interface screen 607 for a calendar application in accordance with one example embodiment of the present disclosure.
  • the fields of the widget 606 are defined by references 609a, 609b, 609c, and 609d.
  • the fields define a date and comprise a day of week field, month field, day field and year field having values of "Tue" or "Tuesday", "Aug" or "August", "11" and "2009" respectively (i.e., Tuesday, August 11, 2009).
  • the value before the current value (e.g. "Mon” or “Monday”) in the sequential list is provided above the current value, whereas the value after the current value (e.g. "Wed” or “Wednesday”) in the sequential list is provided below the current value.
  • An onscreen position indicator 621 is used to show the selected field as described above; however, the values in the sequential list before and after the current value are de-emphasized by the onscreen position indicator 621 relative to the current value.
  • the onscreen position indicator 621 is smaller (e.g., thinner) over the before and after values relative to the current value, and has a colour gradient which diminishes in colour intensity (becomes transparent) in the vertical direction moving away from the current value.
  • the combination of user interface features in Figure 9 provides a visual indication of how interaction with the touch-sensitive display 118 can change the value of the selected field, i.e., that an upward or downward swipe will scroll backwards or forwards, respectively.
  • the present disclosure is primarily directed to a widget for date fields, time fields or date and time fields
  • teachings of the present disclosure can be applied to provide an efficient and user friendly widget or similar user interface element for changing the value of a field from a sequential list of predetermined values, or selecting an item from a sequential list.
  • Examples of sequential lists include numbers, dates, words, names, graphical symbols or icons, or any combination of these. While the examples of sequential lists described herein are text values, the sequential lists need not be limited to text.
  • date field, time fields and date and time fields are associated with a clock or calendar application and that changes in the value of at least some of the subfields of these fields may trigger changes in the values of other subfields in accordance with predetermined logical rules governing the clock and calendar.
  • referring to Figure 10, an example process 400 for a method of controlling touch input on a touch-sensitive display of a portable electronic device in accordance with one embodiment of the present disclosure will be described.
  • the steps of Figure 10 may be carried out by routines or subroutines of software executed by, for example, the processor 102.
  • the coding of software for carrying out such steps is well within the scope of a person of ordinary skill in the art given the present disclosure.
  • the widget 606 is rendered by a UI manager (not shown) and displayed on the display 112 in response to predetermined interaction with the touch-sensitive display 118.
  • the interaction may be, but is not limited to, a finger or stylus touching the touch-sensitive display 118 at the location of a user interface element having an invokable widget 606 associated with it, or pressing the touch-sensitive display 118 at the location of the user interface element having an invokable widget 606.
  • the predetermined interaction may involve selecting a corresponding menu option from a corresponding menu to invoke the widget 606.
  • the widget 606 comprises at least one field but typically a number of fields which may be spin boxes. The widget 606 is displayed on the user interface screen from which it was invoked and occupies a portion of the user interface screen.
  • a field in the widget 606 is selected.
  • the field is selected in response to predetermined interaction with the touch-sensitive display 118; however, the selected field may be a default field selected automatically.
  • An example of such predetermined interaction is, but is not limited to, a finger or stylus touching the touch-sensitive display 118 at the location of the field.
  • in step 406, the value of the selected field is changed in response to a predetermined touch gesture at any location on the touch-sensitive display.
  • the original value of the selected field and the changed value of the selected field are stored, typically in RAM 108.
  • the predetermined touch gesture may be a movement in a predetermined direction, i.e., a touch event having a centroid which moves during the touch event by an amount which exceeds a predetermined distance (typically measured in displayed pixels).
  • the predetermined touch gesture is a vertical movement which exceeds the predetermined distance.
  • an upward movement of the centroid of the touch event moves or advances the value of the selected field forward through the sequential list of values for the field
  • a downward movement of the centroid of the touch event moves the value of the selected field backward through the sequential list of values for the field.
  • the effect of upward and downward movement may be switched in other embodiments.
  • a swipe gesture in a first direction at any location on the touch-sensitive display scrolls forward through a sequential list of values for the field to select a new value for the field.
  • a swipe gesture in a second direction at any location on the touch-sensitive display scrolls backward through the sequential list of values for the field to select a new value for the field.
  • the swipe gesture in the first direction may be an upward swipe gesture and the swipe gesture in the second direction may be a downward swipe gesture in some embodiments.
  • the amount by which the value of the selected field is moved through the sequential list is proportional to the distance that the centroid of the touch event has moved relative to the initial contact point.
  • in step 408, the widget 606 and possibly the user interface screen are re-rendered and re-displayed on the display 112 in accordance with the changed value of the selected field.
  • input accepting or rejecting a change in the value of the fields of the widget 606 may be required (step 410).
  • the changed value(s) are stored in the memory 110 of the device 100 (step 412) and the widget 606 is removed from the touch-sensitive display 118.
  • the user interface screen is re-rendered and re-displayed accordingly.
  • changes may be accepted by activating or "clicking" an "Ok” virtual button in the widget 606, for example, by pressing the touch-sensitive display 118 at the location of the "Ok” virtual button in the widget 606.
  • any changes may be rejected by activating or "clicking" a "Cancel" virtual button in the widget 606, for example, by pressing the touch-sensitive display 118 at the location of the "Cancel" virtual button in the widget 606.
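The steps of process 400 (display the widget, select a field, change its value via a gesture, then accept or discard the change) can be tied together in a small sketch. The `DateWidget` class below is a simplified illustration with fixed wrapping lists, not the disclosed implementation; all names are assumptions.

```python
class DateWidget:
    """Simplified sketch of process 400 for a multi-field widget."""

    def __init__(self, fields):
        # fields: dict of name -> (sequential list of values, index)
        self._fields = {n: [list(v), i] for n, (v, i) in fields.items()}
        self._original = {n: i for n, (v, i) in fields.items()}  # for cancel
        self.selected = None

    def select(self, name):                       # step 404
        self.selected = name

    def apply_gesture(self, direction, steps=1):  # step 406
        vals, i = self._fields[self.selected]
        delta = steps if direction == 'up' else -steps
        self._fields[self.selected][1] = (i + delta) % len(vals)  # wraps

    def value(self, name):
        vals, i = self._fields[name]
        return vals[i]

    def accept(self):                             # steps 410-412 ("Ok")
        self._original = {n: i for n, (v, i) in self._fields.items()}

    def cancel(self):                             # "Cancel": restore values
        for n, i in self._original.items():
            self._fields[n][1] = i
```

Storing the original indexes on construction and on accept is what lets "Cancel" restore the pre-change values, mirroring the original/changed value storage described at step 406.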
  • predetermined touch gestures can be used to select different fields in the widget 606, for example, to scroll or move between fields in the widget 606.
  • a leftward swipe gesture at any location on the touch-sensitive display scrolls leftward through the fields in the widget to select a new field.
  • a rightward swipe gesture at any location on the touch-sensitive display scrolls rightward through the fields in the widget to select a new field.
  • a virtual keyboard or keypad may be invoked via predetermined interaction with the touch-sensitive display 118 while a field in a widget is selected.
  • Figure 11A shows a virtual keyboard in accordance with another example embodiment.
  • the shown virtual keyboard is a reduced keyboard provided by a portrait screen orientation; however, a full keyboard could be used in a landscape screen orientation or in a portrait screen orientation in a different embodiment.
  • Figure 11B shows a virtual keypad in accordance with another example embodiment.
  • the shown virtual keypad is a numeric keypad provided by a portrait screen orientation.
  • the virtual keyboard of Figure 11A or virtual keypad of Figure 11B is selected in accordance with a data type of the field which is selected when the virtual keyboard or virtual keypad is invoked.
  • the virtual keypad is invoked when the selected field is a numeric field and the virtual keyboard is invoked when the selected field is an alphabetic or alphanumeric field.
  • the virtual keyboard or keypad may allow custom entry of values in the widget while taking advantage of its scrolling (or spinning) functionality, which seeks to provide a more efficient and easy-to-use interface while potentially reducing the number of erroneous inputs.
  • Figures 12A to 12C are screen captures of a widget for the user interface screen of Figure 11A or 11B.
  • the virtual keyboard or keypad may be used to input values in a text entry field for use in the selected field in the widget.
  • the input in the text entry field does not need to match the sequential list of values associated with that field.
  • input in the text entry field which does not match a data type and/or data format of the selected field is not accepted by the virtual keyboard or keypad (i.e., the input is rejected).
  • an alphabetic character cannot be entered into a numeric field.
  • entry in the text entry field is automatically populated into the selected field.
  • the values of the sequential list are dynamically changed in accordance with the current value of the selected field. Accordingly, the values before and after the current value shown in the widget of Figures 12A to 12C are dynamically determined based on the current value of the selected field. As more characters are input in the widget, from "2" to "20" to "200", the values in the sequential list are dynamically changed and the displayed values before and after the selected field are changed accordingly from "1" and "3", to "19" and "21", to "199" and "201".
  • the values in the sequential list define a numeric series in which values differ only by one; this is a function of the particular type of field, i.e., date fields.
  • the difference between values in the sequential list could be different, for example, 5, 10 or 15, and be unequal between values in the sequential list.
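The dynamically determined before/after values can be read off the current value's position in the sequential list, whatever the step between values (one, five, fifteen, or unequal). The helper below is hypothetical:

```python
def neighbour_values(current, values):
    """Values shown above and below the current value in the widget.

    The displayed neighbours are determined dynamically from the
    current value's position in the sequential list; the list need not
    be a step-of-one series, and a non-wrapping list has no neighbour
    past either end (None here).
    """
    i = values.index(current)
    before = values[i - 1] if i > 0 else None
    after = values[i + 1] if i < len(values) - 1 else None
    return before, after
```

For a minute field listing 0 through 59, a current value of "20" yields "19" above and "21" below; for a quarter-hour list like [0, 15, 30, 45], the neighbours of 30 are 15 and 45.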
  • the size of each field is fixed in width according to the maximum number of characters or digits.
  • the value of each field may be center-aligned within each field. The number of characters or digits is fixed according to the data type of the field.
  • At least some fields may have a maximum and minimum value.
  • modification of a minute field of a time widget in accordance with one example embodiment of the present disclosure will be described. Firstly, the minute field is selected and a virtual keypad is invoked as shown in Figure 13A. Next, the user enters the value "2" in the entry field (Figure 13B), followed by a second "2" to create the custom value of "22" (Figure 13C). In the shown embodiment, the values shown before and after the custom value are the adjacent values in the sequential list. Using the virtual keypad of the time widget, any number between the minimum and maximum values of the field may be entered.
  • if a predetermined touch gesture to change the value of the selected field is performed rather than clicking, the value of the selected field is changed in accordance with the predetermined touch gesture (e.g., in accordance with a direction of the touch event) as described above rather than accepting the value "22", and the customized value is discarded. If no input is detected within a predetermined duration of inputting the custom value, the widget times out and the customized value is discarded.
  • the teachings in regards to widgets provided by the present disclosure may also be used in the context of non-touchscreen devices where the navigation function provided by the touch-sensitive display 118 is provided by an alternate navigational device such as a trackball or scroll wheel. In such cases, scrolling or "spinning" is provided by movement of the trackball or scroll wheel in a corresponding direction when a field in the widget is selected.
  • the present disclosure is also directed to a pre-recorded storage device or other similar computer readable medium including program instructions stored thereon for performing the methods described herein.
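The field behaviour described in the bullets above — a sequential list whose increments may be unequal, clamping to the field's minimum and maximum, virtual-keypad entry of a custom value such as "22", and discarding of a pending custom value on a spin gesture or timeout — can be sketched roughly as follows. The `SpinField` class and all of its method names are hypothetical illustrations, not part of the patent.

```python
class SpinField:
    """Sketch of a widget field backed by a sequential list of values.

    The gaps between values may be unequal (e.g. 0, 15, 30, 45), and a
    custom value typed on the virtual keypad is clamped to the field's
    minimum and maximum (the endpoints of the list).
    """

    def __init__(self, values, max_digits=2):
        self.values = sorted(values)   # sequential list of preset values
        self.index = 0                 # currently selected value
        self.max_digits = max_digits   # field width is fixed by this
        self.pending = ""              # digits typed but not yet accepted

    @property
    def value(self):
        return self.values[self.index]

    def spin(self, direction):
        """Scroll through the sequential list (+1 up, -1 down).

        Performing a spin gesture while a custom value is pending
        discards the pending value, as described above.
        """
        self.pending = ""
        self.index = max(0, min(len(self.values) - 1, self.index + direction))
        return self.value

    def type_digit(self, digit):
        """Append one virtual-keypad digit to the pending custom value."""
        if len(self.pending) < self.max_digits:
            self.pending += str(digit)

    def accept(self):
        """Commit the pending custom value, clamped to min/max, placing
        it between its neighbours in the sequential list."""
        if self.pending:
            custom = max(self.values[0], min(self.values[-1], int(self.pending)))
            if custom not in self.values:
                self.values.append(custom)
                self.values.sort()
            self.index = self.values.index(custom)
            self.pending = ""
        return self.value

    def timeout(self):
        """Discard the pending custom value when the widget times out."""
        self.pending = ""
```

For a minute field with presets 0, 15, 30 and 45, typing "2" then "2" and accepting yields 22, shown between its neighbours 15 and 30; spinning instead of accepting discards the pending "22".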

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention relates to a method of controlling touch input on a touch-sensitive display when a display element is active, and to a portable electronic device configured for the method. According to one embodiment, a method of controlling touch input on a touch-sensitive display of a portable electronic device is described, the method comprising: displaying a widget having at least one field in a user interface screen displayed on the touch-sensitive display; selecting the field in the widget in response to a predetermined interaction with the touch-sensitive display; changing the value of the selected field in accordance with a predetermined touch gesture at any location on the touch-sensitive display; and redisplaying the widget in the user interface screen with the changed value of the selected field.
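The four steps of the claimed method — display a widget with a field, select the field by a predetermined interaction, change its value by a touch gesture anywhere on the display, and redisplay — can be illustrated with a minimal sketch. The event representation, the field dictionary, and the up/down direction mapping below are assumptions made for illustration only; the patent does not prescribe them.

```python
def apply_gesture(value, values, direction):
    """Move the selected field's value through its sequential list
    according to the gesture direction ('up' or 'down')."""
    i = values.index(value)
    if direction == "up":
        i = min(len(values) - 1, i + 1)
    elif direction == "down":
        i = max(0, i - 1)
    return values[i]


def touch_controller(fields, events):
    """Process touch events for a widget.

    A ('tap', field_name) event selects a field; a ('swipe', direction)
    event anywhere on the display changes the selected field; after each
    event the widget's current state is yielded (the 'redisplay' step).
    """
    selected = None
    for kind, payload in events:
        if kind == "tap":
            selected = payload
        elif kind == "swipe" and selected is not None:
            f = fields[selected]
            f["value"] = apply_gesture(f["value"], f["values"], payload)
        yield {name: f["value"] for name, f in fields.items()}
```

Selecting the minute field and swiping up twice advances it from 0 to 15 to 30, while the unselected hour field is left unchanged, since the gesture applies only to the selected field regardless of where on the display it is performed.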
EP10821510.4A 2009-10-07 2010-10-07 Modification de la valeur d'un champ par geste tactile ou clavier virtuel Withdrawn EP2486472A4 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CA 2681879 CA2681879A1 (fr) 2009-10-07 2009-10-07 Methode de commande de l'entree des touches sur un ecran tactile lorsqu'un element de l'affichage est actif, et dispositif electronique portatif ainsi configure
PCT/CA2010/001560 WO2011041885A1 (fr) 2009-10-07 2010-10-07 Modification de la valeur d'un champ par geste tactile ou clavier virtuel

Publications (2)

Publication Number Publication Date
EP2486472A1 true EP2486472A1 (fr) 2012-08-15
EP2486472A4 EP2486472A4 (fr) 2016-01-06

Family

ID=43853561

Family Applications (1)

Application Number Title Priority Date Filing Date
EP10821510.4A Withdrawn EP2486472A4 (fr) 2009-10-07 2010-10-07 Modification de la valeur d'un champ par geste tactile ou clavier virtuel

Country Status (3)

Country Link
EP (1) EP2486472A4 (fr)
CA (1) CA2681879A1 (fr)
WO (1) WO2011041885A1 (fr)

Families Citing this family (42)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8225231B2 (en) 2005-08-30 2012-07-17 Microsoft Corporation Aggregation of PC settings
US8086275B2 (en) 2008-10-23 2011-12-27 Microsoft Corporation Alternative inputs of a mobile communications device
US8411046B2 (en) 2008-10-23 2013-04-02 Microsoft Corporation Column organization of content
US8175653B2 (en) 2009-03-30 2012-05-08 Microsoft Corporation Chromeless user interface
US8238876B2 (en) 2009-03-30 2012-08-07 Microsoft Corporation Notifications
US8836648B2 (en) 2009-05-27 2014-09-16 Microsoft Corporation Touch pull-in gesture
US20120159395A1 (en) 2010-12-20 2012-06-21 Microsoft Corporation Application-launching interface for multiple modes
US8689123B2 (en) 2010-12-23 2014-04-01 Microsoft Corporation Application reporting in an application-selectable user interface
US8612874B2 (en) 2010-12-23 2013-12-17 Microsoft Corporation Presenting an application change through a tile
US9423951B2 (en) 2010-12-31 2016-08-23 Microsoft Technology Licensing, Llc Content-based snap point
US9383917B2 (en) 2011-03-28 2016-07-05 Microsoft Technology Licensing, Llc Predictive tiling
US9104307B2 (en) 2011-05-27 2015-08-11 Microsoft Technology Licensing, Llc Multi-application environment
US9658766B2 (en) 2011-05-27 2017-05-23 Microsoft Technology Licensing, Llc Edge gesture
US9104440B2 (en) 2011-05-27 2015-08-11 Microsoft Technology Licensing, Llc Multi-application environment
US9158445B2 (en) 2011-05-27 2015-10-13 Microsoft Technology Licensing, Llc Managing an immersive interface in a multi-application immersive environment
US20120304132A1 (en) 2011-05-27 2012-11-29 Chaitanya Dev Sareen Switching back to a previously-interacted-with application
US8893033B2 (en) 2011-05-27 2014-11-18 Microsoft Corporation Application notifications
US8687023B2 (en) 2011-08-02 2014-04-01 Microsoft Corporation Cross-slide gesture to select and rearrange
US20130057587A1 (en) 2011-09-01 2013-03-07 Microsoft Corporation Arranging tiles
US10353566B2 (en) 2011-09-09 2019-07-16 Microsoft Technology Licensing, Llc Semantic zoom animations
US9557909B2 (en) 2011-09-09 2017-01-31 Microsoft Technology Licensing, Llc Semantic zoom linguistic helpers
US8922575B2 (en) 2011-09-09 2014-12-30 Microsoft Corporation Tile cache
US9244802B2 (en) 2011-09-10 2016-01-26 Microsoft Technology Licensing, Llc Resource user interface
US9146670B2 (en) 2011-09-10 2015-09-29 Microsoft Technology Licensing, Llc Progressively indicating new content in an application-selectable user interface
US8933952B2 (en) 2011-09-10 2015-01-13 Microsoft Corporation Pre-rendering new content for an application-selectable user interface
EP2584441A1 (fr) * 2011-10-18 2013-04-24 Research In Motion Limited Dispositif électronique et son procédé de contrôle
US8810535B2 (en) 2011-10-18 2014-08-19 Blackberry Limited Electronic device and method of controlling same
US9223472B2 (en) 2011-12-22 2015-12-29 Microsoft Technology Licensing, Llc Closing applications
US9128605B2 (en) 2012-02-16 2015-09-08 Microsoft Technology Licensing, Llc Thumbnail-image selection of applications
GB2511526A (en) 2013-03-06 2014-09-10 Ibm Interactor for a graphical object
US9450952B2 (en) 2013-05-29 2016-09-20 Microsoft Technology Licensing, Llc Live tiles without application-code execution
JP6141221B2 (ja) * 2013-07-31 2017-06-07 京セラドキュメントソリューションズ株式会社 数値入力装置及び電子機器
CN108279848B (zh) * 2014-03-13 2022-03-25 联想(北京)有限公司 一种显示方法及电子设备
CN105359094A (zh) 2014-04-04 2016-02-24 微软技术许可有限责任公司 可扩展应用表示
WO2015154276A1 (fr) 2014-04-10 2015-10-15 Microsoft Technology Licensing, Llc Couvercle coulissant pour dispositif informatique
KR102107275B1 (ko) 2014-04-10 2020-05-06 마이크로소프트 테크놀로지 라이센싱, 엘엘씨 컴퓨팅 디바이스에 대한 접이식 쉘 커버
US10592080B2 (en) 2014-07-31 2020-03-17 Microsoft Technology Licensing, Llc Assisted presentation of application windows
US10678412B2 (en) 2014-07-31 2020-06-09 Microsoft Technology Licensing, Llc Dynamic joint dividers for application windows
US10254942B2 (en) 2014-07-31 2019-04-09 Microsoft Technology Licensing, Llc Adaptive sizing and positioning of application windows
US10642365B2 (en) 2014-09-09 2020-05-05 Microsoft Technology Licensing, Llc Parametric inertia and APIs
WO2016065568A1 (fr) 2014-10-30 2016-05-06 Microsoft Technology Licensing, Llc Dispositif d'entrée à configurations multiples
US10486938B2 (en) 2016-10-28 2019-11-26 Otis Elevator Company Elevator service request using user device

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8479122B2 (en) * 2004-07-30 2013-07-02 Apple Inc. Gestures for touch sensitive input devices
WO2005008444A2 (fr) * 2003-07-14 2005-01-27 Matt Pallakoff Systeme et procede pour client multimedia portable
US20060184892A1 (en) * 2005-02-17 2006-08-17 Morris Robert P Method and system providing for the compact navigation of a tree structure
US8564543B2 (en) * 2006-09-11 2013-10-22 Apple Inc. Media player with imaged based browsing

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See references of WO2011041885A1 *

Also Published As

Publication number Publication date
CA2681879A1 (fr) 2011-04-07
EP2486472A4 (fr) 2016-01-06
WO2011041885A1 (fr) 2011-04-14

Similar Documents

Publication Publication Date Title
US20110080351A1 (en) method of controlling touch input on a touch-sensitive display when a display element is active and a portable electronic device configured for the same
EP2486472A1 (fr) Modification de la valeur d'un champ par geste tactile ou clavier virtuel
CA2761700C (fr) Procede et appareil pour un afficheur tactile
CA2667911C (fr) Dispositif electronique portatif comprenant un ecran tactile et methode de commande connexe
EP2338102B1 (fr) Appareil électronique portable et procédé de commande de celui-ci
US8689146B2 (en) Electronic device and method of displaying information in response to input
US8531417B2 (en) Location of a touch-sensitive control method and apparatus
US20110179381A1 (en) Portable electronic device and method of controlling same
EP2508972A2 (fr) Dispositif électronique portable et son procédé de commande
EP2175359A2 (fr) Dispositif électronique doté d'un écran tactile sensible à l'état
US20100085313A1 (en) Portable electronic device and method of secondary character rendering and entry
US8121652B2 (en) Portable electronic device including touchscreen and method of controlling the portable electronic device
EP2175355A1 (fr) Dispositif électronique portable et procédé pour le rendu et l'entrée de caractères secondaires
EP2105824A1 (fr) Affichage d'écran tactile pour dispositif électronique et procédé pour la détermination de l'interaction tactile avec celui-ci
CA2686769C (fr) Dispositif electronique portatif et methode de commande connexe
KR20120093056A (ko) 전자 디바이스 및 이의 제어 방법
US20120139845A1 (en) Soft key with main function and logically related sub-functions for touch screen device
KR20110133450A (ko) 휴대용 전자 디바이스 및 이의 제어 방법
KR20110105718A (ko) 휴대용 전자 디바이스 및 그 제어 방법
JP5667301B2 (ja) 携帯用電子デバイスおよびそれを制御する方法
CA2749244C (fr) Methode de localisation d'une commande sur un ecran tactile et appareil connexe
CA2768287C (fr) Dispositif electronique et procede d'affichage de l'information selon les elements saisis
EP2348392A1 (fr) Dispositif électronique portable et son procédé de contrôle
CA2706055C (fr) Systeme et methode d'application d'un algorithme de prediction de texte a un clavier virtuel
EP2199898B1 (fr) Dispositif électronique portable doté d'un écran tactile et son procédé de commande

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20120224

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

DAX Request for extension of the european patent (deleted)
RAP1 Party data changed (applicant data changed or rights of an application transferred)

Owner name: BLACKBERRY LIMITED

RAP1 Party data changed (applicant data changed or rights of an application transferred)

Owner name: BLACKBERRY LIMITED

RA4 Supplementary search report drawn up and despatched (corrected)

Effective date: 20151208

RIC1 Information provided on ipc code assigned before grant

Ipc: H04W 88/02 20090101ALI20151202BHEP

Ipc: G06F 3/0488 20130101ALI20151202BHEP

Ipc: G09G 5/34 20060101ALI20151202BHEP

Ipc: G06F 15/02 20060101ALI20151202BHEP

Ipc: G06F 3/0485 20130101ALI20151202BHEP

Ipc: G06F 3/041 20060101AFI20151202BHEP

Ipc: G06F 3/0484 20130101ALI20151202BHEP

Ipc: G06F 3/048 20060101ALI20151202BHEP

17Q First examination report despatched

Effective date: 20180207

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20180619

RIC1 Information provided on ipc code assigned before grant

Ipc: G09G 5/34 20060101ALI20151202BHEP

Ipc: G06F 3/0485 20130101ALI20151202BHEP

Ipc: G06F 3/0484 20130101ALI20151202BHEP

Ipc: G06F 3/048 20130101ALI20151202BHEP

Ipc: G06F 3/0488 20130101ALI20151202BHEP

Ipc: H04W 88/02 20090101ALI20151202BHEP

Ipc: G06F 15/02 20060101ALI20151202BHEP

Ipc: G06F 3/041 20060101AFI20151202BHEP

RIC1 Information provided on ipc code assigned before grant

Ipc: G06F 3/041 20060101AFI20151202BHEP

Ipc: G06F 3/048 20130101ALI20151202BHEP

Ipc: G06F 15/02 20060101ALI20151202BHEP

Ipc: H04W 88/02 20090101ALI20151202BHEP

Ipc: G06F 3/0488 20130101ALI20151202BHEP

Ipc: G09G 5/34 20060101ALI20151202BHEP

Ipc: G06F 3/0484 20130101ALI20151202BHEP

Ipc: G06F 3/0485 20130101ALI20151202BHEP