EP2521965A1 - Apparatus and method for conditionally enabling or disabling soft buttons - Google Patents

Apparatus and method for conditionally enabling or disabling soft buttons

Info

Publication number
EP2521965A1
Authority
EP
European Patent Office
Prior art keywords
button
time
input
buttons
enabled
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
EP10700014A
Other languages
German (de)
English (en)
French (fr)
Inventor
Kenneth L. Kocienda
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Apple Inc
Original Assignee
Apple Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Apple Inc filed Critical Apple Inc
Publication of EP2521965A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0416 Control or interface arrangements specially adapted for digitisers
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04886 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00 Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/26 Power supply means, e.g. regulation thereof
    • G06F1/32 Means for saving power
    • G06F1/3203 Power management, i.e. event-based initiation of a power-saving mode
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/02 Input arrangements using manually operated switches, e.g. using keyboards or dials
    • G06F3/023 Arrangements for converting discrete items of information into a coded form, e.g. arrangements for interpreting keyboard generated codes as alphanumeric codes, operand codes or instruction codes
    • G06F3/0233 Character input methods
    • G06F3/0236 Character input methods using selection techniques to select from displayed items
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0412 Digitisers structurally integrated in a display

Definitions

  • the device is a desktop computer.
  • the device is portable (e.g., a notebook computer, tablet computer, or handheld device).
  • the device has a graphical user interface (GUI), one or more processors, memory and one or more modules, programs or sets of instructions stored in the memory for performing multiple functions.
  • GUI (graphical user interface)
  • the user interacts with the GUI primarily through finger contacts and gestures on the touch-sensitive display.
  • the functions may include various types of editing, drawing, presenting, word processing, website creating, disk authoring, spreadsheet making, game playing, telephoning, video conferencing, e-mailing, instant messaging, workout support, digital photographing, digital videoing, web browsing, digital music playing, and/or digital video playing.
  • Executable instructions for performing these functions may be included in a computer readable storage medium or other computer program product configured for execution by one or more processors.
  • a method is performed at a multifunction device with a touch screen display.
  • the method includes displaying a soft keyboard having a plurality of buttons including a plurality of unconditionally enabled buttons and one or more conditionally enabled buttons, detecting a first input with a first button at a first time, and responding to detecting the first input by activating the first button.
  • the method further includes detecting a second input with a second button at a second time, and responding to detection of the second input with the second button at the second time.
  • the method activates the second button.
  • the method conditionally disables the second button.
  • a method is performed at a multifunction device with a touch screen display.
  • the method includes displaying a soft keyboard having a plurality of buttons including a plurality of unconditionally enabled buttons and one or more conditionally enabled buttons, detecting a first input with a respective unconditionally enabled button at a first time, and responding to detection of the first input at the first time by activating the respective unconditionally enabled button, and disabling at least one of the conditionally enabled buttons for a predefined period of time commencing from the first time.
  • the method further includes detecting a second input with a conditionally enabled button disabled in response to the first input, the second input being detected at a second time that is within the predefined period of time, and responding to detection of the second input at the second time by disregarding the second input. (An illustrative sketch of this time-window behavior appears after this list.)
  • Figure 2 illustrates a portable multifunction device having a touch-sensitive display in accordance with some embodiments.
  • the device supports a variety of applications, such as one or more of the following: a drawing application, a presentation application, a word processing application, a website creation application, a disk authoring application, a spreadsheet application, a gaming application, a telephone application, a video conferencing application, an e-mail application, an instant messaging application, a workout support application, a photo management application, a digital camera application, a digital video camera application, a web browsing application, a digital music player application, and/or a digital video player application.
  • Touch screen 112 and display controller 156 may detect contact and any movement or breaking thereof using any of a plurality of touch sensing technologies now known or later developed, including but not limited to capacitive, resistive, infrared, and surface acoustic wave technologies, as well as other proximity sensor arrays or other elements for determining one or more points of contact with touch screen 112.
  • projected mutual capacitance sensing technology is used, such as that found in the iPhone® and iPod Touch® from Apple Inc. of Cupertino, California.
  • contacts module 137 (sometimes called an address book or contact list);
  • video player module 145 includes executable instructions to display, present or otherwise play back videos (e.g., on touch screen 112 or on an external, connected display via external port 124).
  • calendar module 148 includes executable instructions to create, display, modify, and store calendars and data associated with calendars (e.g., calendar entries, to do lists, etc.) in accordance with user instructions.
  • map module 154 may be used to receive, display, modify, and store maps and data associated with maps (e.g., driving directions; data on stores and other points of interest at or near a particular location; and other location-based data) in accordance with user instructions.
  • Each of the above identified modules and applications corresponds to a set of executable instructions for performing one or more functions described above and the methods described in this application (e.g., the computer-implemented methods and other information processing methods described herein).
  • These modules (i.e., sets of instructions) need not be implemented as separate software programs, procedures or modules, and thus various subsets of these modules may be combined or otherwise rearranged in various embodiments.
  • video player module 145 may be combined with music player module 146 into a single module (e.g., video and music player module 152, Figure IB).
  • memory 102 may store a subset of the modules and data structures identified above.
  • memory 102 may store additional modules and data structures not described above.
  • Event sorter 170 includes event monitor 171 and event dispatcher module 174.
  • application 136-1 includes application internal state 192, which indicates the current application view(s) displayed on touch sensitive display 112 when the application is active or executing.
  • device/global internal state 157 is used by event sorter 170 to determine which application(s) is(are) currently active, and application internal state 192 is used by event sorter 170 to determine application views 191 to which to deliver event information.
  • Event information includes information about a sub-event (e.g., a user touch on touch-sensitive display 112, as part of a multi-touch gesture).
  • Peripherals interface 118 transmits information it receives from I/O subsystem 106 or a sensor, such as proximity sensor 166, accelerometer(s) 168, and/or microphone 113 (through audio circuitry 110).
  • Information that peripherals interface 118 receives from I/O subsystem 106 includes information from touch-sensitive display 112 or a touch-sensitive surface.
  • Active event recognizer determination module 173 determines which view or views within a view hierarchy should receive a particular sequence of sub-events. In some embodiments, active event recognizer determination module 173 determines that only the hit view should receive a particular sequence of sub-events. In other embodiments, active event recognizer determination module 173 determines that all views that include the physical location of a sub-event are actively involved views, and therefore determines that all actively involved views should receive a particular sequence of sub-events. In other embodiments, even if touch sub-events were entirely confined to the area associated with one particular view, views higher in the hierarchy would still remain as actively involved views. (An illustrative sketch of this hit-view selection appears after this list.)
  • operating system 126 includes event sorter 170.
  • a respective event recognizer 180 receives event information (e.g., event data) from event sorter 170.
  • a respective event recognizer 180 includes metadata
  • a respective event recognizer 180 activates event handler 190 associated with an event when one or more particular sub-events of an event are recognized.
  • a respective event recognizer 180 delivers event information associated with the event to event handler 190.
  • Activating an event handler 190 is distinct from sending (and deferred sending) sub-events to a respective hit view.
  • event recognizer 180 throws a flag associated with the recognized event, and event handler 190 associated with the flag catches the flag and performs a predefined process.
  • the contact may include a gesture, such as one or more taps, one or more swipes (from left to right, right to left, upward and/or downward) and/or a rolling of a finger (from right to left, left to right, upward and/or downward) that has made contact with device 100.
  • inadvertent contact with a graphic may not select the graphic. For example, a swipe gesture that sweeps over an application icon may not select the corresponding application when the gesture corresponding to selection is a tap.
  • E-mail client 140, which may include an indicator 410 of the number of unread e-mails;
  • Settings 412, which provides access to settings for device 100 and its various aspects.
  • conditionally enabled button means a button, key or object displayed on a soft keyboard or user interface that can be activated when enabled and that cannot be activated when disabled.
  • conditionally enabled button is logically equivalent to “conditionally disabled button” because whether the button is enabled or disabled is conditional.
  • UI (user interface)
  • a multifunction device with a display and a touch-sensitive surface, such as device 300 or portable multifunction device 100.
  • UI 500B (Figure 5B) illustrates contact 514 on touch screen 112.
  • contact 514 is at a location 514-A on touch screen 112 corresponding to the location of button 510-C, which also corresponds to the activation region of button 510-C.
  • a character “e” 516 is inserted into input text 506, changing text 506-1 to text 506-2, and advancing cursor 508 to position 508-B.
  • the detection of contact 514 at location 514-A corresponding to the location of button 510-C also starts a time window of a predefined amount of time, during which contacts detected at locations on touch screen 112 corresponding to the locations of one or more conditionally enabled buttons (e.g., buttons 512) of keyboard 504 are disregarded; those buttons are disabled during the time window.
  • the predefined duration of the time window is fixed (i.e., invariant), having a duration, for example, between 200 and 500 milliseconds; in another example the predefined duration of the time window is fixed at a value between 150 and 700 milliseconds; in yet another example the predefined duration of the time window is fixed at a value between 300 and 750 milliseconds.
  • UI 500F (Figure 5F) illustrates a contact 518 on touch screen 112 directly following contact 514.
  • contact 518 is at location 518-C on touch screen 112 corresponding to the location of button 512-C, which also corresponds to the activation region of button 512-C.
  • Button 512-C is a conditionally enabled button that is affected by the time window from contact 514. Depending on when contact 518 was made (and thus when contact 518 was detected), either keyboard 504 is hidden or nothing happens in response to detection of contact 518 at location 518-C.
  • the device detects (704) a first input with a respective unconditionally enabled button at a first time. For example, in Figure 5B, contact 514 (the first input) is detected at time 552 at location 514-A, which corresponds to unconditionally enabled button 510-C.
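
The time-window behavior walked through above can be summarized in a short sketch: an input on an unconditionally enabled button (such as the "e" key 510-C) activates that button and starts a window of predefined duration, for example between 200 and 500 milliseconds, during which inputs on conditionally enabled buttons (such as button 512-C) are disregarded. The sketch below only illustrates that logic under stated assumptions; it is not the patented implementation, and the names SoftKeyboard, handle_contact and DISABLE_WINDOW_S are hypothetical.

```python
import time

# Illustrative model of the conditional-disable time window described above.
# Names and structure are assumptions, not the patent's implementation.

DISABLE_WINDOW_S = 0.3  # predefined duration, e.g. between 200 and 500 ms


class SoftKeyboard:
    def __init__(self, unconditional_buttons, conditional_buttons,
                 disable_window_s=DISABLE_WINDOW_S):
        self.unconditional = set(unconditional_buttons)  # e.g. character keys
        self.conditional = set(conditional_buttons)      # e.g. a hide-keyboard key
        self.disable_window_s = disable_window_s
        self._window_started_at = None  # time of the most recent activating input

    def _window_open(self, now):
        return (self._window_started_at is not None
                and now - self._window_started_at < self.disable_window_s)

    def handle_contact(self, button, now=None):
        """Return the action taken for a contact on `button` detected at `now`."""
        now = time.monotonic() if now is None else now
        if button in self.unconditional:
            # First input: activate the button and (re)start the time window
            # during which the conditionally enabled buttons are disabled.
            self._window_started_at = now
            return f"activate {button}"
        if button in self.conditional:
            if self._window_open(now):
                # Second input within the predefined period: disregard it.
                return f"disregard {button} (disabled during time window)"
            return f"activate {button}"
        return "ignore (unknown button)"


# Timeline corresponding to contacts 514 and 518 in the walkthrough above:
kb = SoftKeyboard(unconditional_buttons={"e"},
                  conditional_buttons={"hide keyboard"})
print(kb.handle_contact("e", now=0.00))              # activates "e", starts window
print(kb.handle_contact("hide keyboard", now=0.15))  # within window: disregarded
print(kb.handle_contact("hide keyboard", now=0.60))  # after window: activated
```

On this model, whether the contact at location 518-C hides the keyboard or does nothing depends only on whether it falls inside the window started by contact 514, which matches the two outcomes described above for contact 518.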
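
The event-handling bullets earlier in this list (event sorter 170, hit view determination, actively involved views, event recognizers 180) describe how a touch sub-event is routed to views in a view hierarchy. The sketch below gives one plausible reading of the hit-view and actively-involved-view selection; the names View, hit_view and actively_involved_views are hypothetical, and this is not Apple's implementation.

```python
from dataclasses import dataclass, field
from typing import List, Optional

# Illustrative reading of hit-view determination and "actively involved views".
# Names and structure are assumptions, not the patent's implementation.


@dataclass
class View:
    name: str
    frame: tuple                                  # (x, y, width, height)
    subviews: List["View"] = field(default_factory=list)

    def contains(self, x: float, y: float) -> bool:
        fx, fy, fw, fh = self.frame
        return fx <= x < fx + fw and fy <= y < fy + fh


def hit_view(root: View, x: float, y: float) -> Optional[View]:
    """Lowest view in the hierarchy whose area contains the touch location."""
    if not root.contains(x, y):
        return None
    for sub in root.subviews:
        found = hit_view(sub, x, y)
        if found is not None:
            return found
    return root


def actively_involved_views(root: View, x: float, y: float) -> List[View]:
    """All views whose area contains the touch location (one reading of the
    'actively involved views' option described above)."""
    views = []
    if root.contains(x, y):
        views.append(root)
        for sub in root.subviews:
            views.extend(actively_involved_views(sub, x, y))
    return views


# A toy hierarchy loosely modeled on keyboard 504 and its buttons:
keyboard = View("keyboard 504", (0, 300, 320, 180), subviews=[
    View("button 510-C", (60, 330, 30, 40)),
    View("button 512-C", (280, 440, 40, 40)),
])
print(hit_view(keyboard, 70, 340).name)                       # button 510-C
print([v.name for v in actively_involved_views(keyboard, 70, 340)])
```

Under the "all views that include the physical location" option, views higher in the hierarchy also receive the sub-event sequence, as the second function shows; under the hit-view-only option, only the lowest matching view does.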

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • User Interface Of Digital Computer (AREA)
  • Input From Keyboards Or The Like (AREA)
  • Position Input By Displaying (AREA)
EP10700014A 2010-01-06 2010-01-06 Apparatus and method for conditionally enabling or disabling soft buttons Ceased EP2521965A1 (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/US2010/020263 WO2011084157A1 (en) 2010-01-06 2010-01-06 Apparatus and method for conditionally enabling or disabling soft buttons

Publications (1)

Publication Number Publication Date
EP2521965A1 true EP2521965A1 (en) 2012-11-14

Family

ID=43088179

Family Applications (1)

Application Number Title Priority Date Filing Date
EP10700014A Ceased EP2521965A1 (en) 2010-01-06 2010-01-06 Apparatus and method for conditionally enabling or disabling soft buttons

Country Status (7)

Country Link
EP (1) EP2521965A1 (en)
JP (1) JP5607182B2 (ja)
KR (1) KR101441217B1 (ko)
CN (1) CN102216897B (zh)
AU (1) AU2010340370B2 (en)
TW (1) TWI448956B (zh)
WO (1) WO2011084157A1 (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10510097B2 (en) 2011-10-19 2019-12-17 Firstface Co., Ltd. Activating display and performing additional function in mobile terminal with one-time user input
US11842734B2 (en) 2015-03-08 2023-12-12 Apple Inc. Virtual assistant activation
US11853647B2 (en) 2015-12-23 2023-12-26 Apple Inc. Proactive assistance based on dialog communication between devices

Families Citing this family (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8898585B2 (en) * 2010-02-11 2014-11-25 Htc Corporation Electronic device, input method thereof, and computer-readable medium using the method
CN103874976B (zh) * 2012-02-14 2018-05-18 Panasonic Corporation Electronic device
US9285980B2 (en) 2012-03-19 2016-03-15 Htc Corporation Method, apparatus and computer program product for operating items with multiple fingers
CN102841549B (zh) * 2012-08-23 2014-06-04 Fantem Technologies (Shenzhen) Co., Ltd. Switch identification method, device thereof and Z-Wave control terminal
CN103620530A (zh) * 2012-11-28 2014-03-05 Huawei Device Co., Ltd. Information input method and touch screen terminal
EP2938989A1 (en) * 2012-12-27 2015-11-04 Delphi Technologies, Inc. Algorithm for detecting activation of a push button
KR102073615B1 (ko) 2013-01-22 2020-02-05 LG Electronics Inc. Touch-sensitive display device providing an input interface and control method thereof
US20160011775A1 (en) * 2013-03-07 2016-01-14 Dongguan Yulong Telecommunication Tech Co., Ltd. Terminal and Terminal Operating Method
CN104063068A (zh) * 2014-06-25 2014-09-24 Shenzhen Kaili Technology Co., Ltd. Display method and system based on the Android native input method
CN106155496A (zh) * 2015-04-27 2016-11-23 Alibaba Group Holding Ltd. Information display method and device
NO346144B1 (en) * 2018-09-12 2022-03-21 Elliptic Laboratories As Proximity sensing

Family Cites Families (25)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0659793A (ja) * 1992-08-04 1994-03-04 Oki Electric Ind Co Ltd Keystroke recognition method for a keyboard
JP2959418B2 (ja) * 1994-11-29 1999-10-06 NEC Corporation Touch panel input device
DE69814155T2 (de) * 1997-12-16 2003-10-23 Microsoft Corp System and method for virtual input
EP2256605B1 (en) 1998-01-26 2017-12-06 Apple Inc. Method and apparatus for integrating manual input
GB9910525D0 (en) * 1999-05-07 1999-07-07 Healey Nicholas Erroneous keyboard entry correction method
US7750891B2 (en) * 2003-04-09 2010-07-06 Tegic Communications, Inc. Selective input system based on tracking of motion parameters of an input device
AU5299700A (en) * 1999-05-27 2000-12-18 America Online, Inc. Keyboard system with automatic correction
JP2001006587A (ja) * 1999-06-18 2001-01-12 Hitachi Ltd Charged particle beam irradiation device
US7688306B2 (en) 2000-10-02 2010-03-30 Apple Inc. Methods and apparatuses for operating a portable device based on an accelerometer
US7218226B2 (en) 2004-03-01 2007-05-15 Apple Inc. Acceleration-based theft detection system for portable electronic devices
US6677932B1 (en) 2001-01-28 2004-01-13 Finger Works, Inc. System and method for recognizing touch typing under limited tactile feedback conditions
US6570557B1 (en) 2001-02-10 2003-05-27 Finger Works, Inc. Multi-touch system and method for emulating modifier keys via fingertip chords
US6657560B1 (en) * 2001-10-19 2003-12-02 Richard Jung Rounded keypad
JP2003288155A (ja) * 2002-03-27 2003-10-10 Nippon Create Kk Character string input program
US7136047B2 (en) * 2003-04-09 2006-11-14 Microsoft Corporation Software multi-tap input system and method
JP3811693B2 (ja) * 2003-10-02 2006-08-23 Kyocera Mita Corporation Display device, image forming apparatus equipped with the same, and input receiving device
JP2006185064A (ja) * 2004-12-27 2006-07-13 Casio Comput Co Ltd Data processing device and program
US7659887B2 (en) * 2005-10-20 2010-02-09 Microsoft Corp. Keyboard with a touchpad layer on keys
EP1791051A1 (en) * 2005-11-23 2007-05-30 Research In Motion Limited System and method for recognizing a keystroke in an electronic device
US8564544B2 (en) * 2006-09-06 2013-10-22 Apple Inc. Touch screen device, method, and graphical user interface for customizing display of content category icons
KR20080073872A (ko) * 2007-02-07 2008-08-12 LG Electronics Inc. Mobile communication terminal having a touch screen and information input method using the same
JP5383053B2 (ja) * 2008-01-29 2014-01-08 Kyocera Corporation Terminal device with display function
US7453441B1 (en) * 2008-03-31 2008-11-18 International Business Machines Corporation Method and system for intelligent keyboard illumination
TW200951783A (en) * 2008-06-06 2009-12-16 Acer Inc Electronic device and controlling method thereof
US20090309768A1 (en) * 2008-06-12 2009-12-17 Nokia Corporation Module, user interface, device and method for handling accidental key presses

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
None *
See also references of WO2011084157A1 *

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10510097B2 (en) 2011-10-19 2019-12-17 Firstface Co., Ltd. Activating display and performing additional function in mobile terminal with one-time user input
US10896442B2 (en) 2011-10-19 2021-01-19 Firstface Co., Ltd. Activating display and performing additional function in mobile terminal with one-time user input
US11551263B2 (en) 2011-10-19 2023-01-10 Firstface Co., Ltd. Activating display and performing additional function in mobile terminal with one-time user input
US11842734B2 (en) 2015-03-08 2023-12-12 Apple Inc. Virtual assistant activation
US11853647B2 (en) 2015-12-23 2023-12-26 Apple Inc. Proactive assistance based on dialog communication between devices

Also Published As

Publication number Publication date
TW201145147A (en) 2011-12-16
CN102216897A (zh) 2011-10-12
JP5607182B2 (ja) 2014-10-15
JP2013516689A (ja) 2013-05-13
AU2010340370B2 (en) 2014-06-26
TWI448956B (zh) 2014-08-11
KR20120113770A (ko) 2012-10-15
WO2011084157A1 (en) 2011-07-14
KR101441217B1 (ko) 2014-09-17
AU2010340370A1 (en) 2012-08-09
CN102216897B (zh) 2014-07-02

Similar Documents

Publication Publication Date Title
US10891023B2 (en) Device, method and graphical user interface for shifting a user interface between positions on a touch-sensitive display in response to detected inputs
US9442654B2 (en) Apparatus and method for conditionally enabling or disabling soft buttons
AU2010340370B2 (en) Apparatus and method for conditionally enabling or disabling soft buttons
US8806362B2 (en) Device, method, and graphical user interface for accessing alternate keys
AU2010339633B2 (en) Apparatus and method having multiple application display modes including mode with display resolution of another apparatus
US9052894B2 (en) API to replace a keyboard with custom controls
EP2357556A1 (en) Automatically displaying and hiding an on-screen keyboard
US20110167339A1 (en) Device, Method, and Graphical User Interface for Attachment Viewing and Editing
AU2019204750B2 (en) Gesture based graphical user interface for managing concurrently open software applications
AU2015202565B2 (en) Gesture based graphical user interface for managing concurrently open software applications

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20120710

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO SE SI SK SM TR

DAX Request for extension of the european patent (deleted)
REG Reference to a national code

Ref country code: HK

Ref legal event code: DE

Ref document number: 1177282

Country of ref document: HK

17Q First examination report despatched

Effective date: 20130829

RAP1 Party data changed (applicant data changed or rights of an application transferred)

Owner name: APPLE INC.

REG Reference to a national code

Ref country code: DE

Ref legal event code: R003

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION HAS BEEN REFUSED

18R Application refused

Effective date: 20181221

REG Reference to a national code

Ref country code: HK

Ref legal event code: WD

Ref document number: 1177282

Country of ref document: HK