US20070216643A1 - Multipurpose Navigation Keys For An Electronic Device - Google Patents

Info

Publication number
US20070216643A1
US20070216643A1
Authority
US
Grant status
Application
Patent type
Prior art keywords
navigation
key
device
press
controller
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US11750611
Inventor
Robert Morris
Stephen Sullivan
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Scenera Technologies LLC
Original Assignee
Scenera Technologies LLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING; COUNTING
    • G06F — ELECTRICAL DIGITAL DATA PROCESSING
    • G06F 3/00 — Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 — Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 — Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484 — Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object or an image, setting a parameter value or selecting a range
    • G06F 3/04842 — Selection of a displayed object
    • G06F 3/0485 — Scrolling or panning
    • G06F 3/0487 — Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0489 — Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, using dedicated keyboard keys or combinations thereof
    • G06F 3/04892 — Arrangements for controlling cursor position based on codes indicative of cursor displacements from one discrete location to another, e.g. using cursor control keys associated to different directions or using the tab key

Abstract

A method and apparatus for multipurpose navigation in a portable electronic device are described. According to an exemplary embodiment, a portable electronic imaging device is described including a display screen for displaying objects including at least one of digital still images, video clips, menu items, and icons. The device also includes a navigation controller comprising navigation keys for allowing a user to navigate between the displayed objects by pressing a portion of the navigation controller corresponding to a navigation key last pressed to navigate between the displayed objects, wherein the device is configured to allow the user to select a currently displayed object without moving a finger from the portion of the navigation controller corresponding to the navigation key last pressed to navigate between the displayed objects, thereby implementing navigation and select functions on a single portion of the controller.

Description

    CROSS-REFERENCE TO RELATED APPLICATIONS
  • [0001]
    This application is a Continuation application of co-pending U.S. patent application Ser. No. 10/869,733, filed Jun. 16, 2004, titled “Multipurpose Navigation Keys For An Electronic Imaging Device,” (now U.S. Pat. No. 7,222,307, issued May 22, 2007), which is commonly owned with this application and is herein incorporated by reference.
  • FIELD OF THE INVENTION
  • [0002]
    The present invention relates generally to portable electronic imaging devices, including digital cameras and cell phones, and more particularly to a method and apparatus for implementing navigation and select functions using a multipurpose navigation key.
  • BACKGROUND OF THE INVENTION
  • [0003]
    Portable electronic imaging devices capable of displaying digital images and video are commonplace today. Examples of such devices include digital cameras, camera-enabled cell phones, MP3 players, and personal digital assistants (PDAs). FIGS. 1A and 1B are diagrams illustrating example portions of the hardware interface included on conventional imaging devices.
  • [0004]
    Referring to FIG. 1A, a conventional imaging device 10 is equipped with a liquid-crystal display (LCD) or other type of display screen 12 for displaying objects 14. Objects that may be displayed on the display screen include digital still images, video clips, menu items, and icons. In play mode, the display screen 12 is used as a playback screen that allows the user to view objects individually or several at a time. Besides the display screen 12, the hardware user interface also includes a number of keys, buttons, or switches for operating the device 10 and for navigating between displayed objects 14. Example keys include zoom keys (not shown) for zooming a displayed image, a navigation controller 18, and a select key 20. A four-way navigation controller 18 is shown in FIG. 1A, which includes four keys: left/right keys 18a and 18b, having a horizontal orientation, and up/down keys 18c and 18d, having a vertical orientation. FIG. 1B is a diagram similar to FIG. 1A, where like components have like reference numerals, but shows the conventional imaging device 10 with a two-way navigation controller that includes only two keys, 18a and 18b, rather than four.
  • [0005]
    In both embodiments shown in FIGS. 1A and 1B, a user navigates to a desired object 14 by pressing the navigation controller 18. In the case where a single object 14 is displayed on the screen 12, the displayed object 14 is considered the current selection. In the case where multiple objects 14 are displayed, a highlight or other indication is moved from object 14 to object 14 as the user navigates to indicate the currently selected object 14. Once the user navigates to a desired object 14, the user may initiate the default action associated with the current selection by pressing the select key 20. Examples of actions that can be performed by pressing the select key 20 include edit, open/execute, and delete. The select key 20 is shown in the center of the navigation controller 18 in FIG. 1A, but the select key 20 may also be located outside of the navigation controller, as shown in FIG. 1B. In yet other embodiments, the 2-way/4-way navigation controller 18 may be implemented as an integrated 2-way/4-way key.
  • [0006]
    Although the current solution for allowing a user to navigate among objects and to initiate an action associated with the object 14 using a combination of the navigation controller 18 and the select key 20 works for its intended purposes, this implementation has several disadvantages. First, space for keys is limited on portable imaging devices, and having separate navigation and selection keys 18 and 20 occupies valuable space on the device 10. The user must also find and press the right key in the correct sequence, which is not always easy given the small keys that miniaturization imposes on many portable devices.
  • [0007]
    In addition, the user must find the right portion of the navigation controller 18 for the direction of navigation desired. Users of devices with navigation controller keys often get unexpected results from pressing an undesired portion of the navigation controller key 18. The most typical error is when the user presses a navigation key when intending to press the selection key 20 to initiate the selection function.
  • [0008]
    Accordingly, what is needed is an improved method and apparatus for implementing the navigation and select functions on a portable electronic imaging device. The present invention addresses such a need.
  • BRIEF SUMMARY OF THE INVENTION
  • [0009]
    The present invention provides a portable electronic imaging device that includes a display screen for displaying objects including any combination of digital still images, video clips, menu items, and icons; and a navigation controller comprising navigation keys for allowing a user to navigate between the displayed objects, wherein the user may select a currently displayed object without moving a finger from a navigation key last pressed, thereby implementing navigation and select functions on a single controller. In the preferred embodiment, the portable imaging device is configured to detect double-presses and press-and-holds on any navigation key, and either or both of these events may be interpreted as a user selection event that invokes the default operation on the currently selected object(s).
  • [0010]
    According to the method and apparatus disclosed herein, the present invention eliminates the need for a user to use a select key, thus reducing user error. In addition, the select key may be eliminated from the device altogether, thereby saving space on navigation controller-equipped devices.
  • BRIEF DESCRIPTION OF SEVERAL VIEWS OF THE DRAWINGS
  • [0011]
    FIGS. 1A and 1B are diagrams illustrating example portions of the hardware interface included on conventional imaging devices.
  • [0012]
    FIGS. 2A and 2B are diagrams illustrating hardware user interface embodiments for a portable electronic imaging device having a multipurpose navigation controller in accordance with the present invention.
  • [0013]
    FIG. 3 is a flow diagram illustrating a method for implementing navigation and select functions on a portable electronic imaging device by providing a multipurpose navigation controller in accordance with a preferred embodiment of the present invention.
  • DETAILED DESCRIPTION OF THE INVENTION
  • [0014]
    The present invention relates to the implementation of navigation and select functions on a portable electronic device. The following description is presented to enable one of ordinary skill in the art to make and use the invention and is provided in the context of a patent application and its requirements. Various modifications to the preferred embodiments and the generic principles and features described herein will be readily apparent to those skilled in the art. Thus, the present invention is not intended to be limited to the embodiments shown, but is to be accorded the widest scope consistent with the principles and features described herein.
  • [0015]
    The present invention provides an improved method and apparatus for implementing navigation and select functions on a portable electronic imaging device by providing a multipurpose navigation controller that performs both navigation and select functions.
  • [0016]
    FIGS. 2A and 2B are diagrams illustrating hardware user interface embodiments for a portable electronic imaging device having a multipurpose navigation controller in accordance with the present invention, where like components have like reference numerals. The imaging device 50 equipped with the multipurpose navigation controller 56 of the present invention allows a user to select an object 54 displayed on screen 52 without moving his/her finger from the last navigation key 56 pressed. In the preferred embodiment, the portable imaging device 50 is configured to detect double-presses and press-and-holds on any navigation key 56. Either or both of these events may be interpreted as a user selection event, which when detected invokes the default operation on the currently selected object(s). Thus, when a user navigates to a displayed object 54, he/she can simply double-click the last navigation key 56 pressed (or any navigation key) or press-and-hold the last navigation key 56 pressed to select the current object(s) 54. In a further embodiment, the device 50 may be configured to detect double-presses and press-and-holds on any navigation key 56, such that a detected double-press indicates a user selection, while a detected press-and-hold invokes an action on the currently selected object, and vice versa. With the multipurpose navigation controller 56 of the present invention, no separate selection key is required to indicate a selection event, thus eliminating the need for a separate select key, which potentially saves space on the device and reduces user error.
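The configurable assignment of the two gestures described above can be sketched as a small mapping from gesture to handler. This is an illustrative sketch only: the gesture names, the handler names, and the `swap` flag are assumptions introduced here, not anything the patent specifies.

```python
def make_gesture_map(swap=False):
    """Map the two multipurpose-key gestures to handler names.

    By default a double-press indicates a user selection and a
    press-and-hold invokes the default action on the selected object;
    swap=True reverses the two roles ("and vice versa" in the text).
    All names are hypothetical.
    """
    mapping = {
        "double_press": "select_object",
        "press_and_hold": "invoke_default_action",
    }
    if swap:
        mapping = {
            "double_press": mapping["press_and_hold"],
            "press_and_hold": mapping["double_press"],
        }
    return mapping
```

Either configuration preserves the key property claimed: both gestures land on the navigation key itself, so no separate select key is needed.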
  • [0017]
    In a preferred embodiment, the multipurpose navigation controller 56 may be implemented as either a 4-way or a 2-way navigation controller, as shown in FIGS. 2A and 2B, respectively, and the navigation controller 56 may be implemented with separate navigation keys or as an integrated 4-way/2-way key. Also, in the preferred embodiment, a separate select key is eliminated from the device 50 in order to save space. However, in an alternative embodiment, the device 50 may include a separate select key (not shown) for user convenience, whether located in the center of the navigation controller or apart from it.
  • [0018]
    FIG. 3 is a flow diagram illustrating a method for implementing navigation and select functions on a portable electronic imaging device by providing a multipurpose navigation controller 56 in accordance with a preferred embodiment of the present invention. The process begins when the device 50 detects, in step 100, that one of the navigation keys 56 has been pressed and released. The device 50 then determines, in step 102, whether the time between the previous press of the same key and the current press is less than a stored double-press time.
  • [0019]
    Referring again to FIGS. 2A and 2B, the double-press time 58 is preferably stored in a non-volatile memory 60 in the device 50, along with a release time 62. In a preferred embodiment, both are configurable. Referring to FIGS. 2A, 2B, and 3, if the time between presses is greater than the double-press time 58 in step 102, then the device 50 interprets the key press as a navigation event and displays the next object in step 104 (or moves a highlight to the next object, depending on the current operating mode).
  • [0020]
    According to one aspect of the present invention, the device 50 is further configured to distinguish between fast scrolling during navigation and a double-press, as follows. If the time between the previous press of the same key and the current press is less than the stored double-press time 58 in step 102, then the device 50 examines, in step 106, whether the last few presses (e.g., the last three) were performed on the same navigation key 56. If they were, then the device 50 determines, in step 108, that the user is fast-scrolling through displayed objects during navigation. Accordingly, the current key press is interpreted as a navigation event and the next object is displayed, as described in step 104.
  • [0021]
    If the time between the previous press of the same key and the current press is less than the stored double-press time 58 in step 102, but the last few presses were not performed on the same navigation key in step 106, then the current key press is interpreted as a selection event in step 110. In step 112, the device 50 executes the action associated with the currently selected object.
  • [0022]
    Also, according to the present invention, if the device 50 detects that one of the navigation keys is pressed, but not released, for a time greater than the release time 62 in step 114, then this “press-and-hold” is interpreted as a selection event in step 110, and the device 50 executes the action as described in step 112.
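The decision flow of FIG. 3 can be sketched as a small timing-based classifier for each press of a navigation key. This is a minimal sketch under stated assumptions: the threshold values stand in for the configurable double-press time 58 and release time 62, the three-press fast-scrolling window follows the "last couple of presses (e.g., three)" example in the text, and all names are hypothetical.

```python
DOUBLE_PRESS_TIME = 0.4   # stand-in for the stored double-press time 58 (seconds)
RELEASE_TIME = 0.8        # stand-in for the stored release time 62 (seconds)
FAST_SCROLL_WINDOW = 3    # presses examined for the fast-scrolling check (step 106)

class NavigationKeyClassifier:
    """Classify each press of a multipurpose navigation key (FIG. 3 sketch)."""

    def __init__(self):
        self.history = []  # (key, press_timestamp) of the most recent presses

    def classify(self, key, pressed_at, released_at):
        """Return 'select' or 'navigate' for one press/release of `key`."""
        # Press-and-hold: held past the release time is a selection event.
        if released_at - pressed_at > RELEASE_TIME:
            self.history.clear()
            return "select"
        prev_same = [t for k, t in self.history if k == key]
        self.history.append((key, pressed_at))
        self.history = self.history[-FAST_SCROLL_WINDOW:]
        # Step 102: a slow press (or the first press) is ordinary navigation.
        if not prev_same or pressed_at - prev_same[-1] > DOUBLE_PRESS_TIME:
            return "navigate"
        # Steps 106/108: a quick run on one key is fast scrolling (navigation).
        if (len(self.history) == FAST_SCROLL_WINDOW
                and all(k == key for k, _ in self.history)):
            return "navigate"
        # Step 110: otherwise a quick repeat is a selection event.
        return "select"
```

Under this reading, two quick presses of the same key act as a selecting double-press, while a longer rapid run on one key is treated as fast scrolling; the patent does not spell out how a device disambiguates the second press of what turns out to be a scroll run, and the sketch does not attempt to.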
  • [0023]
    A method and apparatus for implementing the navigation and select functions on a portable electronic imaging device using a multipurpose navigation key have been disclosed. The present invention has been described in accordance with the embodiments shown, and one of ordinary skill in the art will readily recognize that there could be variations to the embodiments, and any variations would be within the spirit and scope of the present invention. Accordingly, many modifications may be made by one of ordinary skill in the art without departing from the spirit and scope of the appended claims.

Claims (22)

  1. A portable electronic imaging device, comprising:
    a display screen for displaying objects including at least one of digital still images, video clips, menu items, and icons; and
    a navigation controller comprising navigation keys for allowing a user to navigate between the displayed objects by pressing a portion of the navigation controller corresponding to a navigation key last pressed to navigate between the displayed objects, wherein the device is configured to allow the user to select a currently displayed object without moving a finger from the portion of the navigation controller corresponding to the navigation key last pressed to navigate between the displayed objects, thereby implementing navigation and select functions on a single portion of the controller.
  2. The device of claim 1 wherein the device is configured to detect a double-press on any portion of the navigation controller corresponding to a navigation key and is configured to interpret the double-press as a user selection event that invokes an action on the currently selected object.
  3. The device of claim 1 wherein the device is configured to detect that a portion of the navigation controller corresponding to a current key has been pressed, and is configured to interpret the current key press as a navigation event if the time between previous presses of the same portion of the navigation controller corresponding to the current key and the current key press is greater than a predetermined double-press time.
  4. The device of claim 3 wherein the device is configured to interpret the current key press as a selection event if the time between previous presses of the same portion of the navigation controller corresponding to the current key and the current key press is less than the predetermined double-press time.
  5. The device of claim 1 wherein the device is configured to detect that a portion of the navigation controller corresponding to a current key has been pressed, and is configured to interpret the current key press as a fast-scrolling event if the time between previous presses of the same portion of the navigation controller corresponding to the current key and the current key press is less than a predetermined double-press time and a last predetermined plurality of presses of the navigation controller correspond to a same portion of the navigation controller corresponding to the current key.
  6. The device of claim 5 wherein the device is configured to interpret the current key press as a selection event if the time between previous presses of the same portion of the navigation controller corresponding to the current key and the current key press is less than a predetermined double-press time and the last predetermined plurality of presses of the navigation controller does not correspond to the same portion of the navigation controller corresponding to the current key.
  7. The device of claim 1 wherein the device is configured to detect a press-and-hold on any portion of the navigation controller corresponding to a navigation key, and is configured to interpret the press-and-hold as a user selection event that invokes an action on the currently selected object.
  8. The device of claim 7, wherein the device is configured to detect the press-and-hold when the current key is pressed, but not released, for a time greater than a release time.
  9. The device of claim 1 wherein the device is configured to detect both double-presses and press-and-holds on any portion of the navigation controller corresponding to a navigation key, and a detected double-press indicates a user selection of a current object, while a detected press-and-hold invokes an action on the currently selected object.
  10. The device of claim 1 wherein the device is configured to detect both double-presses and press-and-holds on any portion of the navigation controller corresponding to a navigation key, and a detected press-and-hold indicates a user selection of a current object, while a detected double-press invokes an action on the currently selected object.
  11. The device of claim 1 wherein the device comprises at least one of a digital camera, camera-enabled cell phone, MP3 player, and a personal digital assistant.
  12. A portable electronic imaging device, comprising:
    a display screen for displaying objects including at least one of digital still images, video clips, menu items, and icons; and
    a navigation controller comprising navigation keys for allowing a user to navigate between the displayed objects by pressing a portion of the navigation controller corresponding to a navigation key last pressed to navigate between the displayed objects, wherein the device is configured to detect at least one of a press-and-hold and a double-press of the portion of the navigation controller corresponding to the navigation key, thereby allowing the user to select a currently displayed object without moving a finger from the portion of the navigation controller corresponding to the navigation key last pressed to navigate between the displayed objects.
  13. A method for providing a portable electronic imaging device with a multipurpose navigation controller, comprising:
    displaying objects on a display screen, the objects including at least one of digital still images, video clips, menu items, and icons; and
    providing the device with a navigation controller comprising navigation keys for allowing a user to navigate between the displayed objects by pressing a portion of the navigation controller corresponding to a navigation key last pressed to navigate between the displayed objects, wherein the device is configured to allow the user to select a currently displayed object without moving a finger from the portion of the navigation controller corresponding to the navigation key last pressed to navigate between the displayed objects, thereby implementing navigation and select functions on a single portion of the controller.
  14. The method of claim 13 comprising configuring the device to detect a double-press on any portion of the navigation controller corresponding to a navigation key and to interpret the double-press as a user selection event that invokes an action on the currently selected object.
  15. The method of claim 13 comprising configuring the device to detect that a portion of the navigation controller corresponding to a current key has been pressed, and to interpret the current key press as a navigation event if the time between previous presses of the same portion of the navigation controller corresponding to the current key and the current key press is greater than a predetermined double-press time.
  16. The method of claim 15 comprising configuring the device to interpret the current key press as a selection event if the time between previous presses of the same portion of the navigation controller corresponding to the current key and the current key press is less than the predetermined double-press time.
  17. The method of claim 13 comprising configuring the device to detect that a portion of the navigation controller corresponding to a current key has been pressed, and to interpret the current key press as a fast-scrolling event if the time between previous presses of the same portion of the navigation controller corresponding to the current key and the current key press is less than a predetermined double-press time and a last predetermined plurality of presses of the navigation controller correspond to a same portion of the navigation controller corresponding to the current key.
  18. The method of claim 17 comprising configuring the device to interpret the current key press as a selection event if the time between previous presses of the same portion of the navigation controller corresponding to the current key and the current key press is less than a predetermined double-press time and the last predetermined plurality of presses of the navigation controller does not correspond to the same portion of the navigation controller corresponding to the current key.
  19. The method of claim 13 comprising configuring the device to detect a press-and-hold on any portion of the navigation controller corresponding to a navigation key, and to interpret the press-and-hold as a user selection event that invokes an action on the currently selected object.
  20. The method of claim 19, comprising configuring the device to detect the press-and-hold when the current key is pressed, but not released, for a time greater than a release time.
  21. The method of claim 13 comprising configuring the device to detect both double-presses and press-and-holds on any portion of the navigation controller corresponding to a navigation key, wherein a detected double-press indicates a user selection, while a detected press-and-hold invokes an action on the currently selected object.
  22. The method of claim 13 comprising configuring the device to detect both double-presses and press-and-holds on any portion of the navigation controller corresponding to a navigation key, wherein a detected press-and-hold indicates a user selection, while a detected double-press invokes an action on the currently selected object.
US11750611 2004-06-16 2007-05-18 Multipurpose Navigation Keys For An Electronic Device Abandoned US20070216643A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US10869733 US7222307B2 (en) 2004-06-16 2004-06-16 Multipurpose navigation keys for an electronic imaging device
US11750611 US20070216643A1 (en) 2004-06-16 2007-05-18 Multipurpose Navigation Keys For An Electronic Device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US11750611 US20070216643A1 (en) 2004-06-16 2007-05-18 Multipurpose Navigation Keys For An Electronic Device

Publications (1)

Publication Number Publication Date
US20070216643A1 (en) 2007-09-20

Family

ID=35482006

Family Applications (2)

Application Number Title Priority Date Filing Date
US10869733 Active 2025-04-21 US7222307B2 (en) 2004-06-16 2004-06-16 Multipurpose navigation keys for an electronic imaging device
US11750611 Abandoned US20070216643A1 (en) 2004-06-16 2007-05-18 Multipurpose Navigation Keys For An Electronic Device

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US10869733 Active 2025-04-21 US7222307B2 (en) 2004-06-16 2004-06-16 Multipurpose navigation keys for an electronic imaging device

Country Status (5)

Country Link
US (2) US7222307B2 (en)
EP (1) EP1766626A2 (en)
JP (1) JP4403260B2 (en)
CN (1) CN101189567A (en)
WO (1) WO2006009692A3 (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090135142A1 (en) * 2007-11-27 2009-05-28 Motorola, Inc. Data entry device and method
US8542133B2 (en) 2007-07-06 2013-09-24 Synaptics Incorporated Backlit haptic key
US8599047B2 (en) * 2007-07-06 2013-12-03 Synaptics Incorporated Haptic keyboard assemblies and methods

Families Citing this family (53)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8645137B2 (en) 2000-03-16 2014-02-04 Apple Inc. Fast, language-independent method for user authentication by voice
US7073130B2 (en) * 2001-01-31 2006-07-04 Microsoft Corporation Methods and systems for creating skins
US6791581B2 (en) * 2001-01-31 2004-09-14 Microsoft Corporation Methods and systems for synchronizing skin properties
US7417625B2 (en) * 2004-04-29 2008-08-26 Scenera Technologies, Llc Method and system for providing input mechanisms on a handheld electronic device
US7222307B2 (en) * 2004-06-16 2007-05-22 Scenera Technologies, Llc Multipurpose navigation keys for an electronic imaging device
US20060015826A1 (en) * 2004-07-13 2006-01-19 Sony Corporation Hard disk multimedia player and method
US20060199616A1 (en) * 2005-03-03 2006-09-07 Agere Systems Inc. Mobile communication device having automatic scrolling capability and method of operation thereof
JP4515409B2 (en) * 2005-05-20 2010-07-28 エルジー エレクトロニクス インコーポレイティド Continuous click device and execution method of a mobile terminal
US20070040808A1 (en) * 2005-08-22 2007-02-22 Creative Technology Ltd. User configurable button
KR100738901B1 (en) * 2006-03-16 2007-07-06 삼성전자주식회사 Apparatus and method for inputting characters in portable terminal
KR100738902B1 (en) * 2006-03-16 2007-07-06 삼성전자주식회사 Apparatus and method for inputting characters in portable terminal
US20070290992A1 (en) * 2006-06-16 2007-12-20 Creative Technology Ltd Control interface for media player
US9378343B1 (en) 2006-06-16 2016-06-28 Nokia Corporation Automatic detection of required network key type
JP2008040019A (en) * 2006-08-03 2008-02-21 Toshiba Corp Mobile terminal
US8736557B2 (en) 2006-09-11 2014-05-27 Apple Inc. Electronic device with image based browsers
US8564543B2 (en) * 2006-09-11 2013-10-22 Apple Inc. Media player with imaged based browsing
US8281041B2 (en) * 2006-11-22 2012-10-02 Carefusion 303, Inc. System and method for preventing keypad entry errors
US9001047B2 (en) * 2007-01-07 2015-04-07 Apple Inc. Modal change based on orientation of a portable multifunction device
US7979805B2 (en) * 2007-05-21 2011-07-12 Microsoft Corporation Button discoverability
US9330720B2 (en) 2008-01-03 2016-05-03 Apple Inc. Methods and apparatus for altering audio output signals
US8327272B2 (en) 2008-01-06 2012-12-04 Apple Inc. Portable multifunction device, method, and graphical user interface for viewing and managing electronic calendars
US8996376B2 (en) 2008-04-05 2015-03-31 Apple Inc. Intelligent text-to-speech conversion
US20100030549A1 (en) 2008-07-31 2010-02-04 Lee Michael M Mobile device having human language translation capability with positional feedback
US8862252B2 (en) * 2009-01-30 2014-10-14 Apple Inc. Audio user interface for displayless electronic device
US8913771B2 (en) * 2009-03-04 2014-12-16 Apple Inc. Portable electronic device having a water exposure indicator label
US9858925B2 (en) 2009-06-05 2018-01-02 Apple Inc. Using context information to facilitate processing of commands in a virtual assistant
US8438504B2 (en) 2010-01-06 2013-05-07 Apple Inc. Device, method, and graphical user interface for navigating through multiple viewing areas
US8736561B2 (en) 2010-01-06 2014-05-27 Apple Inc. Device, method, and graphical user interface with content display modes and display rotation heuristics
US9318108B2 (en) 2010-01-18 2016-04-19 Apple Inc. Intelligent automated assistant
US8682667B2 (en) 2010-02-25 2014-03-25 Apple Inc. User profiling for selecting user specific voice input processing information
US8572489B2 (en) * 2010-12-16 2013-10-29 Harman International Industries, Incorporated Handlebar audio controls
US9262612B2 (en) 2011-03-21 2016-02-16 Apple Inc. Device access using voice authentication
US8994660B2 (en) 2011-08-29 2015-03-31 Apple Inc. Text correction processing
US9483461B2 (en) 2012-03-06 2016-11-01 Apple Inc. Handling speech synthesis of content for multiple languages
US9495129B2 (en) 2012-06-29 2016-11-15 Apple Inc. Device, method, and user interface for voice-activated navigation and browsing of a document
US9582608B2 (en) 2013-06-07 2017-02-28 Apple Inc. Unified ranking with entropy-weighted information for phrase-based semantic auto-completion
WO2014197334A3 (en) 2013-06-07 2015-01-29 Apple Inc. System and method for user-specified pronunciation of words for speech synthesis and recognition
WO2014197336A1 (en) 2013-06-07 2014-12-11 Apple Inc. System and method for detecting errors in interactions with a voice-based digital assistant
US9842101B2 (en) 2014-05-30 2017-12-12 Apple Inc. Predictive conversion of language input
US9785630B2 (en) 2014-05-30 2017-10-10 Apple Inc. Text prediction using combined word N-gram and unigram language models
US9715875B2 (en) 2014-05-30 2017-07-25 Apple Inc. Reducing the need for manual start/end-pointing and trigger phrases
US9760559B2 (en) 2014-05-30 2017-09-12 Apple Inc. Predictive text input
US9338493B2 (en) 2014-06-30 2016-05-10 Apple Inc. Intelligent automated assistant for TV user interactions
US9818400B2 (en) 2014-09-11 2017-11-14 Apple Inc. Method and apparatus for discovering trending terms in speech requests
US9646609B2 (en) 2014-09-30 2017-05-09 Apple Inc. Caching apparatus for serving phonetic pronunciations
US9886432B2 (en) 2014-09-30 2018-02-06 Apple Inc. Parsimonious handling of word inflection via categorical stem + suffix N-gram language models
US9668121B2 (en) 2014-09-30 2017-05-30 Apple Inc. Social reminders
US9865280B2 (en) 2015-03-06 2018-01-09 Apple Inc. Structured dictation using intelligent automated assistants
US9886953B2 (en) 2015-03-08 2018-02-06 Apple Inc. Virtual assistant activation
US9721566B2 (en) 2015-03-08 2017-08-01 Apple Inc. Competing devices responding to voice triggers
US9899019B2 (en) 2015-03-18 2018-02-20 Apple Inc. Systems and methods for structured stem and suffix language models
US9842105B2 (en) 2015-04-16 2017-12-12 Apple Inc. Parsimonious continuous-space phrase representations for natural language processing
US9697820B2 (en) 2015-09-24 2017-07-04 Apple Inc. Unit-selection text-to-speech synthesis using concatenation-sensitive neural networks

Citations (32)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4456931A (en) * 1980-10-31 1984-06-26 Nippon Kogaku K.K. Electronic camera
US4937676A (en) * 1989-02-10 1990-06-26 Polaroid Corporation Electronic camera system with detachable printer
US4982291A (en) * 1987-08-27 1991-01-01 Casio Computer Co., Ltd. Electronic still video camera capable of searching desired picture in simple and quick manner
US5021989A (en) * 1986-04-28 1991-06-04 Hitachi, Ltd. Document browsing apparatus with concurrent processing and retrieval
US5138460A (en) * 1987-08-20 1992-08-11 Canon Kabushiki Kaisha Apparatus for forming composite images
US5237648A (en) * 1990-06-08 1993-08-17 Apple Computer, Inc. Apparatus and method for editing a video recording by selecting and displaying video clips
US5274458A (en) * 1991-01-25 1993-12-28 Sony Corporation Video camera
US5465133A (en) * 1988-10-04 1995-11-07 Asahi Kogaku Kogyo Kabushiki Kaisha Still video camera
US5497193A (en) * 1992-10-29 1996-03-05 Sony Corporation Electronic still camera with dual contact shutter switch for picture review
US5513306A (en) * 1990-08-09 1996-04-30 Apple Computer, Inc. Temporal event viewing and editing system
US5559943A (en) * 1994-06-27 1996-09-24 Microsoft Corporation Method and apparatus customizing a dual actuation setting of a computer input device switch
US5608491A (en) * 1994-02-04 1997-03-04 Nikon Corporation Camera with simplified parameter selection and dual mode operation and method of operation
US5635984A (en) * 1991-12-11 1997-06-03 Samsung Electronics Co., Ltd. Multi-picture control circuit and method for electronic still camera
US5682207A (en) * 1993-02-26 1997-10-28 Sony Corporation Image display apparatus for simultaneous display of a plurality of images
US5742339A (en) * 1994-12-27 1998-04-21 Asahi Kogaku Kogyo Kabushiki Kaisha Electronic still video camera
US5781175A (en) * 1986-04-21 1998-07-14 Canon Kabushiki Kaisha Image search apparatus
US5796428A (en) * 1993-10-21 1998-08-18 Hitachi, Ltd. Electronic photography system
US5845166A (en) * 1997-02-20 1998-12-01 Eastman Kodak Company Hybrid camera with identification matching of film and electronic images
US5861918A (en) * 1997-01-08 1999-01-19 Flashpoint Technology, Inc. Method and system for managing a removable memory in a digital camera
US5943043A (en) * 1995-11-09 1999-08-24 International Business Machines Corporation Touch panel "double-touch" input method and detection apparatus
US5969708A (en) * 1996-10-15 1999-10-19 Trimble Navigation Limited Time dependent cursor tool
US5977976A (en) * 1995-04-19 1999-11-02 Canon Kabushiki Kaisha Function setting apparatus
US6097431A (en) * 1996-09-04 2000-08-01 Flashpoint Technology, Inc. Method and system for reviewing and navigating among images on an image capture unit
US6122003A (en) * 1997-08-22 2000-09-19 Flashpoint Technology, Inc. Method and apparatus for changing operating modes of an image capture device
US6160926A (en) * 1998-08-07 2000-12-12 Hewlett-Packard Company Appliance and method for menu navigation
US20040036680A1 (en) * 2002-08-26 2004-02-26 Mark Davis User-interface features for computers with contact-sensitive displays
US6741232B1 (en) * 2002-01-23 2004-05-25 Good Technology, Inc. User interface for a data processing apparatus
US20050009571A1 (en) * 2003-02-06 2005-01-13 Chiam Thor Itt Main menu navigation principle for mobile phone user
US20050283729A1 (en) * 2004-06-16 2005-12-22 Morris Robert P Multipurpose navigation keys for an electronic imaging device
US6995875B2 (en) * 2000-06-07 2006-02-07 Hewlett-Packard Development Company, L.P. Appliance and method for navigating among multiple captured images and functional menus
US7058432B2 (en) * 2001-04-20 2006-06-06 Mitsubishi Denki Kabushiki Kaisha Pointing device and mobile telephone
US7345671B2 (en) * 2001-10-22 2008-03-18 Apple Inc. Method and apparatus for use of rotational user inputs

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0832847A (en) 1994-07-13 1996-02-02 Fuji Photo Film Co Ltd Electronic still camera and its control method
JP3399698B2 (en) 1994-08-23 2003-04-21 株式会社日立製作所 Camera-equipped recording device
JPH08205014A (en) 1995-01-31 1996-08-09 Casio Comput Co Ltd Electronic still camera
JPH08223524A (en) 1995-02-08 1996-08-30 Hitachi Ltd Portable video camera

Also Published As

Publication number Publication date Type
US20050283729A1 (en) 2005-12-22 application
WO2006009692A3 (en) 2008-02-07 application
CN101189567A (en) 2008-05-28 application
JP2008503930A (en) 2008-02-07 application
JP4403260B2 (en) 2010-01-27 grant
WO2006009692A2 (en) 2006-01-26 application
EP1766626A2 (en) 2007-03-28 application
US7222307B2 (en) 2007-05-22 grant

Similar Documents

Publication Publication Date Title
US8194043B2 (en) Mobile communication terminal having multiple displays and a data processing method thereof
US7978176B2 (en) Portrait-landscape rotation heuristics for a portable multifunction device
US7864163B2 (en) Portable electronic device, method, and graphical user interface for displaying structured electronic documents
US20060211454A1 (en) Display apparatus and method for mobile terminal
US20100073303A1 (en) Method of operating a user interface
US20100095205A1 (en) Portable Terminal and Control Method Therefor
US20060247851A1 (en) Mobile phone having a TV remote style user interface
US20090265628A1 (en) Method and apparatus for operating user interface and recording medium using the same
US20100173678A1 (en) Mobile terminal and camera image control method thereof
US7587683B2 (en) Display method, portable terminal device, and display program
US20090315867A1 (en) Information processing unit
US20030048262A1 (en) Method and apparatus for navigation, text input and phone dialing
US20080165148A1 (en) Portable Electronic Device, Method, and Graphical User Interface for Displaying Inline Multimedia Content
US20100169813A1 (en) Method for displaying and operating user interface and electronic device
EP1542437A2 (en) Mobile communication terminal with multi-input device and method of using the same
US8091045B2 (en) System and method for managing lists
US20080256485A1 (en) User Interface for Controlling Video Programs on Mobile Computing Devices
US20110012931A1 (en) Terminal Device With Display Function
US20080055276A1 (en) Method for controlling partial lock in portable device having touch input unit
US20080211778A1 (en) Screen Rotation Gestures on a Portable Multifunction Device
US20090303231A1 (en) Touch Screen Device, Method, and Graphical User Interface for Manipulating Three-Dimensional Virtual Objects
US20080263445A1 (en) Editing of data using mobile communication terminal
US7085590B2 (en) Mobile terminal with ergonomic imaging functions
US20130007653A1 (en) Electronic Device and Method with Dual Mode Rear TouchPad
US20080282179A1 (en) Tab browsing in mobile communication terminal

Legal Events

Date Code Title Description
AS Assignment

Owner name: IPAC ACQUISITION SUBSIDIARY I, LLC, NEW HAMPSHIRE

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:MORRIS, ROBERT P.;SULLIVAN, STEPHEN G.;REEL/FRAME:019319/0719;SIGNING DATES FROM 20040615 TO 20040616

Owner name: SCENERA TECHNOLOGIES, LLC, NEW HAMPSHIRE

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:IPAC ACQUISITION SUBSIDIARY I, LLC;REEL/FRAME:019319/0745

Effective date: 20061102