WO2014065812A1 - User interfaces for hand-held electronic devices - Google Patents

User interfaces for hand-held electronic devices

Info

Publication number
WO2014065812A1
WO2014065812A1 (PCT/US2012/062081)
Authority
WO
WIPO (PCT)
Prior art keywords
held
hand
held device
display
orientation
Prior art date
Application number
PCT/US2012/062081
Other languages
French (fr)
Inventor
Basil BADAWIYEH
Original Assignee
Thomson Licensing
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Thomson Licensing filed Critical Thomson Licensing
Priority to EP12784400.9A priority Critical patent/EP2912537A1/en
Priority to JP2015539565A priority patent/JP6022703B2/en
Priority to US14/433,982 priority patent/US10192527B2/en
Priority to KR1020157010312A priority patent/KR20150073999A/en
Priority to PCT/US2012/062081 priority patent/WO2014065812A1/en
Priority to CN201280076251.7A priority patent/CN104781761A/en
Publication of WO2014065812A1 publication Critical patent/WO2014065812A1/en

Classifications

    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G 5/00 Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G 5/36 Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the display of a graphic pattern, e.g. using an all-points-addressable [APA] memory
    • G09G 5/38 Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the display of a graphic pattern, e.g. using an all-points-addressable [APA] memory, with means for controlling the display position
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/02 Input arrangements using manually operated switches, e.g. using keyboards or dials
    • G06F 3/023 Arrangements for converting discrete items of information into a coded form, e.g. arrangements for interpreting keyboard generated codes as alphanumeric codes, operand codes or instruction codes
    • G06F 3/0233 Character input methods
    • G06F 3/0236 Character input methods using selection techniques to select from displayed items
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 1/00 Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F 1/16 Constructional details or arrangements
    • G06F 1/1613 Constructional details or arrangements for portable computers
    • G06F 1/1626 Constructional details or arrangements for portable computers with a single-body enclosure integrating a flat display, e.g. Personal Digital Assistants [PDAs]
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M 1/00 Substation equipment, e.g. for use by subscribers
    • H04M 1/72 Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M 1/724 User interfaces specially adapted for cordless or mobile telephones
    • H04M 1/72466 User interfaces specially adapted for cordless or mobile telephones with selection means, e.g. keys, having functions defined by the mode or the status of the device
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 2200/00 Indexing scheme relating to G06F1/04 - G06F1/32
    • G06F 2200/16 Indexing scheme relating to G06F1/16 - G06F1/18
    • G06F 2200/161 Indexing scheme relating to constructional details of the monitor
    • G06F 2200/1614 Image rotation following screen orientation, e.g. switching from landscape to portrait mode
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M 2250/00 Details of telephonic subscriber devices
    • H04M 2250/12 Details of telephonic subscriber devices including a sensor for measuring a physical value, e.g. temperature or motion

Definitions

  • the present invention generally relates to user interfaces for hand-held electronic devices, such as mobile telephone devices, touch tablets, personal computers (PC), remote control devices, and/or other devices, and more particularly, to user interfaces for such devices that assist users with, among other things, knowing which user input elements of a device to use when the device's physical orientation changes.
  • Hand-held electronic devices such as mobile telephone devices, touch tablets, personal computers (PC), remote control devices, and/or other devices have the capability of being physically re-orientated (e.g., flipped upside down, switched between a landscape orientation and a portrait orientation, etc.).
  • buttons, keys, etc. are designed for use in the normal orientation.
  • a hand-held device is disclosed.
  • the hand-held device is operative to be held in normal, upside down, left tilt, and right tilt orientations, and comprises processing means such as a processor and display means such as a display.
  • a first physical button is located on a first side of the display when the hand-held device is held in the normal orientation.
  • a second physical button is located on a second side of the display opposite to the first side. The second physical button is positioned diagonally symmetrical to the first physical button when the hand-held device is held in the normal orientation.
  • activation of the first and second physical buttons respectively causes the processor to perform first and second functions when the hand-held device is held in the normal orientation, and respectively causes the processor to perform the second and first functions when the hand-held device is held in the upside down orientation.
  • the processor causes display of first and second virtual buttons via the display, and activation of the first and second virtual buttons respectively causes the processor to perform the first and second functions.
  • the first and second virtual buttons are respectively displayed in respective positions in the display in the one of the left tilt and right tilt orientations respectively corresponding to positions of the first and second physical buttons when the handheld device is held in the normal orientation.
  • a method for operating a device operative to be held in normal, upside down, left tilt, and right tilt orientations comprises: performing first and second functions in response to activation of first and second physical buttons, respectively, when the device is held in the normal orientation, wherein the first physical button is located on a first side of a display of the device when the device is held in the normal orientation, and the second physical button is located on a second side of the display opposite to the first side, the second physical button being positioned diagonally symmetrical to the first physical button when the device is held in the normal orientation; performing the second and first functions in response to activation of the first and second physical buttons, respectively, when the device is held in the upside down orientation; enabling display of first and second virtual buttons via the display when the device is held in one of the left tilt and right tilt orientations; performing the first and second functions in response to activation of the first and second virtual buttons, respectively, when the device is held in one of the left tilt and right tilt
  • FIG. 1 shows a block diagram of relevant portions of a device suitable for implementing exemplary embodiments of the present invention
  • FIG. 2 shows a device in a normal orientation according to an exemplary embodiment of the present invention
  • FIG. 3 shows a device in an upside down orientation according to an exemplary embodiment of the present invention
  • FIG. 4 shows a device in a left tilt orientation according to an exemplary embodiment of the present invention
  • FIG. 5 shows a device in a left tilt orientation according to another exemplary embodiment of the present invention
  • FIG. 6 shows a device in a left tilt orientation according to still another exemplary embodiment of the present invention.
  • FIG. 7 shows a device in a left tilt orientation according to yet another exemplary embodiment of the present invention.
  • FIG. 8 shows a device in a left tilt or right tilt orientation according to still yet another exemplary embodiment of the present invention.
  • FIG. 9 shows a device in a left tilt or right tilt orientation according to still yet a further exemplary embodiment of the present invention.
  • user device 100 is embodied as a hand-held device (e.g., mobile telephone device, touch tablet, personal computer (PC), slate, remote control device, etc.) and/or other type of device.
  • user device 100 comprises input/output (I/O) means such as I/O block 10, control and processing means such as controller 20, user input means such as physical buttons/keys block 30, data storage means such as memory 40, and display means such as display 50.
  • I/O block 10 is operative to perform I/O functions of user device 100.
  • I/O block 10 is operative to receive signals such as audio, video and/or data signals in analog and/or digital modulation format(s) in a wired and/or wireless manner from one or more networks such as terrestrial, cable, satellite, internet and/or other network sources, and to output signals in a wired and/or wireless manner to such one or more networks.
  • I/O block 10 may be embodied as any type of I/O interface capable of receiving wired and/or wireless signals, and may be comprised of one or more individual components (e.g., antenna(s), plug(s), etc.).
  • Controller 20 is operative to perform various signal processing and control functions (e.g., execute software code, etc.) of user device 100 that facilitates and enables performance of the various embodiments and techniques of the present invention described herein.
  • controller 20 receives the signals provided from I/O block 10 and performs and/or enables all necessary processing and control functions associated with user device 100 via one or more microprocessors and/or other element(s).
  • controller 20 is operative to process audio, video and/or data signals provided from I/O block 10 by performing functions including tuning, demodulation, forward error correction, and transport processing functions to thereby generate digital data representing audio, video and/or data content.
  • the digital data produced from such processing functions may be provided for further processing and/or output (e.g., via display 50).
  • controller 20 is operative to perform and/or enable various other functions including, but not limited to, processing user inputs made via physical buttons/keys block 30, controlling functions (e.g., volume and channel control functions, etc.) of user device 100 in response to user inputs, reading and writing data from and to memory 40, enabling on-screen displays (e.g., video, virtual buttons/keys, menus, etc.) via display 50, and/or other operations as may be described herein.
  • controller 20 includes means, such as an accelerometer, gyroscopic sensor and/or other element(s), for detecting the motion and physical orientation of user device 100.
  • Physical buttons/keys block 30 is operative to receive physical user inputs from a user operator of user device 100.
  • physical buttons/keys block 30 comprises a plurality of physical buttons and/or keys that are arranged in a symmetrical and/or other suitable manner around display 50, and may, for example, be configured within and extend from a housing of user device 100. Other types of inputs may also be provided via block 30. Inputs provided via block 30 are provided to controller 20 for processing.
  • Memory 40 is operatively coupled to controller 20 and performs data storage functions of user device 100.
  • memory 40 stores data including, but not limited to, software code and other data associated with one or more computer applications including those described herein, on-screen display data (e.g., virtual buttons/keys, menus, browsers, etc.), user selection/setup data, and/or other data.
  • Display 50 is operative to provide visual displays including video content pursuant to the control of controller 20.
  • display 50 is operative to provide touch-screen capabilities including virtual buttons/keys, and thereby enables a user operator to provide inputs (separate from those provided via physical buttons/keys block 30) that are received and processed by controller 20.
  • Display 50 may be embodied using any type of suitable display device, such as a light emitting diode (LED) display, liquid crystal display (LCD), or other type of display device.
  • display 50 also includes associated aural output means, such as speakers and/or other audio output element(s), the volume of which may be controlled by, for example, the aforementioned physical buttons/keys block 30 and virtual buttons/keys of display 50.
  • buttons/keys block 30 of FIG. 1 may be embodied using physical buttons/keys 30a-30d (also respectively labeled as buttons/keys 1-4 in FIGS. 2-9).
  • FIG. 2 shows a device 100 in a normal orientation according to an exemplary embodiment of the present invention. Specifically, device 100 is positioned in a portrait orientation in FIG. 2.
  • device 100 of FIG. 2 is used to view video programming, where certain buttons thereof (30a and 30c) are used to adjust the volume of the program and other buttons (30b and 30d) are used to change between broadcast channels.
  • An exemplary representation of the functions assigned to buttons 30a-30d in FIG. 2 is shown below:
  • controller 20 of device 100 detects this change in physical orientation and causes the video content (e.g., smiley face, etc.) presented in display 50 to be adjusted accordingly (i.e., flipped upside down). Also according to principles of the present invention, the function of physical buttons/keys 30a-30d is adjusted accordingly in response to device 100 being flipped upside down. Otherwise, a user may have difficulty using device 100 because the current configuration of physical buttons/keys 30a-30d would be reversed, and hence awkward to the user. Accordingly, once controller 20 detects that device 100 has been flipped upside down (i.e., rotated 180 degrees from the normal orientation) as shown in FIG. 3, controller 20 also changes the functions assigned to physical buttons/keys 30a-30d, as shown below:
  • controller 20 causes the video content (e.g., smiley face, etc.) presented in display 50 to be adjusted accordingly (i.e., flipped upside down), and also causes the functionality of physical buttons/keys 30a-30d to be remapped in a user-friendly manner.
  • FIG. 4 shows device 100 in a left tilt or right tilt orientation according to an exemplary embodiment of the present invention. According to principles of the present invention, the left tilt and right tilt orientations each corresponds to a landscape orientation. The physical orientation of device 100 shown in FIG. 4 may be obtained by a user rotating device 100 from the normal orientation (see FIG. 2).
  • the right tilt and left tilt orientations in this specification are defined with respect to the normal orientation. If device 100 is rotated more than 45 degrees and less than 135 degrees counterclockwise with respect to the normal orientation, device 100 is considered to be in the left tilt orientation, and if device 100 is rotated more than 45 degrees and less than 135 degrees clockwise with respect to the normal orientation, device 100 is considered to be in the right tilt orientation.
  • in FIG. 4, virtual buttons/keys 50a-50d are provided in a touch screen area of display 50 and can be touched by a user to adjust the functions of device 100.
  • controller 20 causes virtual buttons/keys 50a-50d to be displayed via display 50 in response to detecting that device 100 has been switched to either one of the left tilt and right tilt orientations, as shown for example in FIG. 4.
  • the area of display 50 dedicated to a video presentation can optionally be scaled down in order to make room for virtual buttons/keys 50a-50d, and/or virtual buttons/keys 50a-50d may each be labeled with a legend (e.g., text, symbol, etc.) indicative of its associated function. Further details of these exemplary features are described later herein.
  • buttons/keys 50a-50d in FIGS. 4-6 when pressed, perform the following functions:
  • buttons/keys 50a-50d are shown in FIG. 4 on the left side of display 50.
  • such virtual buttons/keys 50a-50d can also be positioned in other configurations, such as on the right side, top and/or bottom of display 50.
  • FIG. 5 shows another configuration for display 50 where virtual buttons/keys 50a-50d surround the video image being viewed.
  • FIG. 6 shows still another configuration for display 50 where virtual buttons/keys 50a-50d are positioned in relatively close proximity to the image being viewed.
  • the visual image (e.g., smiley face, etc.) shown in FIGS. 4-6 may be optional and/or redundant to the use of virtual buttons/keys 50a-50d.
  • FIG. 7 shows yet another exemplary configuration for display 50 that does not include any virtual buttons/keys.
  • FIGS. 4-7 show that device 100 is in the left tilt orientation. If device 100 is in the right tilt orientation, the right top and bottom physical buttons are physical buttons 30a and 30b, respectively, and the left top and bottom physical buttons are physical buttons 30c and 30d, respectively. However, the virtual buttons 50a, 50b, 50c, and 50d are visually positioned in the same way as shown in FIGS. 4-6. In effect, a user uses a virtual button located in the same position in either the left tilt or right tilt orientation to perform the same function.
  • the physical buttons on the left and right sides are assigned the same functions regardless of whether device 100 is in right tilt or left tilt orientation.
  • the right top and bottom physical buttons are always assigned the functions of the physical buttons 30d and 30c, respectively, when device 100 is in the normal orientation
  • the left top and bottom physical buttons are always assigned the functions of the physical buttons 30b and 30a, respectively, when device 100 is in the normal orientation.
  • the right top and bottom physical buttons are always assigned the functions of the physical buttons 30a and 30b, respectively, when device 100 is in the normal orientation
  • the left top and bottom physical buttons are always assigned the functions of the physical buttons 30c and 30d, respectively, when device 100 is in the normal orientation.
  • device 100 allows a user to assign the functions of the physical buttons when device 100 is in left tilt and right tilt orientations.
  • FIGS. 8 and 9 show further exemplary embodiments of device 100 in a left tilt or right tilt (landscape) orientation where virtual buttons/keys 50a-50d are each labeled to show the respective functions of physical buttons/keys 30a-30d that surround virtual buttons/keys 50a-50d.
  • virtual buttons/keys 50a-50d would simply be visual indicators rather than executable buttons/keys. Display options for virtual buttons/keys 50a-50d may be designated, for example, by a user during a user set-up process for device 100.
  • buttons/keys 30a-30d and/or virtual buttons/keys 50a- 50d other than those expressly shown and described herein may also be employed according to principles of the present invention.
  • virtual buttons/keys 50a-50d each include some type of label or legend, which indicates the particular function performed by that particular virtual button/key 50a-50d as a touch screen element and/or indicates to the user the direction (e.g., "<-", "->", etc.) of the physical button/key 30a-30d that performs the same or similar respective function.
  • FIG. 9 shows still yet another example of device 100 in a left tilt or right tilt (landscape) orientation.
  • the exemplary embodiment of FIG. 9 includes a physical button/key 30e (i.e., button "5") that can be activated by a user to present a help mode of device 100.
  • controller 20 of device 100 may, for example, enable a display listing the functions of various virtual buttons/keys 50a- 50d based on the labels that are currently presented via display 50 and/or provide other related functions.
  • controller 20 may automatically activate the help mode of device 100 and/or cause virtual buttons/keys 50a-50d to be displayed via display 50 in response to device 100 being shaken and/or upon a detected expiration of a predetermined time period after device 100 is turned on.
  • the present invention provides desirable user interfaces for hand-held electronic devices, such as mobile telephone devices, touch tablets, personal computers (PC), remote control devices, and/or other devices that advantageously assist users with, among other things, knowing which user input elements of a device to use when the device's physical orientation changes.
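The 45/135-degree tilt-classification rule described in the definitions above can be sketched in code. This is an illustrative sketch only, not part of the patent: the function name, the sign convention (positive angles clockwise), and the angle normalization are all assumptions.

```python
def classify_orientation(angle: float) -> str:
    """Classify device orientation from its rotation relative to the normal
    (portrait) orientation, in degrees, positive clockwise (assumed convention).

    Follows the rule above: a rotation of more than 45 and less than 135
    degrees counterclockwise is a left tilt, the same range clockwise is a
    right tilt; otherwise the device is normal or upside down.
    """
    # Normalize any input angle into the half-open range [-180, 180).
    a = ((angle + 180.0) % 360.0) - 180.0
    if -45.0 <= a <= 45.0:
        return "normal"
    if 45.0 < a < 135.0:
        return "right_tilt"   # rotated clockwise
    if -135.0 < a < -45.0:
        return "left_tilt"    # rotated counterclockwise
    return "upside_down"      # at or beyond 135 degrees either way
```

A controller such as controller 20 could feed accelerometer-derived angles through such a classifier before deciding whether to remap physical buttons or display virtual ones.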

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • User Interface Of Digital Computer (AREA)
  • Input From Keyboards Or The Like (AREA)

Abstract

User interfaces for hand-held electronic devices, such as mobile telephone devices, touch tablets, personal computers (PC), remote control devices, and/or other devices, assist users with, among other things, knowing which user input elements of a device to use when the device's physical orientation changes. According to an exemplary embodiment, a hand-held device (100) is operative to be held in normal, upside down, left tilt, and right tilt orientations, and includes: a processor (20); a display (50); a first physical button (30a) located on a first side of the display when the hand-held device is held in the normal orientation; a second physical button (30d) located on a second side of the display opposite to the first side, the second physical button positioned diagonally symmetrical to the first physical button when the hand-held device is held in the normal orientation; and wherein: activation of the first and second physical buttons (30a, 30d) respectively causes the processor to perform first and second functions when the hand-held device is held in the normal orientation, and respectively causes the processor to perform the second and first functions when the hand-held device is held in the upside down orientation; when the hand-held device is held in one of the left tilt and right tilt orientations, the processor causes display of first and second virtual buttons (50a, 50d) via the display, and activation of the first and second virtual buttons respectively causes the processor to perform the first and second functions; and the first and second virtual buttons (50a, 50d) are respectively displayed in respective positions in the display in the one of the left tilt and right tilt orientations respectively corresponding to positions of the first and second physical buttons.

Description

USER INTERFACES FOR HAND-HELD ELECTRONIC DEVICES
BACKGROUND OF THE INVENTION
Field of the Invention
The present invention generally relates to user interfaces for hand-held electronic devices, such as mobile telephone devices, touch tablets, personal computers (PC), remote control devices, and/or other devices, and more particularly, to user interfaces for such devices that assist users with, among other things, knowing which user input elements of a device to use when the device's physical orientation changes.
Background Information
Hand-held electronic devices, such as mobile telephone devices, touch tablets, personal computers (PC), remote control devices, and/or other devices have the capability of being physically re-orientated (e.g., flipped upside down, switched between a landscape orientation and a portrait orientation, etc.).
One problem associated with such devices is that a user may have a difficult time determining which user input elements (e.g., buttons, keys, etc.) of a device to use after the device's physical orientation changes. For example, if a device can be operatively held in different orientations, such as normal, upside down, left tilt and right tilt orientations, but the physical buttons/keys of the device are designed for use in the normal orientation, it may be difficult for users to determine which physical buttons/keys to use when the device is positioned in orientations other than the normal orientation.
Accordingly, there is a need in the art to address the foregoing issues, and thereby provide improved user interfaces for hand-held electronic devices that assist users with, among other things, knowing which user input elements of a device to use when the device's physical orientation changes. The present invention described herein addresses these and/or other issues.
SUMMARY OF THE INVENTION
In accordance with an aspect of the present invention, a hand-held device is disclosed. According to an exemplary embodiment, the hand-held device is operative to be held in normal, upside down, left tilt, and right tilt orientations, and comprises processing means such as a processor and display means such as a display. A first physical button is located on a first side of the display when the hand-held device is held in the normal orientation. A second physical button is located on a second side of the display opposite to the first side. The second physical button is positioned diagonally symmetrical to the first physical button when the hand-held device is held in the normal orientation. Also according to an exemplary embodiment, activation of the first and second physical buttons respectively causes the processor to perform first and second functions when the hand-held device is held in the normal orientation, and respectively causes the processor to perform the second and first functions when the hand-held device is held in the upside down orientation. When the hand-held device is held in one of the left tilt and right tilt orientations, the processor causes display of first and second virtual buttons via the display, and activation of the first and second virtual buttons respectively causes the processor to perform the first and second functions. The first and second virtual buttons are respectively displayed in respective positions in the display in the one of the left tilt and right tilt orientations respectively corresponding to positions of the first and second physical buttons when the handheld device is held in the normal orientation.
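The diagonal remapping described above can be illustrated with a short sketch. This is hypothetical: the patent does not specify an API, and the concrete function assignments (volume and channel, per the FIG. 2 discussion) and the up/down pairing within each function are assumptions for illustration.

```python
# Assumed function assignment in the normal orientation (buttons 30a/30c
# adjust volume and 30b/30d change channels, per the FIG. 2 discussion;
# which button of each pair is "up" is an assumption).
NORMAL_FUNCTIONS = {
    "30a": "volume_up",
    "30b": "channel_up",
    "30c": "volume_down",
    "30d": "channel_down",
}

# Diagonally symmetrical pairs: flipping the device 180 degrees puts each
# button where its diagonal counterpart was (30a <-> 30d, 30b <-> 30c).
DIAGONAL = {"30a": "30d", "30d": "30a", "30b": "30c", "30c": "30b"}

def button_function(button: str, orientation: str) -> str:
    """Return the function a physical button performs in a given orientation."""
    if orientation == "upside_down":
        button = DIAGONAL[button]  # swap with the diagonal counterpart
    return NORMAL_FUNCTIONS[button]
```

Because the mapping is a pure lookup, the remap can be re-evaluated every time the orientation sensor reports a change, with no per-button state to maintain.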
In accordance with another aspect of the present invention, a method for operating a device operative to be held in normal, upside down, left tilt, and right tilt orientations is disclosed. According to an exemplary embodiment, the method comprises: performing first and second functions in response to activation of first and second physical buttons, respectively, when the device is held in the normal orientation, wherein the first physical button is located on a first side of a display of the device when the device is held in the normal orientation, and the second physical button is located on a second side of the display opposite to the first side, the second physical button being positioned diagonally symmetrical to the first physical button when the device is held in the normal orientation; performing the second and first functions in response to activation of the first and second physical buttons, respectively, when the device is held in the upside down orientation; enabling display of first and second virtual buttons via the display when the device is held in one of the left tilt and right tilt orientations; performing the first and second functions in response to activation of the first and second virtual buttons, respectively, when the device is held in one of the left tilt and right tilt orientations; and wherein: the first and second virtual buttons are respectively displayed in respective positions in the display in the one of the left tilt and right tilt orientations respectively corresponding to positions of the first and second physical buttons when the device is held in the normal orientation.
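The method's three cases (normal, upside down, and tilt) can be summarized in a single dispatch sketch. Again hypothetical: the names, the dictionary return shape, and the concrete choice of first and second functions are illustrative assumptions, not part of the claim.

```python
FIRST_FUNCTION = "volume_up"      # assumed function of the first physical button
SECOND_FUNCTION = "channel_down"  # assumed function of the second (diagonal) button

def active_inputs(orientation: str) -> dict:
    """Map the inputs a user should use in each orientation to their functions."""
    if orientation == "normal":
        return {"first_physical": FIRST_FUNCTION,
                "second_physical": SECOND_FUNCTION}
    if orientation == "upside_down":
        # The diagonally symmetrical buttons exchange functions.
        return {"first_physical": SECOND_FUNCTION,
                "second_physical": FIRST_FUNCTION}
    # Left or right tilt: display virtual buttons performing the original
    # functions, positioned where the physical buttons sit in the normal
    # orientation.
    return {"first_virtual": FIRST_FUNCTION,
            "second_virtual": SECOND_FUNCTION}
```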
The aforementioned brief summary of exemplary embodiments of the present invention is merely illustrative of the inventive concepts presented herein, and is not intended to limit the scope of the present invention in any manner.
BRIEF DESCRIPTION OF THE DRAWINGS
The above-mentioned and other features and advantages of this invention, and the manner of attaining them, will become more apparent and the invention will be better understood by reference to the following description of embodiments of the invention taken in conjunction with the accompanying drawings, wherein:
FIG. 1 shows a block diagram of relevant portions of a device suitable for implementing exemplary embodiments of the present invention;
FIG. 2 shows a device in a normal orientation according to an exemplary embodiment of the present invention;
FIG. 3 shows a device in an upside down orientation according to an exemplary embodiment of the present invention;
FIG. 4 shows a device in a left tilt orientation according to an exemplary embodiment of the present invention;
FIG. 5 shows a device in a left tilt orientation according to another exemplary embodiment of the present invention;
FIG. 6 shows a device in a left tilt orientation according to still another exemplary embodiment of the present invention;
FIG. 7 shows a device in a left tilt orientation according to yet another exemplary embodiment of the present invention;
FIG. 8 shows a device in a left tilt or right tilt orientation according to still yet another exemplary embodiment of the present invention; and
FIG. 9 shows a device in a left tilt or right tilt orientation according to still yet a further exemplary embodiment of the present invention.
The exemplifications set out herein illustrate preferred embodiments of the invention, and such exemplifications are not to be construed as limiting the scope of the invention in any manner. For clarity of description, the same reference numbers may be used throughout the following description to represent the same or similar elements of the drawing figures.
DESCRIPTION OF THE PREFERRED EMBODIMENTS
Referring now to the drawings, and more particularly to FIG. 1, a block diagram showing relevant portions of a user device 100 suitable for implementing exemplary embodiments of the present invention is illustrated. According to an exemplary embodiment, user device 100 is embodied as a hand-held device (e.g., mobile telephone device, touch tablet, personal computer (PC), slate, remote control device, etc.) and/or other type of device. As indicated in FIG. 1, user device 100 comprises input/output (I/O) means such as I/O block 10, control and processing means such as controller 20, user input means such as physical buttons/keys block 30, data storage means such as memory 40, and display means such as display 50. Some of the foregoing elements of FIG. 1 may be embodied using one or more integrated circuits (ICs). For clarity of description, certain conventional elements associated with user device 100 such as certain control signals, power signals, and/or other elements may not be shown in FIG. 1.
I/O block 10 is operative to perform I/O functions of user device 100. According to an exemplary embodiment, I/O block 10 is operative to receive signals such as audio, video and/or data signals in analog and/or digital modulation format(s) in a wired and/or wireless manner from one or more networks such as terrestrial, cable, satellite, internet and/or other network sources, and to output signals in a wired and/or wireless manner to such one or more networks. I/O block 10 may be embodied as any type of I/O interface capable of receiving wired and/or wireless signals, and may be comprised of one or more individual components (e.g., antenna(s), plug(s), etc.).
Controller 20 is operative to perform various signal processing and control functions (e.g., execute software code, etc.) of user device 100 that facilitate and enable performance of the various embodiments and techniques of the present invention described herein.
According to an exemplary embodiment, controller 20 receives the signals provided from I/O block 10 and performs and/or enables all necessary processing and control functions associated with user device 100 via one or more microprocessors and/or other element(s). For example, controller 20 is operative to process audio, video and/or data signals provided from I/O block 10 by performing functions including tuning, demodulation, forward error correction, and transport processing functions to thereby generate digital data representing audio, video and/or data content. The digital data produced from such processing functions may be provided for further processing and/or output (e.g., via display 50).
Also according to exemplary embodiments, controller 20 is operative to perform and/or enable various other functions including, but not limited to, processing user inputs made via physical buttons/keys block 30, controlling functions (e.g., volume and channel control functions, etc.) of user device 100 in response to user inputs, reading and writing data from and to memory 40, enabling on-screen displays (e.g., video, virtual buttons/keys, menus, etc.) via display 50, and/or other operations as may be described herein. Also according to exemplary embodiments, controller 20 includes means, such as an accelerometer, gyroscopic sensor and/or other element(s), for detecting the motion and physical orientation of user device 100.
Physical buttons/keys block 30 is operative to receive physical user inputs from a user operator of user device 100. According to an exemplary embodiment, physical buttons/keys block 30 comprises a plurality of physical buttons and/or keys that are arranged in a symmetrical and/or other suitable manner around display 50, and may, for example, be configured within and extend from a housing of user device 100. Other types of inputs may also be provided via block 30. Inputs provided via block 30 are provided to controller 20 for processing.
Memory 40 is operatively coupled to controller 20 and performs data storage functions of user device 100. According to an exemplary embodiment, memory 40 stores data including, but not limited to, software code and other data associated with one or more computer applications including those described herein, on-screen display data (e.g., virtual buttons/keys, menus, browsers, etc.), user selection/setup data, and/or other data.
Display 50 is operative to provide visual displays including video content pursuant to the control of controller 20. According to an exemplary embodiment, display 50 is operative to provide touch-screen capabilities including virtual buttons/keys, and thereby enables a user operator to provide inputs (separate from those provided via physical buttons/keys block 30) that are received and processed by controller 20. Display 50 may be embodied using any type of suitable display device, such as a light emitting diode (LED) display, liquid crystal display (LCD), or other type of display device. Although not expressly shown in FIG. 1, display 50 also includes associated aural output means, such as speakers and/or other audio output element(s), the volume of which may be controlled by, for example, the aforementioned physical buttons/keys block 30 and virtual buttons/keys of display 50.
Referring now to FIGS. 2-9, a device 100 positioned in different physical orientations according to different exemplary embodiments of the present invention is shown. As indicated in FIGS. 2-9, physical buttons/keys block 30 of FIG. 1 may be embodied using physical buttons/keys 30a-30d (also respectively labeled as buttons/keys 1-4 in FIGS. 2-9).
FIG. 2 shows a device 100 in a normal orientation according to an exemplary embodiment of the present invention. Specifically, device 100 is positioned in a portrait orientation in FIG. 2.
According to an exemplary embodiment, device 100 of FIG. 2 is used to view video programming where certain buttons thereof (30a and 30c) are used to adjust the volume of the program and other buttons (30b and 30d) are used to change between the channels that are used for broadcasting programming. An exemplary representation of the functions assigned to buttons 30a-30d in FIG. 2 is shown below:
Button 1 (30a): Raise Volume
Button 2 (30b): Channel Up
Button 3 (30c): Lower Volume
Button 4 (30d): Channel Down
If a user flips device 100 of FIG. 2 upside down as shown in FIG. 3 (which is also a portrait orientation), controller 20 of device 100 detects this change in physical orientation and causes the video content (e.g., smiley face, etc.) presented in display 50 to be adjusted accordingly (i.e., flipped upside down). Also according to principles of the present invention, the function of physical buttons/keys 30a-30d is adjusted accordingly in response to device 100 being flipped upside down. Otherwise, a user may have difficulty using device 100 because the current configuration of physical buttons/keys 30a-30d would be reversed, and hence awkward to the user. Accordingly, once controller 20 detects that device 100 has been flipped upside down (i.e., rotated 180 degrees from the normal orientation) as shown in FIG. 3, controller 20 also changes the functions assigned to physical buttons/keys 30a-30d, as shown below:
Button 1 (30a): Channel Down
Button 2 (30b): Lower Volume
Button 3 (30c): Channel Up
Button 4 (30d): Raise Volume
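The remapping just described can be sketched in code. This is an illustrative model only, not code from the patent: it assumes the volume and channel functions described above for buttons 30a/30c and 30b/30d, and that buttons 1/4 and 2/3 form the two diagonally symmetrical pairs (the claims recite the 30a/30d pair; the 30b/30c pairing is inferred).

```python
# Illustrative sketch (not from the patent) of the portrait-orientation
# button remapping performed by controller 20.

# Functions assigned in the normal orientation (buttons 30a/30c control
# volume, 30b/30d control channel, per the FIG. 2 example).
NORMAL_FUNCTIONS = {
    1: "raise volume",   # 30a
    2: "channel up",     # 30b
    3: "lower volume",   # 30c
    4: "channel down",   # 30d
}

# Assumed diagonally symmetrical pairs: 1<->4 and 2<->3.
DIAGONAL_PARTNER = {1: 4, 4: 1, 2: 3, 3: 2}

def button_function(button, orientation):
    """Function performed by a physical button in a portrait orientation.

    In the upside down orientation each button takes over the function of
    its diagonally symmetrical partner, so the button now occupying a given
    on-screen position keeps that position's original function.
    """
    if orientation == "upside down":
        button = DIAGONAL_PARTNER[button]
    return NORMAL_FUNCTIONS[button]
```

For example, flipping the device makes button 1 (now at button 4's former position) perform the channel-down function.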
As indicated above, when device 100 is switched from a normal orientation to an upside down orientation, as reflected in FIGS. 2 and 3, controller 20 causes the video content (e.g., smiley face, etc.) presented in display 50 to be adjusted accordingly (i.e., flipped upside down), and also causes the functionality of physical buttons/keys 30a-30d to be remapped in a user-friendly manner.

FIG. 4 shows device 100 in a left tilt or right tilt orientation according to an exemplary embodiment of the present invention. According to principles of the present invention, the left tilt and right tilt orientations each corresponds to a landscape orientation. The physical orientation of device 100 shown in FIG. 4 may be obtained by a user rotating device 100 from the normal orientation (see FIG. 2) to the left by 90 degrees (i.e., left tilt), and/or by the user rotating device 100 from the upside down orientation (see FIG. 3) to the right by 90 degrees (i.e., right tilt). However, the right tilt and left tilt orientations in this specification are defined with respect to the normal orientation: if device 100 is rotated more than 45 degrees and less than 135 degrees counterclockwise with respect to the normal orientation, device 100 is considered to be in the left tilt orientation, and if device 100 is rotated more than 45 degrees and less than 135 degrees clockwise with respect to the normal orientation, device 100 is considered to be in the right tilt orientation.

In FIG. 4, instead of having a user rely purely on a remapping of physical buttons/keys 30a-30d, virtual buttons/keys 50a-50d are provided in a touch screen area of display 50 and can be touched by a user to adjust the functions of device 100. According to an exemplary embodiment, controller 20 causes virtual buttons/keys 50a-50d to be displayed via display 50 in response to detecting that device 100 has been switched to either one of the left tilt and right tilt orientations, as shown for example in FIG. 4.
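The 45/135-degree thresholds can be expressed as a simple classification over the device's rotation angle relative to the normal orientation. The sketch below is an assumption-laden illustration: the patent does not fix a sign convention or say how the angle is derived (e.g., from the accelerometer or gyroscopic sensor mentioned earlier), so a positive-clockwise convention and strict threshold comparisons are assumed here.

```python
def classify_orientation(angle_degrees):
    """Classify device orientation from its rotation angle relative to the
    normal (portrait) orientation.

    angle_degrees: signed rotation from normal; positive = clockwise,
    negative = counterclockwise (an assumed convention, not from the patent).
    """
    a = angle_degrees % 360  # normalize to [0, 360)
    if 45 < a < 135:
        return "right tilt"   # more than 45 degrees clockwise
    if 225 < a < 315:
        return "left tilt"    # i.e., more than 45 degrees counterclockwise
    if 135 <= a <= 225:
        return "upside down"
    return "normal"
```

Exact 45-degree boundaries are treated as remaining in the nearer portrait orientation; the patent leaves this boundary behavior unspecified.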
Also according to exemplary embodiments, the area of display 50 dedicated to a video presentation can optionally be scaled down in order to make room for virtual buttons/keys 50a-50d, and/or virtual buttons/keys 50a-50d may each be labeled with a legend (e.g., text, symbol, etc.) indicative of its associated function. Further details of these exemplary features may be referenced later herein.
According to exemplary embodiments, virtual buttons/keys 50a-50d in FIGS. 4-6, when pressed, perform the following functions:

Button 1': Raise Volume
Button 2': Channel Up
Button 3': Lower Volume
Button 4': Channel Down
For purposes of example and explanation, the position of virtual buttons/keys 50a-50d is shown in FIG. 4 on the left side of display 50. However, such virtual buttons/keys 50a-50d can also be positioned in other configurations, such as on the right side, top and/or bottom of display 50.
FIG. 5 shows another configuration for display 50 where virtual buttons/keys 50a-50d surround the video image being viewed. FIG. 6 shows still another configuration for display 50 where virtual buttons/keys 50a-50d are positioned in relatively close proximity to the image being viewed. In this exemplary embodiment, the visual image (e.g., smiley face, etc.) is also scaled down in size and/or resolution to accommodate the positioning of virtual buttons/keys 50a-50d. The use of physical buttons/keys 30a-30d in FIGS. 4-6 may be optional and/or redundant to the use of virtual buttons/keys 50a-50d. FIG. 7 shows yet another exemplary configuration for display 50 that does not include any virtual buttons/keys.
FIGS. 4-7 show device 100 in the left tilt orientation. If device 100 is in the right tilt orientation, the right top and bottom physical buttons are physical buttons 30a and 30b, respectively, and the left top and bottom physical buttons are physical buttons 30c and 30d, respectively. However, virtual buttons 50a, 50b, 50c, and 50d are visually positioned in the same positions as shown in FIGS. 4-6. In effect, a virtual button located in a given position performs the same function regardless of whether device 100 is in the left tilt or right tilt orientation.
Furthermore, in the case that no virtual button is displayed in the right tilt or left tilt orientation, the physical buttons on the left and right sides are assigned the same functions regardless of whether device 100 is in right tilt or left tilt orientation. For example, the right top and bottom physical buttons are always assigned the functions of the physical buttons 30d and 30c, respectively, when device 100 is in the normal orientation, and the left top and bottom physical buttons are always assigned the functions of the physical buttons 30b and 30a, respectively, when device 100 is in the normal orientation. In another embodiment, the right top and bottom physical buttons are always assigned the functions of the physical buttons 30a and 30b, respectively, when device 100 is in the normal orientation, and the left top and bottom physical buttons are always assigned the functions of the physical buttons 30c and 30d, respectively, when device 100 is in the normal orientation. In yet another embodiment, device 100 allows a user to assign the functions of the physical buttons when device 100 is in left tilt and right tilt orientations.
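The first of the positional assignments described above can be sketched as a lookup table. The position names and concrete function strings are illustrative assumptions (the functions take the volume/channel example of FIG. 2, with buttons 30a/30c on volume and 30b/30d on channel); the patent describes the mapping only in terms of which normal-orientation button each position inherits.

```python
# Illustrative sketch of the first embodiment described above: in either
# tilt orientation with no virtual buttons displayed, each on-screen
# position inherits a fixed normal-orientation button's function.
LANDSCAPE_FUNCTIONS = {
    "right top": "channel down",    # function of 30d in the normal orientation
    "right bottom": "lower volume", # function of 30c in the normal orientation
    "left top": "channel up",       # function of 30b in the normal orientation
    "left bottom": "raise volume",  # function of 30a in the normal orientation
}

def landscape_function(position):
    """Function performed by the physical button at a given on-screen
    position while device 100 is in either the left tilt or right tilt
    orientation (identical in both, per the first embodiment)."""
    return LANDSCAPE_FUNCTIONS[position]
```

The second and third embodiments would simply substitute a different table, or one populated from user settings.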
FIGS. 8 and 9 show further exemplary embodiments of device 100 in a left tilt or right tilt (landscape) orientation where virtual buttons/keys 50a-50d are each labeled to show the respective functions of physical buttons/keys 30a-30d that surround virtual buttons/keys 50a-50d. These exemplary embodiments can be used even if display 50 does not include touch screen capabilities. In such embodiments, virtual buttons/keys 50a-50d would simply be visual indicators rather than executable buttons/keys. Display options for virtual buttons/keys 50a-50d may be designated, for example, by a user during a user set-up process for device 100. Configurations for physical buttons/keys 30a-30d and/or virtual buttons/keys 50a-50d other than those expressly shown and described herein may also be employed according to principles of the present invention. As shown in FIGS. 8 and 9, virtual buttons/keys 50a-50d each includes some type of label or legend, which indicates the particular function performed by that particular virtual button/key 50a-50d as a touch screen element and/or indicates to the user the direction (e.g., "<-", "->", etc.) of the physical button/key 30a-30d that performs the same or similar respective function. According to an exemplary embodiment, virtual buttons/keys 50a-50d in FIGS. 8 and 9, when pressed, perform the following functions:
[Table: functions and directional labels of virtual buttons/keys 50a-50d in FIGS. 8 and 9]
FIG. 9 shows still yet another example of device 100 in a left tilt or right tilt (landscape) orientation. Specifically, the exemplary embodiment of FIG. 9 includes a physical button/key 30e (i.e., button "5") that can be activated by a user to present a help mode of device 100. During this mode, controller 20 of device 100 may, for example, enable a display listing the functions of various virtual buttons/keys 50a-50d based on the labels that are currently presented via display 50 and/or provide other related functions.
Various other techniques for activating the help mode of device 100 and/or causing the display of virtual buttons/keys 50a-50d via display 50 may be employed according to the present invention. For example, controller 20 may automatically activate the help mode of device 100 and/or cause virtual buttons/keys 50a-50d to be displayed via display 50 in response to device 100 being shaken and/or upon a detected expiration of a predetermined time period after device 100 is turned on.
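The shake and power-on-timeout triggers just described might be modeled as follows. The class and method names, and the 30-second default, are illustrative assumptions rather than anything specified in the patent.

```python
import time

class HelpModeTrigger:
    """Illustrative sketch of the trigger logic described above: activate the
    help display on a shake event, or once a predetermined time period has
    elapsed after power-on. Threshold values are assumptions."""

    def __init__(self, timeout_seconds=30.0):
        self.power_on_time = time.monotonic()
        self.timeout = timeout_seconds
        self.help_shown = False

    def on_shake(self):
        # Called when the accelerometer reports a shake gesture.
        self._show_help()

    def poll(self):
        # Called periodically by the controller's main loop.
        if time.monotonic() - self.power_on_time >= self.timeout:
            self._show_help()

    def _show_help(self):
        if not self.help_shown:
            # e.g., enable display of labeled virtual buttons/keys 50a-50d.
            self.help_shown = True
```

In device 100, controller 20 would play the role of the polling loop and shake-event source.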
As described above, the present invention provides desirable user interfaces for hand-held electronic devices, such as mobile telephone devices, touch tablets, personal computers (PC), remote control devices, and/or other devices that advantageously assist users with, among other things, knowing which user input elements of a device to use when the device's physical orientation changes. While this invention has been described as having a preferred design, the present invention can be further modified within the spirit and scope of this disclosure. This application is therefore intended to cover any variations, uses, or adaptations of the invention using its general principles. Further, this application is intended to cover such departures from the present disclosure as come within known or customary practice in the art to which this invention pertains and which fall within the limits of the appended claims.

Claims

1. A hand-held device (100) operative to be held in normal, upside down, left tilt, and right tilt orientations, said hand-held device comprising:
a processor (20);
a display (50);
a first physical button (30a) located on a first side of said display when said hand-held device is held in said normal orientation;
a second physical button (30d) located on a second side of said display opposite to said first side, said second physical button positioned diagonally symmetrical to said first physical button when said hand-held device is held in said normal orientation; and wherein:
activation of said first and second physical buttons (30a, 30d) respectively causes said processor to perform first and second functions when said hand-held device is held in said normal orientation, and respectively causes said processor to perform said second and first functions when said hand-held device is held in said upside down orientation;
when said hand-held device is held in one of said left tilt and right tilt orientations, said processor causes display of first and second virtual buttons (50a, 50d) via said display, and activation of said first and second virtual buttons respectively causes said processor to perform said first and second functions; and said first and second virtual buttons (50a, 50d) are respectively displayed in respective positions in said display in said one of said left tilt and right tilt orientations respectively corresponding to positions of said first and second physical buttons (30a, 30d) when said hand-held device is held in said normal orientation.
2. The hand-held device (100) of claim 1, wherein:
said normal and upside down orientations each corresponds to a portrait orientation; and
said left tilt and right tilt orientations each corresponds to a landscape orientation.
3. The hand-held device (100) of claim 1, wherein:
said first function corresponds to a volume control function; and
said second function corresponds to a channel control function.
4. The hand-held device (100) of claim 1, wherein said processor is further operative to reduce a size of a visual image provided via said display when said first and second virtual buttons are displayed.
5. The hand-held device (100) of claim 1, wherein said processor causes said first and second virtual buttons to be displayed in response to said hand-held device being switched from one of said normal and upside down orientations to one of said left tilt and right tilt orientations.
6. The hand-held device (100) of claim 1, wherein said processor causes said first and second virtual buttons to be displayed in response to at least one of: (i) said hand-held device being shaken, and (ii) expiration of a predetermined time period after said hand-held device is turned on.
7. The hand-held device (100) of claim 1, wherein said first and second virtual buttons each includes a label corresponding to one of said first and second functions.
8. A hand-held device (100) operative to be held in normal, upside down, left tilt, and right tilt orientations, said hand-held device comprising:
processing means (20) for processing user inputs to said hand-held device and enabling visual displays;
display means (50) for providing said visual displays;
a first physical button (30a) located on a first side of said display means when said hand-held device is held in said normal orientation; a second physical button (30d) located on a second side of said display means opposite to said first side, said second physical button positioned diagonally symmetrical to said first physical button when said hand-held device is held in said normal orientation; and wherein:
activation of said first and second physical buttons (30a, 30d) respectively causes said processing means to perform first and second functions when said hand-held device is held in said normal orientation, and respectively causes said processing means to perform said second and first functions when said hand-held device is held in said upside down orientation;
when said hand-held device is held in one of said left tilt and right tilt orientations, said processing means causes display of first and second virtual buttons (50a, 50d) via said display means, and activation of said first and second virtual buttons respectively causes said processing means to perform said first and second functions; and
said first and second virtual buttons (50a, 50d) are respectively displayed in respective positions in said display means in said one of said left tilt and right tilt orientations respectively corresponding to positions of said first and second physical buttons (30a, 30d) when said hand-held device is held in said normal orientation.
9. The hand-held device (100) of claim 8, wherein:
said normal and upside down orientations each corresponds to a portrait orientation; and
said left tilt and right tilt orientations each corresponds to a landscape orientation.
10. The hand-held device (100) of claim 8, wherein:
said first function corresponds to a volume control function; and
said second function corresponds to a channel control function.
11. The hand-held device (100) of claim 8, wherein said processing means is further operative to reduce a size of a visual image provided via said display means when said first and second virtual buttons are displayed.
12. The hand-held device (100) of claim 8, wherein said processing means causes said first and second virtual buttons to be displayed in response to said hand-held device being switched from one of said normal and upside down orientations to one of said left tilt and right tilt orientations.
13. The hand-held device (100) of claim 8, wherein said processing means causes said first and second virtual buttons to be displayed in response to at least one of: (i) said hand-held device being shaken, and (ii) expiration of a predetermined time period after said hand-held device is turned on.
14. The hand-held device (100) of claim 8, wherein said first and second virtual buttons each includes a label corresponding to one of said first and second functions.
15. A method for operating a device (100) operative to be held in normal, upside down, left tilt, and right tilt orientations, said method comprising:
performing first and second functions in response to activation of first and second physical buttons (30a, 30d), respectively, when said device is held in said normal orientation, wherein said first physical button (30a) is located on a first side of a display of said device when said device is held in said normal orientation, and said second physical button (30d) is located on a second side of said display opposite to said first side, said second physical button being positioned diagonally symmetrical to said first physical button when said device is held in said normal orientation;
performing said second and first functions in response to activation of said first and second physical buttons (30a, 30d), respectively, when said device is held in said upside down orientation; enabling display of first and second virtual buttons (50a, 50d) via said display when said device is held in one of said left tilt and right tilt orientations;
performing said first and second functions in response to activation of said first and second virtual buttons, respectively, when said device is held in one of said left tilt and right tilt orientations; and wherein:
said first and second virtual buttons (50a, 50d) are respectively displayed in respective positions in said display in said one of said left tilt and right tilt orientations respectively corresponding to positions of said first and second physical buttons (30a, 30d) when said device is held in said normal orientation.
16. The method of claim 15, wherein:
said normal and upside down orientations each corresponds to a portrait orientation; and
said left tilt and right tilt orientations each corresponds to a landscape orientation.
17. The method of claim 15, wherein:
said first function corresponds to a volume control function; and
said second function corresponds to a channel control function.
18. The method of claim 15, further comprising reducing a size of a visual image provided via said display when said first and second virtual buttons are displayed.
19. The method of claim 15, wherein said first and second virtual buttons are displayed in response to said device being switched from one of said normal and upside down orientations to one of said left tilt and right tilt orientations.
20. The method of claim 15, wherein said first and second virtual buttons are displayed in response to at least one of: (i) said device being shaken, and (ii) expiration of a predetermined time period after said device is turned on.
21. The method of claim 15, wherein said first and second virtual buttons each includes a label corresponding to one of said first and second functions.
PCT/US2012/062081 2012-10-26 2012-10-26 User interfaces for hand-held electronic devices WO2014065812A1 (en)

Priority Applications (6)

Application Number Priority Date Filing Date Title
EP12784400.9A EP2912537A1 (en) 2012-10-26 2012-10-26 User interfaces for hand-held electronic devices
JP2015539565A JP6022703B2 (en) 2012-10-26 2012-10-26 User interface for portable electronic devices
US14/433,982 US10192527B2 (en) 2012-10-26 2012-10-26 User interfaces for hand-held electronic devices
KR1020157010312A KR20150073999A (en) 2012-10-26 2012-10-26 User interfaces for hand-held electronic devices
PCT/US2012/062081 WO2014065812A1 (en) 2012-10-26 2012-10-26 User interfaces for hand-held electronic devices
CN201280076251.7A CN104781761A (en) 2012-10-26 2012-10-26 User interfaces for hand-held electronic devices

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/US2012/062081 WO2014065812A1 (en) 2012-10-26 2012-10-26 User interfaces for hand-held electronic devices

Publications (1)

Publication Number Publication Date
WO2014065812A1 true WO2014065812A1 (en) 2014-05-01

Family

ID=47172899

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2012/062081 WO2014065812A1 (en) 2012-10-26 2012-10-26 User interfaces for hand-held electronic devices

Country Status (6)

Country Link
US (1) US10192527B2 (en)
EP (1) EP2912537A1 (en)
JP (1) JP6022703B2 (en)
KR (1) KR20150073999A (en)
CN (1) CN104781761A (en)
WO (1) WO2014065812A1 (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2016122990A (en) * 2014-12-25 2016-07-07 京セラ株式会社 Portable terminal, control program and control method
EP3507970A4 (en) * 2016-12-01 2019-09-25 Samsung Electronics Co., Ltd. Electronic device having combined button
US11797249B2 (en) 2014-09-11 2023-10-24 Samsung Electronics Co., Ltd. Method and apparatus for providing lock-screen

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104598030B (en) * 2015-01-15 2018-03-23 青岛海信电器股份有限公司 A kind of intelligent terminal operating key function automatic adjusting method, device and intelligent terminal
US10607063B2 (en) * 2015-07-28 2020-03-31 Sony Corporation Information processing system, information processing method, and recording medium for evaluating a target based on observers
JP6824047B2 (en) * 2017-01-20 2021-02-03 株式会社クボタ Work vehicle
CN110162372B (en) * 2019-05-24 2023-04-21 Oppo广东移动通信有限公司 Virtual key creation method and related equipment
KR20210045154A (en) * 2019-10-16 2021-04-26 삼성전자주식회사 Electronic device and method for operating key of electronic device according to key input thereof
CN114217732A (en) * 2021-12-13 2022-03-22 深圳Tcl新技术有限公司 Display page switching method and device, storage medium and display equipment

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040233173A1 (en) * 2003-05-20 2004-11-25 Bettina Bryant Keypad for portable electronic devices
WO2007096688A1 (en) * 2006-02-23 2007-08-30 Alon Lotan A display and actuator device
US20070252853A1 (en) * 2006-04-28 2007-11-01 Samsung Electronics Co., Ltd. Method and apparatus to control screen orientation of user interface of portable device
US20090066654A1 (en) * 2007-09-11 2009-03-12 Hon Hai Precision Industry Co., Ltd. Multi orientation user interface and electronic device with same
US20090179854A1 (en) * 2008-01-11 2009-07-16 Apple Inc. Dynamic input graphic display

Family Cites Families (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7800592B2 (en) * 2005-03-04 2010-09-21 Apple Inc. Hand held electronic device with multiple touch sensing devices
US6684087B1 (en) * 1999-05-07 2004-01-27 Openwave Systems Inc. Method and apparatus for displaying images on mobile devices
JP2003005898A (en) * 2001-06-20 2003-01-08 Sharp Corp Key input device
US6824069B2 (en) 2002-01-30 2004-11-30 Howard B. Rosen Programmable thermostat system employing a touch screen unit for intuitive interactive interface with a user
US7401300B2 (en) 2004-01-09 2008-07-15 Nokia Corporation Adaptive user interface input device
JP2005260643A (en) * 2004-03-12 2005-09-22 Matsushita Electric Ind Co Ltd Foldable portable telephone system
JP2005303659A (en) * 2004-04-12 2005-10-27 Sony Corp Remote control system, operation target device, remote control device, and remote control method
US20060176278A1 (en) * 2005-02-10 2006-08-10 Motorola, Inc. Method and system for display orientation
US20070259685A1 (en) * 2006-05-08 2007-11-08 Goran Engblom Electronic equipment with keylock function using motion and method
US8274532B2 (en) * 2006-11-10 2012-09-25 Draeger Medical Systems, Inc. System for adaptively orienting a display image on a device
WO2008152679A1 (en) * 2007-06-13 2008-12-18 Yappa Corporation Portable terminal and input device
US7880722B2 (en) * 2007-10-17 2011-02-01 Harris Technology, Llc Communication device with advanced characteristics
US20090235281A1 (en) * 2008-03-12 2009-09-17 Inventec Corporation Handheld electronic device, input device and method thereof, and display device and method thereof
JP2009258817A (en) * 2008-04-14 2009-11-05 Kenwood Corp Touch panel type operation device and display control method
US20100053089A1 (en) 2008-08-27 2010-03-04 Research In Motion Limited Portable electronic device including touchscreen and method of controlling the portable electronic device
JP2010141524A (en) * 2008-12-10 2010-06-24 Toshiba Corp Electronic appliance
US20100156939A1 (en) 2008-12-22 2010-06-24 Research In Motion Limited Portable electronic device and method of controlling same
US20110087963A1 (en) 2009-10-09 2011-04-14 At&T Mobility Ii Llc User Interface Control with Edge Finger and Motion Sensing
US8621380B2 (en) * 2010-01-06 2013-12-31 Apple Inc. Apparatus and method for conditionally enabling or disabling soft buttons
JP2012089940A (en) * 2010-10-15 2012-05-10 Kyocera Corp Portable electronic device, method of controlling portable electronic device, and input control program
US20120206332A1 (en) * 2011-02-16 2012-08-16 Sony Corporation Method and apparatus for orientation sensitive button assignment

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040233173A1 (en) * 2003-05-20 2004-11-25 Bettina Bryant Keypad for portable electronic devices
WO2007096688A1 (en) * 2006-02-23 2007-08-30 Alon Lotan A display and actuator device
US20070252853A1 (en) * 2006-04-28 2007-11-01 Samsung Electronics Co., Ltd. Method and apparatus to control screen orientation of user interface of portable device
US20090066654A1 (en) * 2007-09-11 2009-03-12 Hon Hai Precision Industry Co., Ltd. Multi orientation user interface and electronic device with same
US20090179854A1 (en) * 2008-01-11 2009-07-16 Apple Inc. Dynamic input graphic display

Non-Patent Citations (1)

Title
See also references of EP2912537A1 *

Cited By (4)

Publication number Priority date Publication date Assignee Title
US11797249B2 (en) 2014-09-11 2023-10-24 Samsung Electronics Co., Ltd. Method and apparatus for providing lock-screen
JP2016122990A (en) * 2014-12-25 2016-07-07 京セラ株式会社 Portable terminal, control program and control method
EP3507970A4 (en) * 2016-12-01 2019-09-25 Samsung Electronics Co., Ltd. Electronic device having combined button
US10620828B2 (en) 2016-12-01 2020-04-14 Samsung Electronics Co., Ltd. Electronic device having combined button

Also Published As

Publication number Publication date
KR20150073999A (en) 2015-07-01
US20150221287A1 (en) 2015-08-06
JP6022703B2 (en) 2016-11-09
CN104781761A (en) 2015-07-15
EP2912537A1 (en) 2015-09-02
JP2015532996A (en) 2015-11-16
US10192527B2 (en) 2019-01-29

Similar Documents

Publication Publication Date Title
US10192527B2 (en) User interfaces for hand-held electronic devices
US11042185B2 (en) User terminal device and displaying method thereof
US20200319774A1 (en) Method and apparatus for displaying picture on portable device
US10198178B2 (en) Electronic apparatus with split display areas and split display method
US8988342B2 (en) Display apparatus, remote controlling apparatus and control method thereof
CN105511675B (en) Touch control method, user equipment, input processing method, mobile terminal and intelligent terminal
EP3023865B1 (en) Portable terminal having display and method for operating same
EP2677741A1 (en) Remote control apparatus and control method thereof
US20110285631A1 (en) Information processing apparatus and method of displaying a virtual keyboard
US20110122085A1 (en) Apparatus and method for providing side touch panel as part of man-machine interface (mmi)
JP2004070492A (en) Display equipped with touch panel, and method of processing information
US20150261253A1 (en) 2015-09-17 Information processing device, information processing method and recording medium
JP2011233064A (en) Information processor and display screen operation method
US10635227B2 (en) Image display device
WO2022068726A1 (en) Method and apparatus for providing control, and electronic device
CN106407027B (en) Information display method of mobile terminal and mobile terminal
CN111601127A (en) Display device and control method thereof
US9417724B2 (en) Electronic apparatus
EP2998838A1 (en) Display apparatus and method for controlling the same
WO2017022031A1 (en) Information terminal device
US20120162262A1 (en) Information processor, information processing method, and computer program product
US20090201259A1 (en) Cursor creation for touch screen
CN114995710A (en) Wearable device interaction method, wearable device and readable storage medium
JP2003076350A (en) Video display device
US20110043459A1 (en) Display integrated with touch control function

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
    Ref document number: 12784400
    Country of ref document: EP
    Kind code of ref document: A1
WWE Wipo information: entry into national phase
    Ref document number: 14433982
    Country of ref document: US
WWE Wipo information: entry into national phase
    Ref document number: 2012784400
    Country of ref document: EP
ENP Entry into the national phase
    Ref document number: 2015539565
    Country of ref document: JP
    Kind code of ref document: A
ENP Entry into the national phase
    Ref document number: 20157010312
    Country of ref document: KR
    Kind code of ref document: A
NENP Non-entry into the national phase
    Ref country code: DE