WO2015108310A1 - User interface for touch devices - Google Patents

User interface for touch devices

Info

Publication number
WO2015108310A1
Authority
WO
WIPO (PCT)
Prior art keywords
user
touch
screen
swipe
area
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/KR2015/000308
Other languages
English (en)
French (fr)
Inventor
Pulkit AGRAWAL
Lovlesh MALIK
Tarun Sharma
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Samsung Electronics Co Ltd
Original Assignee
Samsung Electronics Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Samsung Electronics Co Ltd filed Critical Samsung Electronics Co Ltd
Priority to CN201580005040.8A priority Critical patent/CN105917300B/zh
Priority to US15/110,267 priority patent/US20160328144A1/en
Priority to EP15737245.9A priority patent/EP3097473A4/en
Publication of WO2015108310A1 publication Critical patent/WO2015108310A1/en
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04817 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance using icons
    • G06F3/0482 Interaction with lists of selectable items, e.g. menus
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04847 Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883 Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser for inputting data by handwriting, e.g. gesture or text
    • G06F3/04886 Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
    • G06F2203/00 Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/048 Indexing scheme relating to G06F3/048
    • G06F2203/04808 Several contacts: gestures triggering a specific function, e.g. scrolling, zooming, right-click, when the user establishes several contacts with the surface simultaneously; e.g. using several fingers or a combination of fingers and pen

Definitions

  • the present subject matter relates to touch devices and, particularly but not exclusively, to methods and systems for reconfiguring user interface of touch devices.
  • touch devices have increasingly become popular in consumer electronics, such as mobile communication devices, computing devices, global position system (GPS) navigation units, digital video recorders, and other handheld devices.
  • the touch devices generally include a user interface to facilitate user interactions with application programs running on the touch devices.
  • the user interface facilitates the user interactions by simultaneously displaying a number of user interface (UI) elements to a user and receiving user input through, for example, the user’s finger(s) or a stylus.
  • the UI elements are generally preconfigured and evenly disposed on the entire touch-screen of the touch devices by the manufacturers.
  • the present subject matter relates to systems and methods for dynamic reconfiguration of user interface in touch devices.
  • the methods can be implemented in various touch devices, such as mobile phones, hand-held devices, tablets, netbooks, laptops or other portable computers, personal digital assistants (PDAs), notebooks and other devices that implement a touch-screen or touch-panel.
  • a touch device provides various functionalities, for example, accessing and displaying websites, sending and receiving e-mails, taking and displaying photographs and videos, playing music and other forms of audio, etc. These, and numerous other functionalities, are generally performed by execution of an application on selection of the application’s icon present on the touch device’s user interface. With increasing demands from users for better interaction capabilities and additional functionalities, the touch devices are nowadays configured with touch user interfaces having larger sizes, sometimes even larger than 5 inches.
  • the touch device configured with a larger touch user interface, as displayed on a touch-screen, commonly has user interface (UI) elements arranged on the entire touch-screen of the touch device.
  • the UI elements cannot be scaled and/or positioned as per a user’s desire, which can otherwise help to influence user interactions with the touch device.
  • the touch device does not have the capability to reconfigure the UI elements.
  • the UI elements are generally preconfigured and evenly positioned on the entire touch-screen of the touch devices by the manufacturers. This often gives rise to a situation in which a few UI elements may be preconfigured beyond a single-hand operational capability of the user.
  • the touch device configured with larger size touch user interface is often operated using both hands.
  • a user defines an area on a touch-screen of a touch device within the reach of a user’s hand, and the user interface is dynamically configured so that the UI elements are positioned within the reach of the user’s hand.
  • the user’s hand includes, without any limitation, user’s fingers, user’s thumb or other input devices, such as stylus held by the user.
  • reconfiguration capability of the present subject matter can be provided as an app that can be downloaded from a computer-readable medium and installed on the touch device.
  • the present subject matter facilitates a user to communicate with the touch device and register the extent of his reach on a touch-screen of the touch device by providing a user swipe input on the touch-screen.
  • the touch-screen utilizes its multi-touch capabilities to receive the user swipe input, and thus does not require any additional hardware, such as specialized sensors.
  • the touch-screen of the touch device may receive the user swipe input when the user swipes a user input means, for example, user finger, user thumb or user stylus, from a first edge of the touch-screen to a second edge of the touch-screen, tracing a swipe boundary on the touch-screen.
  • the first edge and the second edge can be either adjacent sides or oppositely lying sides.
  • the touch-screen of the touch device may also receive a user swipe input that does not touch any edge of the touch-screen.
  • in that case, the user may trace the swipe boundary with the user input means from a point nearest to the first edge of the touch-screen to a point nearest to the second edge of the touch-screen. The touch device may then connect each of those points to its respective nearest edge, as sketched below.
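The patent does not give an implementation of this step, but the edge-snapping idea can be illustrated with a small sketch. The following Kotlin is purely hypothetical (the `Point`, `Edge`, `nearestEdge`, and `snapToEdge` names are invented for illustration): it closes a swipe boundary whose endpoints stop short of the screen border by projecting each endpoint onto its nearest edge.

```kotlin
// Hypothetical sketch only; not from the patent disclosure.
data class Point(val x: Float, val y: Float)

enum class Edge { LEFT, TOP, RIGHT, BOTTOM }

// Distance from a point to each edge of a width x height screen;
// the smallest distance wins.
fun nearestEdge(p: Point, width: Float, height: Float): Edge =
    mapOf(
        Edge.LEFT to p.x,
        Edge.RIGHT to width - p.x,
        Edge.TOP to p.y,
        Edge.BOTTOM to height - p.y
    ).minByOrNull { it.value }!!.key

// Projects the endpoint onto its nearest edge, closing the swipe
// boundary when the user's trace stops short of the screen border.
fun snapToEdge(p: Point, width: Float, height: Float): Point =
    when (nearestEdge(p, width, height)) {
        Edge.LEFT -> Point(0f, p.y)
        Edge.RIGHT -> Point(width, p.y)
        Edge.TOP -> Point(p.x, 0f)
        Edge.BOTTOM -> Point(p.x, height)
    }
```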
  • the touch device may include a reconfiguring mechanism to receive the user swipe input by prompting the user to touch a soft-button on the touch-screen for automatically tracing the swipe boundary.
  • such automatic tracing of the swipe boundary is performed based on a swipe history maintained over a pre-defined period of time by the reconfiguring mechanism.
  • initially, the reconfiguring mechanism may not be able to automatically trace the swipe boundary, as it has nothing stored or maintained as the swipe history. Thereafter, in an example, the reconfiguring mechanism may automatically trace the swipe boundary based on the mean value of the previous traces stored in the swipe history, as in the sketch below.
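A minimal sketch of the history-based auto-trace, assuming traces have already been resampled to a common point count; all names are hypothetical and `Point` reuses the type from the previous sketch:

```kotlin
// Hypothetical sketch: average stored swipe traces point-by-point to
// produce an automatic swipe boundary. Returns null when no history
// exists yet, matching the "nothing stored" case described above.
fun meanTrace(history: List<List<Point>>): List<Point>? {
    if (history.isEmpty()) return null
    val n = history.first().size
    require(history.all { it.size == n }) { "traces must share a point count" }
    return (0 until n).map { i ->
        Point(
            history.map { it[i].x }.average().toFloat(),
            history.map { it[i].y }.average().toFloat()
        )
    }
}
```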
  • the touch device determines a user-touchable area.
  • the user-touchable area can be either a user-defined swipe boundary area or a user-defined enclosed area enclosed by the swipe boundary and sides of the touch-screen.
  • the user-defined swipe boundary area is not confined to the actual area touched by the user input means. Specifically, when the user input means touches a specific part of the touch-screen, the touch device determines whether the user has slid or dragged the user input means, for example, from right to left or from left to right, estimates a swipe boundary area based on the specific touched area on the touch-screen, and determines the estimated swipe boundary area as the user-defined swipe boundary area.
  • the user-defined enclosed area is an area enclosed between the first edge of the touch-screen, the second edge of the touch-screen, and the swipe boundary traced by the user swipe input.
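One way to illustrate the enclosed-area computation is to treat the swipe boundary plus the two bounding edges as a closed polygon and apply the shoelace formula. This geometry is an assumption for illustration, not the patent's stated method; it assumes the swipe runs from the bottom edge to the right edge, so the polygon closes through the bottom-right corner (screen y grows downward):

```kotlin
import kotlin.math.abs

// Shoelace formula for the area of a simple polygon.
fun polygonArea(vertices: List<Point>): Float {
    var sum = 0f
    for (i in vertices.indices) {
        val a = vertices[i]
        val b = vertices[(i + 1) % vertices.size]
        sum += a.x * b.y - b.x * a.y
    }
    return abs(sum) / 2f
}

// For a swipe from the bottom edge to the right edge of a
// width x height screen, closing through the bottom-right corner
// yields the user-defined enclosed area.
fun enclosedArea(swipe: List<Point>, width: Float, height: Float): Float =
    polygonArea(swipe + Point(width, height))
```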
  • the touch device dynamically reconfigures the user interface of the touch device within the user-touchable area based on a reconfiguration setting.
  • such reconfiguration of the user interface ensures a single-handed operation of the touch device by reconfiguring the user interface within the user-touchable area.
  • reconfiguring may include, without any limitation, restructuring, rendering, rearranging, readjusting, or repositioning.
  • the reconfiguration of the user interface can be categorized into two categories, namely partial reconfiguration and complete reconfiguration.
  • the reconfiguration setting may be a predefined reconfiguration setting or may be set by the user.
  • in the partial reconfiguration, UI elements lying within the user-touchable area retain their positions on the current UI element screen, while the UI elements lying outside the user-touchable area are reconfigured within the user-touchable area on a next UI element screen. This results in an increase in the number of UI element screens.
  • in the complete reconfiguration, the size of all the UI elements is decreased or optimized to accommodate all the UI elements within the user-touchable area on the current UI element screen.
  • in that case, the number of UI element screens is not increased, as no UI element is reconfigured on a next UI element screen. The two categories are contrasted in the sketch below.
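The two categories can be contrasted at a type level. This sketch only fixes vocabulary reused by the later sketches; `UiElement` and `Screen` are invented models, not the patent's data structures:

```kotlin
// Hypothetical models for the sketches that follow.
data class UiElement(
    val id: String,
    var x: Float,
    var y: Float,
    var scale: Float = 1f // 1f = original size
)

typealias Screen = MutableList<UiElement>

enum class ReconfigurationType {
    PARTIAL,  // out-of-area elements move to the next screen; screen count may grow
    COMPLETE  // all elements are scaled to fit; screen count is unchanged
}
```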
  • the exemplary embodiment of the present subject matter may provide methods and systems for reconfiguring the user interface in a user-touchable area by adjusting the positions, intervals, and layout of the UI elements so that a user may conveniently manipulate the touch device with a single hand.
  • Fig. 1 illustrates a touch device, according to an embodiment of the present subject matter.
  • Fig. 2 illustrates an exemplary user swipe input received on the touch device, according to an embodiment of the present subject matter.
  • Fig. 3 illustrates an exemplary implementation of partial reconfiguration of user interface on the touch device, according to an embodiment of the present subject matter.
  • Fig. 4 illustrates an exemplary implementation of complete reconfiguration of user interface on the touch device, according to an embodiment of the present subject matter.
  • Fig. 5 illustrates a method for dynamic reconfiguration of user interface on the touch device, according to an embodiment of the present subject matter.
  • Fig. 6 illustrates a method for dynamic reconfiguration of user interface based on direction of the user swipe input, according to an embodiment of the present subject matter.
  • Fig. 1 illustrates exemplary components of a touch device 100, in accordance with an embodiment of the present subject matter.
  • the touch device 100 facilitates a user to provide a user swipe input for reconfiguration of user interface (UI) on the touch device 100.
  • the touch device 100 may be implemented as various computing devices, such as but not limited to a mobile phone, a smart phone, a personal digital assistant (PDA), a digital diary, a tablet, a net-book, a laptop computer, and the like.
  • the touch device 100 includes one or more processor(s) 102, I/O interface(s) 104, and a memory 106 coupled to the processor(s) 102.
  • the processor(s) 102 may be implemented as one or more microprocessors, microcomputers, microcontrollers, digital signal processors, central processing units, state machines, logic circuitries, and/or any devices that manipulate signals based on operational instructions. Among other capabilities, the processor(s) 102 is configured to fetch and execute computer-readable instructions stored in the memory 106.
  • the I/O interface(s) 104 may include a variety of software and hardware interfaces, for example, interfaces for peripheral device(s), such as a keyboard, a mouse, and an external memory. Further, the I/O interfaces 104 may facilitate multiple communications within a wide variety of protocol types including, operating system to application communication, inter process communication, etc.
  • the memory 106 can include any computer-readable medium known in the art including, for example, volatile memory, such as static random access memory (SRAM) and dynamic random access memory (DRAM), and/or non-volatile memory, such as read only memory (ROM), erasable programmable ROM, flash memories, hard disks, optical disks, and magnetic tapes.
  • the touch device 100 may include a touch-screen 108.
  • the touch-screen 108 is operable to display images in response to a video signal and is also operable to output a touch signal that indicates a position, on the touch-screen 108, which is touched by a user.
  • the touch signal is generated in response to contact or proximity of a portion of the user’s hand, for example, user’s thumb or user’s finger, with respect to the touch-screen 108.
  • the touch signal can also be generated in response to contact or proximity of an implement, such as a stylus.
  • the touch-screen 108 can be implemented using any one of a number of well-known technologies that are suitable for performing the functions described herein with respect to the present subject matter. Any suitable technology now known or later devised can be employed to implement the touch-screen 108. Exemplary technologies that can be employed to implement the touch-screen 108 include resistive touch sensing, surface acoustic wave touch sensing, capacitive touch sensing, and other suitable technologies.
  • the touch-screen 108 can be positioned on top of a display unit having a user interface.
  • the touch-screen 108 is substantially transparent such that the display on the display unit is visible through the touch-screen 108.
  • the touch-screen 108 and the display unit are sized complementary to one another.
  • the touch-screen 108 can be approximately of the same size as the display unit, and is positioned with respect to the display unit such that a touchable area of the touch-screen 108 and a viewable area of the display unit are substantially coextensive.
  • the touch-screen 108 can be a capacitive touch-screen.
  • the display unit is a liquid crystal display that is operable to output a touch signal in response to a user’s touch on the touch-screen.
  • the touch-screen of the present exemplary embodiment may have a relatively large screen size, compared to a related-art touch-screen.
  • when a touch-screen includes a user-untouchable area, i.e., an area that is untouchable and/or unreachable by the user input means according to the user’s reach, or an area above which the user input means cannot be placed, the present exemplary embodiment is applicable to that touch-screen.
  • the touch device 100 may include module(s) 110 and data 112.
  • the modules 110 and the data 112 may be coupled to the processor(s) 102.
  • the modules 110 include routines, programs, objects, components, data structures, etc., which perform particular tasks or implement particular abstract data types.
  • the modules 110 may also be implemented as signal processor(s), state machine(s), logic circuitries, and/or any other device or component that manipulates signals based on operational instructions.
  • the modules 110 may be computer-readable instructions which, when executed by a processor/processing unit, perform any of the described functionalities.
  • the machine-readable instructions may be stored on an electronic memory device, hard disk, optical disk or other machine-readable storage medium or non-transitory medium.
  • the computer-readable instructions can also be downloaded to a storage medium via a network connection.
  • the module(s) 110 includes a surface area processor 114, a reconfiguration controller 116, including a partial reconfiguration controller 118 and a complete reconfiguration controller 120, and other module(s) 122.
  • the other module(s) 122 may include programs or coded instructions that supplement applications or functions performed by the touch device 100.
  • the data 112 may serve as a repository for storing data that is processed, received, or generated as a result of the execution of one or more modules in the module(s) 110.
  • although the data 112 is shown internal to the touch device 100, it may be understood that the data 112 can reside in an external repository (not shown in the figure), which may be coupled to the touch device 100.
  • the touch device 100 may communicate with the external repository through the I/O interface(s) 104 to obtain information from the data 112.
  • the processor(s) 102 is operable to display a user interface, in preconfigured or predefined mode, on the touch-screen 108 of the touch device 100.
  • the user interface facilitates a user to interact with user interface (UI) elements to execute application programs installed on the touch device 100.
  • the user can interact with the UI elements presented on the user interface by performing a “tap” operation.
  • the “tap” operation on the touch device 100 is a form of gesture.
  • the touch device 100 commonly supports a variety of gesture-based user commands, such as swipe gesture, pinch gesture and spread gesture, to interact with the UI elements presented on the user interface.
  • the user may not be able to interact with a few UI elements positioned away from the reach of the user.
  • the touch device 100 may include a UI reconfiguration mode to allow the user to switch on and switch off the reconfiguration of the UI based on the user’s reach of hand or thumb or finger.
  • the user may activate the UI reconfiguration mode.
  • the touch device 100 prompts the user to provide a user swipe input to reconfigure the existing UI.
  • the user provides the user swipe input on the touch-screen 108.
  • the user swipe input is then utilized by the touch device 100 to register the extent of user’s reach on the touch-screen 108.
  • the touch-screen 108 utilizes its multi-touch capabilities to receive the user swipe input, and thus does not require any additional hardware, such as specialized sensors, to be integrated in the existing touch-screen 108.
  • the touch-screen 108 having multi-touch capabilities can receive the user swipe input when the user keeps a maximum area of the user input means in contact with the touch-screen 108 while providing the user swipe input.
  • the user input means may include user thumb, user finger, or a user stylus.
  • the user input means may be any suitable and/or similar input means, such as any finger of a user or a stylus. It is to be understood that the user input means is not limited to a user’s hand in the present subject matter.
  • Fig. 2 illustrates an exemplary user swipe input 202 received on the touch-screen 108 of the touch device 100, according to an embodiment of the present subject matter.
  • the user swipe input 202 may be received when the user swipes the user input means, for example, user thumb or user finger or user stylus, from a first edge 204-1 of the touch-screen 108 to a second edge 204-2 of the touch-screen 108 tracing a swipe boundary on the touch-screen 108.
  • the touch-screen 108 of the touch device 100 may also receive a user swipe input 202 that does not touch any edge of the touch-screen 108.
  • in that case, the user may trace the swipe boundary with the user input means from a point nearest to the first edge 204-1 of the touch-screen 108 to a point nearest to the second edge 204-2 of the touch-screen 108. The touch device 100 may then connect each of those points to its respective nearest edge.
  • the touch device 100 may include a reconfiguring mechanism to receive the user swipe input 202 by prompting the user to touch a soft-button on the touch-screen 108 for automatically tracing the swipe boundary.
  • such automatic tracing of the swipe boundary is performed based on a swipe history maintained over a pre-defined period of time by the reconfiguring mechanism.
  • initially, the reconfiguring mechanism may not be able to automatically trace the swipe boundary, as it has nothing stored or maintained as the swipe history.
  • first edge 204-1 is represented as a bottom edge and the second edge 204-2 is represented as a side edge.
  • first edge 204-1 can be any side edge and the second edge 204-2 can be a bottom or top edge.
  • first edge 204-1 and the second edge 204-2 can be adjacent edges as represented in Fig. 2
  • first edge 204-1 and the second edge 204-2 can be oppositely lying edges.
  • first edge 204-1 can be one side edge and the second edge 204-2 can be another side edge or a corner point.
  • the side edges can be longitudinal edges or horizontal edges.
  • for example, the first edge 204-1 can be the bottom edge and the second edge 204-2 can be a right edge.
  • alternatively, the first edge 204-1 can be the bottom edge and the second edge 204-2 can be a left edge.
  • the user swipe input 202, received in accordance with the present subject matter, can easily be distinguished from a normal user swipe input by two identifications. Firstly, a large portion of the user input means, for example, the user thumb or the user finger, would be in contact with the touch-screen 108. Secondly, the user swipe input 202 is performed from the first edge 204-1 to the second edge 204-2 of the touch-screen 108, or vice versa. That is, the user swipe input 202 connects the first edge 204-1 of the touch-screen 108 with the second edge 204-2 of the touch-screen 108. It will be understood that other identifications, such as the reconfiguration mode being active, can also be used. The two identifications are illustrated in the sketch below.
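A sketch of the two identifications, with hypothetical thresholds; the contact-size and edge-tolerance constants are assumptions, not values from the patent:

```kotlin
// Hypothetical sketch: classify a trace as a reconfiguration swipe when
// the contact area stays large and both endpoints lie near an edge.
data class TouchSample(val p: Point, val contactMajorPx: Float)

fun isReconfigurationSwipe(
    samples: List<TouchSample>,
    width: Float,
    height: Float,
    minContactPx: Float = 40f,    // assumed "large portion of the thumb"
    edgeTolerancePx: Float = 24f  // assumed closeness to an edge
): Boolean {
    if (samples.size < 2) return false
    val largeContact = samples.map { it.contactMajorPx }.average() >= minContactPx
    fun nearEdge(p: Point) =
        p.x <= edgeTolerancePx || p.y <= edgeTolerancePx ||
            width - p.x <= edgeTolerancePx || height - p.y <= edgeTolerancePx
    return largeContact && nearEdge(samples.first().p) && nearEdge(samples.last().p)
}
```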
  • the user swipe input 202 received in accordance with the present subject matter, defines a user-touchable area.
  • the user-touchable area can be either a user-defined swipe boundary area or a user-defined enclosed area 206 enclosed by the swipe boundary and sides of the touch-screen 108.
  • the user-defined swipe boundary area is not confined to the actual area touched by the user input means. Specifically, when the user input means touches a specific part of the touch-screen 108, the touch device 100 determines whether the user has slid or dragged the user input means, for example, from right to left or from left to right, estimates a swipe boundary area based on the specific touched area on the touch-screen 108, and determines the estimated swipe boundary area as the user-defined swipe boundary area.
  • the user-defined enclosed area 206 is an area enclosed on the touch-screen 108 within the first edge 204-1 of the touch-screen 108, the second edge 204-2 of the touch-screen 108, and the swipe boundary traced by the user swipe input 202.
  • the user-defined swipe boundary area or a user-defined enclosed area 206 can be enclosed between two side edges, one bottom edge, and the user swipe input 202.
  • the surface area processor 114 determines a value of the user-touchable area and compares the determined value of the user-touchable area with a predefined threshold area.
  • the predefined threshold area is defined based on an average length of a human thumb, a human finger, or a stylus. Based on the comparison, in case the value of the user-touchable area is determined to be below the predefined threshold area, the surface area processor 114 may prompt the user to provide the user swipe input 202 again. A sketch of this check is given below.
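A sketch of the threshold check. The quarter-circle heuristic and the 300-pixel reach are assumptions used only to make the example concrete:

```kotlin
import kotlin.math.PI

// Assumed average thumb reach, in pixels; not a value from the patent.
const val AVERAGE_THUMB_REACH_PX = 300f

// Hypothetical threshold: the quarter-circle a thumb anchored at a
// screen corner can sweep.
val thresholdArea: Float =
    (PI * AVERAGE_THUMB_REACH_PX * AVERAGE_THUMB_REACH_PX / 4.0).toFloat()

// Returns false and asks for a new swipe when the area is too small.
fun validateArea(area: Float, promptRetry: () -> Unit): Boolean =
    if (area < thresholdArea) {
        promptRetry()
        false
    } else {
        true
    }
```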
  • the reconfiguration controller 116 makes a decision on what type of reconfiguration of the UI elements is to be executed. The decision depends on the reconfiguration setting for the user interface of the touch device 100.
  • the touch device 100 may include the user-definable reconfiguration setting that enables the user to define the reconfiguring setting for the user interface under two categories, namely partial reconfiguration and complete reconfiguration.
  • the user may define the configuration of the UI based on the direction of the user swipe input 202.
  • for example, the user can define, in the user-definable reconfiguration setting, that the partial reconfiguration is performed when the touch-screen 108 receives the user swipe input 202 in an upward direction, from the first edge 204-1 of the touch-screen 108 to the second edge 204-2 of the touch-screen 108, while the complete reconfiguration is performed when the touch-screen 108 receives the user swipe input 202 in a downward direction, from the second edge 204-2 to the first edge 204-1.
  • alternatively, the user can define the reverse mapping, in which the partial reconfiguration is performed on the downward swipe from the second edge 204-2 to the first edge 204-1, and the complete reconfiguration is performed on the upward swipe from the first edge 204-1 to the second edge 204-2.
  • as a further alternative, the user can receive a prompt upon providing the user swipe input and, in response to the prompt, can select whether a partial reconfiguration or a complete reconfiguration is to be done. A direction-to-category mapping of this kind is sketched below.
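This sketch reuses `Point` and `ReconfigurationType` from the earlier sketches; the default pairing mirrors the first example above, and all names are hypothetical:

```kotlin
// Hypothetical user-definable setting: which category each swipe
// direction triggers.
data class ReconfigurationSetting(
    val onUpwardSwipe: ReconfigurationType = ReconfigurationType.PARTIAL,
    val onDownwardSwipe: ReconfigurationType = ReconfigurationType.COMPLETE
)

// Screen coordinates grow downward, so an upward swipe ends at a
// smaller y than it started.
fun typeForSwipe(start: Point, end: Point, setting: ReconfigurationSetting) =
    if (end.y < start.y) setting.onUpwardSwipe else setting.onDownwardSwipe
```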
  • in case the reconfiguration controller 116 makes a decision to perform the partial reconfiguration of the user interface based on the reconfiguration setting, the partial reconfiguration controller 118 is invoked to perform the partial reconfiguration of the user interface within the user-defined enclosed area 206 enclosed by the user swipe input 202. Thereafter, the partial reconfiguration controller 118 retains the positions of user interface (UI) elements lying within the user-defined enclosed area 206 on a current UI element screen, while reconfiguring the positions of UI elements lying outside the user-defined enclosed area 206 onto a next UI element screen within the user-defined enclosed area 206.
  • in an exemplary implementation, the UI elements may include calculator, voice recorder, phone, contacts, messaging, internet, ChatON, Samsung apps, Samsung Link, WatchON, and Video.
  • other UI elements may include clock, S Planner, Camera, Gallery, Settings, Email, Samsung Hub, and Music.
  • the UI elements lying outside the user-defined enclosed area 206 are reconfigured or moved on to next UI element screen within the user-defined enclosed area 206.
  • as a result, the number of UI element screens containing the UI elements may increase.
  • in the partial reconfiguration, the size of the UI elements is not scaled down to fit into the user-defined enclosed area 206. A sketch of this partitioning is given below.
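A sketch of the partial-reconfiguration partitioning, with `inArea` standing in for a hit test against the enclosed area and `layoutInArea` for whatever layout routine packs the moved elements onto extra screens; both are hypothetical, as are `UiElement` and `Screen` from the earlier sketch:

```kotlin
// Hypothetical sketch of partial reconfiguration: in-area elements keep
// their positions on the current screen; the rest overflow onto next
// screen(s) laid out inside the enclosed area, so the screen count may grow.
fun partialReconfigure(
    current: List<UiElement>,
    inArea: (UiElement) -> Boolean,
    layoutInArea: (List<UiElement>) -> List<Screen>
): List<Screen> {
    val (kept, moved) = current.partition(inArea)
    return listOf(kept.toMutableList()) + layoutInArea(moved)
}
```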
  • in case the reconfiguration controller 116 makes a decision to perform the complete reconfiguration of the user interface based on the reconfiguration setting, the complete reconfiguration controller 120 is invoked to perform the complete reconfiguration of the user interface within the user-defined enclosed area 206. Thereafter, the complete reconfiguration controller 120 optimizes or scales down the size of all user interface (UI) elements to accommodate them within the user-defined enclosed area 206 on a current UI element screen. The optimized or scaled-down UI elements are then reconfigured, or shrunk, within the user-defined enclosed area 206.
  • in the complete reconfiguration, the size of all the UI elements is scaled down to fit all the UI elements within the user-defined enclosed area 206 enclosed on the touch-screen 108.
  • the number of UI element screens on the touch device 100 is not increased, as no UI element is reconfigured or moved to a next UI element screen.
  • the visibility of the elements may, however, be affected due to the scaling down of the size of all the UI elements. A sketch of this scaling is given below.
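A sketch of the complete-reconfiguration scaling, simplified to the bounding box of the enclosed area; the patent's area is a polygon, so fitting to its bounding box is an assumption:

```kotlin
// Hypothetical sketch of complete reconfiguration: uniformly scale and
// translate every element of the current screen into the enclosed
// area's bounding box. No element moves to another screen.
fun completeReconfigure(
    screen: List<UiElement>,
    screenW: Float, screenH: Float,
    areaX: Float, areaY: Float, areaW: Float, areaH: Float
) {
    val s = minOf(areaW / screenW, areaH / screenH) // shrink factor, < 1
    for (e in screen) {
        e.x = areaX + e.x * s
        e.y = areaY + e.y * s
        e.scale *= s // visibility shrinks accordingly
    }
}
```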
  • the reconfiguration of the user interface is performed using technologies known to a person skilled in the art. Such technologies may divide an existing display area for the reconfigured user interface into a plurality of sub-areas and calculate the coordinates of each sub-area. Thereafter, a mapping relationship between the coordinates of the actual display area and the coordinates of the reconfigured display sub-area is determined, so as to display the reconfigured user interface. A sketch of such a coordinate mapping follows.
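A sketch of the sub-area coordinate mapping just mentioned: a touch landing in the reconfigured sub-area is mapped back to the full-screen coordinates the application expects. The linear mapping is an assumption about how such a relationship might be realized:

```kotlin
// Hypothetical sketch: map a touch at sub-area coordinates back to the
// coordinates of the actual (full) display area.
fun mapTouchToOriginal(
    touch: Point,
    areaX: Float, areaY: Float, areaW: Float, areaH: Float,
    screenW: Float, screenH: Float
): Point = Point(
    (touch.x - areaX) * screenW / areaW,
    (touch.y - areaY) * screenH / areaH
)
```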
  • alternatively, any other reconfiguration technique can be implemented to restructure distant user interface (UI) elements within the user-touchable area, which is defined within the reach of the user’s hand, so as to facilitate a single-handed operation.
  • the present subject matter provides convenience to a user for interacting with distant UI elements even when the distant UI elements are positioned beyond a single-hand operational capability of the user.
  • the present subject matter facilitates the mentioned convenience by dynamic reconfiguration of the user interface within the user-touchable area computed based on the user swipe input 202. Such reconfiguration of the user interface ensures that all the UI elements on the user interface are within the reach of the user during a single-handed operation.
  • a portion outside the user-touchable area or user-defined enclosed area 206 of the reconfigured user interface is left unutilized.
  • the said portion outside the user-touchable area or user-defined enclosed area 206 can be used to preview images, videos, contacts, grids of files/folders, or other preview-able files or items.
  • the setting for the said portion can be made through user definable reconfiguration setting of the touch device 100.
  • the reconfigured user interface may include soft-keys representing the functionality of the hard-keys of the touch device 100. This ensures that the user may not have to stretch his hand to reach the hard-keys provided on the top of the touch device 100.
  • Fig. 5 and Fig. 6 illustrate methods 500 and 600 for reconfiguration of user interface on a touch device 100.
  • the order in which the methods 500 and 600 are described is not intended to be construed as a limitation, and any number of the described method blocks can be combined in any order to implement the methods, or alternative methods. Additionally, individual blocks may be deleted from the methods without departing from the spirit and scope of the subject matter described herein.
  • the methods may be described in the general context of computer executable instructions.
  • computer executable instructions can include routines, programs, objects, components, data structures, procedures, modules, functions, etc., that perform particular functions or implement particular abstract data types.
  • the methods may also be practiced in a distributed computing environment where functions are performed by remote processing devices that are linked through a communications network.
  • computer executable instructions may be located in both local and remote computer storage media, including memory storage devices.
  • steps of the methods 500 and 600 can be performed by programmed computers and computing devices.
  • the embodiments are also intended to cover program storage devices, for example, digital data storage media, which are machine- or computer-readable and encode machine-executable or computer-executable programs of instructions, where said instructions perform some or all of the steps of the described methods.
  • the program storage devices may be, for example, digital memories, magnetic storage media, such as a magnetic disks and magnetic tapes, hard drives, or optically readable digital data storage media.
  • the embodiments are also intended to cover both communication networks and computing devices configured to perform said steps of the exemplary methods.
  • a user swipe input 202 is received from a user on a touch-screen 108.
  • the touch-screen 108 of the touch device 100 may receive the user swipe input 202 when the user swipes the user input means, for example, user thumb or user finger or user stylus, from a first edge 204-1 of the touch-screen 108 to a second edge 204-2 of the touch-screen 108, tracing a swipe boundary on the touch-screen 108.
  • the surface area processor 114 of the touch device 100 determines a user-touchable area.
  • the user-touchable area can be either a user-defined swipe boundary area or a user-defined enclosed area 206 enclosed by the swipe boundary and sides of the touch-screen.
  • a reconfiguration controller 116 reconfigures the user interface present on the touch-screen 108 within the user-touchable area based on the reconfiguration setting. Such reconfiguration of the user interface ensures a single-handed operation of the touch device 100 by positioning all the user interface (UI) elements within the user-defined enclosed area 206.
  • Fig. 6 describes the method 600 for reconfiguration of the user interface on the touch device 100, in accordance with one implementation of the present subject matter.
  • a user swipe input 202 is received from a user on a touch-screen 108.
  • the touch-screen 108 of the touch device 100 may receive the user swipe input 202 when the user swipes the user input means, for example, user thumb or user stylus, from a first edge 204-1 of the touch-screen 108 to a second edge 204-2 of the touch-screen 108.
  • the surface area processor 114 of the touch device 100 determines a user-touchable area.
  • the user-touchable area can be either a user-defined swipe boundary area or a user-defined enclosed area 206 enclosed by the swipe boundary and sides of the touch-screen.
  • a reconfiguration controller 116 makes a decision on what type of reconfiguration of the user interface is to be executed. For example, based on the reconfiguration setting, a partial reconfiguration would be performed when the user swipe input 202 is provided in an upward direction, while a complete reconfiguration would be performed when the user swipe input 202 is provided in a downward direction, or vice versa.
  • the reconfiguration of the user interface can be categorized into two categories, namely the partial reconfiguration and the complete reconfiguration, based on the direction of the user swipe input 202.
  • the partial reconfiguration is performed when the reconfiguration controller 116 detects the user swipe input 202 in a direction from the first edge 204-1 of the touch-screen 108 to the second edge 204-2 of the touch-screen 108.
  • the complete reconfiguration is performed when the reconfiguration controller 116 detects the user swipe input 202 in a direction from the second edge 204-2 of the touch-screen 108 to the first edge 204-1 of the touch-screen 108.
  • in case the reconfiguration controller 116 detects that the partial reconfiguration is to be performed, it invokes the partial reconfiguration controller 118 to perform the partial reconfiguration of the user interface within the user-defined enclosed area 206 enclosed by the user swipe input 202.
  • the partial reconfiguration controller 118 retains positions of UI elements lying within the user-defined enclosed area 206 on a current UI element screen. That is, the UI elements lying within the user-defined enclosed area 206 are left unchanged on the touch-screen 108 of the touch device 100.
  • the partial reconfiguration controller 118 reconfigures positions of UI elements lying outside the user-defined enclosed area 206 onto a next UI element screen. That is, the UI elements lying outside the user-defined enclosed area 206 are reconfigured or moved within the user-defined enclosed area 206 on the next UI element screen.
  • the reconfigured user interface is outputted on a display unit of the touch device 100.
  • in case the reconfiguration controller 116 detects that the complete reconfiguration is to be performed, it invokes the complete reconfiguration controller 120 to perform the complete reconfiguration of the user interface within the user-defined enclosed area 206 enclosed by the user swipe input 202.
  • the complete reconfiguration controller 120 optimizes or scales down the size of all UI elements in such a way that the optimized or scaled-down UI elements fit within the user-defined enclosed area 206 on a current UI element screen.
  • the optimized or scaled-down UI elements are reconfigured, or shrunk, within the user-defined enclosed area 206 on the current UI element screen.
  • the reconfigured user interface is outputted on the display unit of the touch device 100.
  • thus, the user interface, or all user interface elements, are positioned within a user-defined enclosed area 206 or a user-touchable area lying within the reach of a user, so as to facilitate a single-handed operation of the touch device 100.
  • since a user-touchable area is set in a display area of the touch-screen and UI elements are reconfigured in the user-touchable area by adjusting the positions and sizes of the UI elements in the touch device, the user experience is enhanced. Furthermore, such reconfiguration of the UI elements may utilize fewer computing resources compared to related-art touch devices, as the reconfigured UI elements utilize a partial area of the touch-screen as the user interface.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
PCT/KR2015/000308 2014-01-20 2015-01-13 User interface for touch devices Ceased WO2015108310A1 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
CN201580005040.8A CN105917300B (zh) 2014-01-20 2015-01-13 User interface for touch devices
US15/110,267 US20160328144A1 (en) 2014-01-20 2015-01-13 User interface for touch devices
EP15737245.9A EP3097473A4 (en) 2014-01-20 2015-01-13 User interface for touch devices

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
IN166/DEL/2014 2014-01-20
IN166DE2014 IN2014DE00166A (en) 2014-01-20 2015-01-13

Publications (1)

Publication Number Publication Date
WO2015108310A1 true WO2015108310A1 (en) 2015-07-23

Family

ID=53543145

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2015/000308 Ceased WO2015108310A1 (en) 2014-01-20 2015-01-13 User interface for touch devices

Country Status (5)

Country Link
US (1) US20160328144A1 (en)
EP (1) EP3097473A4 (en)
CN (1) CN105917300B (zh)
IN (1) IN2014DE00166A (en)
WO (1) WO2015108310A1 (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6653489B2 (ja) * 2016-12-16 2020-02-26 Panasonic Intellectual Property Management Co., Ltd. Input device and input method
CN108833679B (zh) * 2018-05-24 2021-02-23 Vivo Mobile Communication Co., Ltd. Object display method and terminal device
US11385770B1 (en) * 2021-04-21 2022-07-12 Qualcomm Incorporated User interfaces for single-handed mobile device control
US20230281102A1 (en) * 2022-03-07 2023-09-07 Paypal, Inc. Customer journey prediction and recommendation systems and methods
US12079463B1 (en) * 2023-06-29 2024-09-03 Adeia Guides Inc. Methods and systems for positioning display elements

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20110013733A (ko) * 2009-08-03 2011-02-10 Samsung Electronics Co., Ltd. Method and apparatus for providing a user interface
US20110265040A1 (en) * 2010-04-22 2011-10-27 Samsung Electronics Co., Ltd. Method for providing graphical user interface and mobile device adapted thereto
US20120017177A1 (en) * 2010-07-16 2012-01-19 Jungwoo Kim Mobile terminal and method of organizing a menu screen therein
US20130307801A1 (en) * 2012-05-21 2013-11-21 Samsung Electronics Co. Ltd. Method and apparatus of controlling user interface using touch screen
US20140013254A1 (en) * 2012-07-05 2014-01-09 Altaf Hosein System and method for rearranging icons displayed in a graphical user interface

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4741983B2 (ja) * 2006-06-20 2011-08-10 Sharp Corp Electronic device and method of operating electronic device
JP2009110286A (ja) * 2007-10-30 2009-05-21 Toshiba Corp Information processing apparatus, launcher activation control program, and launcher activation control method
WO2010110550A1 (en) * 2009-03-23 2010-09-30 Core Logic Inc. Apparatus and method for providing virtual keyboard
KR101078380B1 (ko) * 2009-03-23 2011-10-31 Core Logic Inc. Apparatus and method for providing a virtual keyboard
KR101364837B1 (ko) * 2009-06-16 2014-02-19 Intel Corp Adaptive virtual keyboard for handheld devices
KR101657117B1 (ко) * 2009-08-11 2016-09-13 LG Electronics Inc. Mobile terminal and control method thereof
JP2011086036A (ja) * 2009-10-14 2011-04-28 Victor Co Of Japan Ltd Electronic device, icon display method, and icon display program
KR101361214B1 (ko) * 2010-08-17 2014-02-10 Pantech Co Ltd Interface apparatus and method for setting a control area of a touch screen
CN102508595B (zh) * 2011-10-02 2016-08-31 Shanghai Liangming Technology Development Co Ltd Method and terminal for touch screen operation
CN102368197A (zh) * 2011-10-02 2012-03-07 Shanghai Liangming Technology Development Co Ltd Method and system for touch screen operation

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP3097473A4 *

Also Published As

Publication number Publication date
US20160328144A1 (en) 2016-11-10
IN2014DE00166A (en) 2015-07-24
CN105917300A (zh) 2016-08-31
CN105917300B (zh) 2020-04-14
EP3097473A1 (en) 2016-11-30
EP3097473A4 (en) 2017-09-13

Similar Documents

Publication Publication Date Title
JP6663453B2 (ja) Navigation application using a side-mounted touchpad
US9740268B2 (en) Intelligent management for an electronic device
US9753607B2 (en) Electronic device, control method, and control program
EP2993566B9 (en) Application interface presentation method and apparatus, and electronic device
CN103955331B (zh) Display processing method and apparatus for application icons
WO2013032234A1 (en) Method of providing of user interface in portable terminal and apparatus thereof
WO2019128732A1 (zh) Icon management method and apparatus
CN104007919B (zh) Electronic device and control method thereof
CN105580024B (zh) Screenshot method and apparatus
WO2019184490A1 (zh) Method, device and storage medium for displaying icons of a hosted application
CN105144066A (zh) Method for adjusting display area and electronic device therefor
CN107025055A (zh) Touch input method and apparatus in a portable device
WO2015108310A1 (en) User interface for touch devices
WO2014129787A1 (en) Electronic device having touch-sensitive user interface and related operating method
WO2013141464A1 (ko) Touch-based input control method
CN107832330A (zh) Search method and terminal device
CN105849683A (зh) Method and device for processing objects provided through a display
US10732719B2 (en) Performing actions responsive to hovering over an input surface
CN105468182A (зh) Virtual keyboard display system and method
WO2019047129A1 (zh) Method and terminal for moving application icons
CN104182120B (зh) Display method and display apparatus for a screen interface
US20210096728A1 (en) Control Method and Electronic Device
CN103279263B (зh) Method and system for quickly creating and editing folders
JPWO2014003012A1 (ja) Terminal device, display control method, and program
CN107291367B (зh) Method and apparatus for using an eraser

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15737245

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 15110267

Country of ref document: US

NENP Non-entry into the national phase

Ref country code: DE

REEP Request for entry into the european phase

Ref document number: 2015737245

Country of ref document: EP

WWE Wipo information: entry into national phase

Ref document number: 2015737245

Country of ref document: EP