CN106909304B - Method and apparatus for displaying graphical user interface - Google Patents
- Publication number
- CN106909304B (application CN201710119581.4A / CN201710119581A)
- Authority
- CN
- China
- Prior art keywords
- contact
- user interface
- graphical user
- sensor
- user
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/1626—Constructional details or arrangements for portable computers with a single-body enclosure integrating a flat display, e.g. Personal Digital Assistants [PDAs]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/1633—Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
- G06F1/1684—Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0354—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/0414—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means using force sensing means to determine a position
- G06F3/04142—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means using force sensing means to determine a position the force sensing means being located peripherally, e.g. disposed at the corners or at the side of a touch sensing plate
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/044—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by capacitive means
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/0485—Scrolling or panning
- G06F3/04855—Interaction with scrollbars
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/14—Digital output to display device ; Cooperation and interconnection of the display device with other functional units
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Computer Hardware Design (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
A method and apparatus for displaying a graphical user interface. A method of displaying a graphical user interface on a display unit of a device includes: detecting a first input; determining at least one graphical user interface element based on an application being executed by the device; displaying the at least one graphical user interface element on the display unit; detecting a second input corresponding to the first input; and performing, in response to the second input, a function based on the at least one graphical user interface element.
Description
The application is a divisional application of an invention patent application with a filing date of September 14, 2009, application number 200910169036.1, and the title "Method and device for displaying a graphical user interface according to a contact mode of a user".
Technical Field
Exemplary embodiments of the present invention relate to a Graphical User Interface (GUI) for an electronic device and, more particularly, to a method and apparatus for displaying a GUI according to a contact pattern of a user.
Background
A touch screen may serve as both a display unit and an input unit. Accordingly, an electronic device having a touch screen may not require an additional display unit or input unit. Due to this advantage, touch screens are widely used in size-limited electronic devices such as, for example, mobile devices (which may also be referred to as portable devices or handheld devices).
Typically, a user may operate the touch screen with one or both hands to command the execution of a desired function or application. When the user uses two hands, one hand holds the device and the other touches the touch screen. However, when the user uses only one hand, fingers (e.g., the thumb) of the holding hand often occlude portions of the touch screen.
Fig. 10A is a schematic example illustrating a user's left thumb selecting one of the menu icons displayed on a touch screen. In this example, if the user touches a particular icon (e.g., a music icon) located in the upper right portion of the touch screen, the thumb may fully or partially occlude some of the other icons (e.g., the game, display, and schedule icons) displayed on the touch screen. Additionally, the thumb may inadvertently contact these occluded icons, so that the functions associated with them may be undesirably performed.
FIG. 10B is another illustrative example showing a user's left thumb touching a scroll bar presented on a touch screen. If the user touches the scroll bar located on the right side of the touch screen, the displayed content (e.g., scene) may be occluded by the thumb. In addition, some of the displayed content may be undesirably touched and accessed by the thumb.
An electronic device equipped with a tactile sensor, rather than a touch screen or keyboard, may allow control of an application only while the user maintains contact with a particular portion of the device. Such a device may provide a display screen with a GUI to guide contact-based input. If the GUI is displayed in a fixed form regardless of the user's contact pattern, some locations in the GUI may fail to register the user's contact. This mismatch may be caused by differences in hand size, finger size, and grip form among users, which makes it difficult to implement a single fixed GUI suitable for all users. If a location in the GUI does not match the point at which the user actually makes contact, a malfunction may occur when the user operates an application on the device.
Disclosure of Invention
Exemplary embodiments of the present invention provide a method and apparatus for displaying a Graphical User Interface (GUI) adapted to the hand with which a user operates the device.
Exemplary embodiments of the present invention also provide an apparatus having a touch screen and a tactile sensor.
Additional features of the invention will be set forth in the description which follows, and in part will be obvious from the description, or may be learned by practice of the invention.
An exemplary embodiment of the present invention discloses a method of displaying a GUI on a display unit in a device including a tactile sensor unit. The method includes: detecting, by the tactile sensor unit, a user's contact; determining a contact pattern based on the detected contact; and displaying the GUI corresponding to the contact pattern.
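As a rough sketch, the three claimed steps could be wired together as follows; the function names and the 'left'/'right' labels are illustrative assumptions, not terms from the claims.

```python
# Hypothetical sketch of the claimed method; all names are illustrative.
def display_gui_for_contact(detect_contact, determine_pattern, render_gui):
    """Run the three claimed steps: detect contact, determine the
    contact pattern, and display the GUI for that pattern."""
    signal = detect_contact()            # tactile sensor unit detects contact
    pattern = determine_pattern(signal)  # e.g. 'left', 'right', or 'both'
    render_gui(pattern)                  # GUI laid out for that pattern
    return pattern
```

In a real device the detection step would be interrupt-driven rather than polled; the sketch only shows the claimed ordering of the steps.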
An exemplary embodiment of the present invention provides an apparatus to display a GUI. The apparatus comprises: a tactile sensor unit configured to create a contact detection signal when detecting contact of a user, wherein the tactile sensor unit includes a left sensor portion and a right sensor portion, each sensor portion having a plurality of sensor components; a display unit configured to display a GUI; a control unit configured to receive the contact detection signal from the tactile sensor unit, determine a contact mode based on the contact detection signal, and control the display unit to display the GUI corresponding to the contact mode.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory and are intended to provide further explanation of the invention as claimed.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this specification, illustrate exemplary embodiments of the invention and together with the description serve to explain the principles of the invention.
Fig. 1A is a block diagram illustrating an internal structure of an apparatus according to an exemplary embodiment of the present invention.
FIG. 1B shows an example of a tactile sensor unit located at the side of the device shown in FIG. 1A, according to an exemplary embodiment of the invention.
Fig. 2 is a flowchart illustrating a method of displaying a GUI according to a hand performing an operation according to an exemplary embodiment of the present invention.
Fig. 3 is a flowchart illustrating an example of detailed processing of a step of determining a hand to operate in the GUI display method illustrated in fig. 2 according to an exemplary embodiment of the present invention.
Fig. 4 is a flowchart illustrating another example of detailed processing of a step of determining a hand to operate in the GUI display method illustrated in fig. 2 according to an exemplary embodiment of the present invention.
Fig. 5 is a flowchart illustrating another example of detailed processing of a step of determining a hand to operate in the GUI display method illustrated in fig. 2 according to an exemplary embodiment of the present invention.
Fig. 6A illustrates an example in which a left hand holds a device and a tactile sensor unit is located at a side of the device according to an exemplary embodiment of the present invention.
Fig. 6B illustrates an example in which the device is held by the right hand and the tactile sensor unit is located at the side of the device according to an exemplary embodiment of the present invention.
Fig. 7A illustrates another example in which a left hand holds a device and a tactile sensor unit is located at a side of the device according to an exemplary embodiment of the present invention.
Fig. 7B illustrates another example in which the device is held by the right hand and the tactile sensor unit is located at the side of the device according to an exemplary embodiment of the present invention.
Fig. 8 illustrates an example of a GUI according to an exemplary embodiment of the present invention.
Fig. 9 illustrates another example of a GUI according to an exemplary embodiment of the present invention.
Fig. 10A is a schematic example illustrating a left thumb of a user selecting one of menu icons displayed on a touch screen.
Fig. 10B is another illustrative example showing a left thumb of a user touching a scroll bar located on a touch screen according to a conventional GUI.
Fig. 11 is a flowchart illustrating a method of displaying a GUI based on a hand performing an operation according to an exemplary embodiment of the present invention.
Fig. 12A illustrates an example of a screen on which menu icons are displayed by a user's contact in an idle screen application according to an exemplary embodiment of the present invention.
Fig. 12B illustrates an example of a screen in which a menu icon displayed by a new contact of the user is changed in an idle screen application according to an exemplary embodiment of the present invention.
Fig. 13A illustrates an example of a screen on which function icons are displayed by user's contact in a camera application according to an exemplary embodiment of the present invention.
Fig. 13B illustrates another example of a screen in which function icons displayed by a user's new contact are changed in a camera application according to an exemplary embodiment of the present invention.
Fig. 14A illustrates an example of a screen on which function icons are displayed by user's contact in an MP3 application according to an exemplary embodiment of the present invention.
Fig. 14B illustrates an example of a screen in which a function icon displayed by a new contact of the user is changed in the MP3 application according to an exemplary embodiment of the present invention.
Detailed Description
The present invention now will be described more fully hereinafter with reference to the accompanying drawings, in which embodiments of the invention are shown. This invention may, however, be embodied in many different forms and should not be construed as limited to the exemplary embodiments set forth herein. Rather, these exemplary embodiments are provided so that this disclosure will be thorough and will fully convey the scope of the invention to those skilled in the art. In the drawings, the size and relative sizes of layers and regions may be exaggerated for clarity. Like reference symbols in the various drawings indicate like elements.
It will be understood that, although the terms first, second, third, etc. may be used herein to describe various elements, components, regions, layers and/or sections, these elements, components, regions, layers and/or sections should not be limited by these terms. These terms are only used to distinguish one element, component, region, layer or section from another element, component, region, layer or section. Thus, a first element, component, region, layer or section discussed below could be termed a second element, component, region, layer or section without departing from the teachings of the present invention.
Spatially relative terms, such as "beneath," "below," "lower," "above," "upper," and the like, may be used herein for ease of description to describe one element or feature's relationship to another element or feature as illustrated in the figures. It will be understood that the spatially relative terms are intended to encompass different orientations of the device in use or operation in addition to the orientation depicted in the figures. For example, if the device in the figures is turned over, elements described as "below" or "beneath" other elements or features would then be oriented "above" the other elements or features. Thus, the exemplary term "below" can encompass both an orientation of above and below. The device may be otherwise oriented (rotated 90 degrees or at other orientations) and the spatially relative descriptors used herein interpreted accordingly.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used herein, the singular forms are intended to include the plural forms as well, unless the context clearly dictates otherwise. It will be further understood that the terms "comprises" and/or "comprising," when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
Unless otherwise defined, all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. It will be further understood that terms, such as those defined in commonly used dictionaries, should be interpreted as having a meaning that is consistent with their meaning in the context of the relevant art and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein.
In other instances, well-known or widely used techniques, elements, mechanisms and processes have not been described or shown in detail to avoid obscuring the spirit of the invention.
Before explaining exemplary embodiments of the present invention, the following terms are defined.
A Graphical User Interface (GUI) may refer to a graphical display disposed on a display (e.g., screen) of an electronic device. The GUI may include at least one window, at least one icon, at least one scroll bar, and any other graphical items used by a user to input commands to the device. It should be understood that exemplary embodiments of the present invention may include various GUIs of various shapes, designs and configurations.
The operating hand may refer to the hand with which a user of the electronic device operates its touch screen. The operating hand may comprise one or more hands performing a touch action on the touch screen. Additionally, the operating hand may include one or more hands in contact with an electronic device having a tactile sensor. The operating hand may be the user's left hand, right hand, or both hands.
The tactile sensor unit, or tactile sensor, may refer to at least one sensor that is sensitive to a user's touch. The tactile sensor unit is distinct from the touch sensor included in the touch screen, and may generally be located on at least one side of the electronic device. When the user holds the device, the tactile sensor unit may detect contact between the user's hand and the device, create a contact detection signal, and transmit the contact detection signal to the control unit. The tactile sensor unit may include at least one tactile sensor that can detect the magnitude of the contact pressure and the location of the contact/pressure. Alternatively, a combination of a pressure sensor and a touch sensor may be used for the tactile sensor unit. The tactile sensor unit may include left and right sensor portions, each of which may include a plurality of sensor components. The tactile sensor unit may be formed on the upper and/or lower side of the device, or may be formed on any and/or all sides of the device.
A sensor component may refer to an element constituting the tactile sensor unit. Each sensor component may independently detect contact by a user. The number of sensor components included in the tactile sensor unit may be determined by the kind or size of the sensor components.
A component group may refer to a set of contact-detecting sensor components arranged consecutively. Component groups may be used to create contact pattern information. The location of a component group, and the number of sensor components it includes, may vary depending on the user's grip form.
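Under the definition above, component groups could be extracted from a sensor portion's readings roughly as follows — a sketch, assuming each sensor component reports a boolean contact flag:

```python
def component_groups(contacts):
    """Group consecutively arranged contact-detecting sensor components.

    `contacts` holds one boolean per sensor component in a sensor
    portion; each run of adjacent True values forms one component group,
    represented here as a list of zero-based component indices.
    """
    groups, current = [], []
    for index, touched in enumerate(contacts):
        if touched:
            current.append(index)
        elif current:
            groups.append(current)   # a gap ends the current group
            current = []
    if current:
        groups.append(current)       # close a group ending at the edge
    return groups
```

For example, a right sensor portion contacted by four fingertips would yield four small groups separated by untouched components, as in the Fig. 6A scenario described later.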
Hereinafter, exemplary embodiments of the present invention will be described in detail with reference to the accompanying drawings.
Fig. 1A is a block diagram illustrating an internal structure of an apparatus according to an exemplary embodiment of the present invention.
Referring to fig. 1A, the apparatus 100 may be a mobile communication terminal, a portable terminal such as a Personal Digital Assistant (PDA), a computer, a TV, or any other electronic device having a touch screen. The device 100 may include a tactile sensor unit 110, a memory unit 120, a touch screen 130, and a control unit 140.
The tactile sensor unit 110 may detect contact of a user's hand while the user holds the device 100. The tactile sensor unit 110 may detect the magnitude of the contact pressure and the location of the contact/pressure. The tactile sensor unit 110 may include a tactile sensor, a touch screen, and/or a combination of a pressure sensor and a touch screen. The tactile sensor unit 110 may be located at a side of the apparatus 100, but is not limited thereto. In some exemplary embodiments, the tactile sensor units 110 may be located on each face of the device 100.
Fig. 1B shows an example of the tactile sensor unit 110 located at the side of the device 100. After detecting the contact of the user's hand, the tactile sensor unit 110 may transmit a contact detection signal to the control unit 140. The tactile sensor unit 110 may include left and right sensor portions, each of which may include a plurality of sensor components.
The memory unit 120 may store a plurality of programs required to perform the functions of the apparatus 100 and data created when those functions are performed. The memory unit 120 may store data related to determining the operating hand, as well as contact pattern information.
The touch screen 130 may display information and may receive user input. The touch screen 130 may include a display unit 132 and a touch sensor unit 134.
The display unit 132 may be formed of a Liquid Crystal Display (LCD) or any other suitable type of display. The display unit 132 may provide a plurality of graphic information related to the state and operation of the apparatus 100. The display unit 132 may display a GUI according to a hand performing an operation. In some cases, the display unit 132 may display the GUI according to the position of the finger of the user holding the device 100.
The touch sensor unit 134 may determine whether a user's hand touches the touch screen. The touch sensor unit 134 may be formed of a capacitive touch sensor, a pressure sensor, and/or any other suitable touch sensitive sensor. When the touch of the user's hand is detected, the touch sensor unit 134 may transmit a touch signal to the control unit 140. The touch signal may include coordinate data representing a touch position of the user. In some cases, touch sensor unit 134 may not be included in device 100.
The control unit 140 may control the state and operation of one or more elements of the apparatus 100. For example, the control unit 140 may receive a contact detection signal from the tactile sensor unit 110, and may determine a contact pattern of the user by using the contact detection signal. Accordingly, the control unit 140 may command the display unit 132 to display the GUI according to the user's contact mode.
Fig. 2 is a flowchart illustrating a method of displaying a GUI according to a hand performing an operation according to an exemplary embodiment of the present invention.
Referring to fig. 2, the tactile sensor unit 110 may detect a user's contact (S210). The user's contact may be the result of the user holding the device 100. When the user's contact is detected, the tactile sensor unit 110 may transmit a contact detection signal to the control unit 140.
The control unit 140 may receive a contact detection signal from the tactile sensor unit 110 (S220). Then, the control unit 140 may determine the contact pattern of the user, and thus may determine the hand performing the operation (S230). Step S230 is shown in detail in fig. 3, 4 and 5.
After step S230, the control unit 140 may command the touch screen 130 to display a GUI according to the hand performing the operation (S240). Then, the control unit 140 may determine whether an additional contact detection signal is received from the tactile sensor unit 110 (S250). If the control unit 140 determines that the additional contact detection signal is received, the method of displaying the GUI may return to step S230 to re-determine the contact mode of the user and also re-determine the hand performing the operation. When the user changes the grip form, an additional contact detection signal may be provided by the tactile sensor unit 110.
The display unit 132 may maintain the current GUI if the control unit 140 does not receive an additional contact detection signal from the tactile sensor unit 110. Subsequently, the user can operate the GUI displayed on the touch screen 130 to input a command to the apparatus 100.
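The Fig. 2 flow, including the re-determination at step S250, could be sketched as a loop over contact detection signals. The names are hypothetical; a real control unit would be event-driven rather than iterating over a list.

```python
def gui_update_loop(signals, determine_hand, display):
    """Re-determine the operating hand and redisplay the GUI each time
    an additional contact detection signal arrives (steps S230-S250).
    Once no further signal arrives, the last displayed GUI is kept."""
    history = []
    for signal in signals:             # initial signal plus any additional ones
        hand = determine_hand(signal)  # S230: contact pattern -> operating hand
        display(hand)                  # S240: display GUI for that hand
        history.append(hand)
    return history                     # sequence of GUIs shown
```

This mirrors the flowchart: every new grip (new signal) loops back to S230, while silence leaves the current GUI in place.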
Fig. 3 is a flowchart illustrating an example of detailed processing of step S230 illustrated in fig. 2 according to an exemplary embodiment of the present invention.
Referring to fig. 3, the control unit 140 may generate at least one component group based on the contact detection signal received from the tactile sensor unit 110 (S310). As described above, a component group may refer to a set of contact-detecting sensor components arranged consecutively.
Exemplary embodiments of sensor assemblies and groups of assemblies are shown in fig. 6A, 6B, 7A, and 7B.
Fig. 6A shows an example in which the user holds the device 100 with the left hand. The tactile sensor unit 110 may be located at the sides of the device 100, with the left sensor portion on the left side of the device 100 and the right sensor portion on the right side. Each sensor portion may include a plurality of sensor components. The number of sensor components varies with their size: the smaller the sensor components, the more of them may be disposed on a side of the device 100. In fig. 6A, for example, each sensor portion may include 23 sensor components. In the left sensor portion, the marked components represent those detecting contact with the left hand (e.g., the palm). In the right sensor portion, the marked components represent those detecting contact with the fingers of the left hand (e.g., the four fingers other than the thumb). The contact-detecting components may be grouped in the order in which they are arranged. For example, nine consecutively arranged components in the left sensor portion may form one group, while in the right sensor portion four pairs of components (two components each) may form four groups.
Returning to fig. 3, after the step S310 of generating the component group, the control unit 140 may create contact pattern information based on the component group (S320). Thus, the contact pattern information may vary based on how the user holds the device 100. The contact pattern information may include, for example, the number of component groups in each sensor section, the positions of the component groups, the intervals between the component groups, the number of sensor components in each component group, and/or pressure detection data of each sensor component.
Referring to fig. 6A, the contact pattern information of the left sensor portion may include the following data: one component group of nine sensor components, located, for example, from the 12th to the 20th sensor component. The contact pattern information of the right sensor portion may include the following data: four component groups of two sensor components each, these eight sensor components being located, for example, at the 4th, 5th, 9th, 10th, 14th, 15th, 19th, and 20th component positions. Three sensor components may lie between two adjacent component groups.
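Summarizing component groups into the fields listed above could look like this. The field names are illustrative, and indices are zero-based, unlike the 1-based counting used in the text:

```python
def pattern_info(groups):
    """Summarize a sensor portion's component groups into contact
    pattern information: number of groups, start positions, group
    sizes, and the number of components between adjacent groups."""
    return {
        "group_count": len(groups),
        "positions": [group[0] for group in groups],          # group start indices
        "sizes": [len(group) for group in groups],            # components per group
        "intervals": [nxt[0] - prev[-1] - 1                   # gap between groups
                      for prev, nxt in zip(groups, groups[1:])],
    }
```

For the right sensor portion of Fig. 6A (four pairs starting at zero-based indices 3, 8, 13, and 18), this yields four groups of size two with three components between adjacent groups, consistent with the counts given above.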
Returning to fig. 3, the control unit 140 may retrieve the stored contact pattern information from the memory unit 120 (S330). The memory unit 120 may store contact pattern information, and may generally store different contact pattern information corresponding to different grip types. The contact pattern information stored in the memory unit 120 may include, for example, the number of component groups in each sensor section, the positions of the component groups, the intervals between the component groups, the number of sensor components in each component group, and/or pressure detection data of each sensor component.
The control unit 140 may sequentially compare the created contact pattern information with the retrieved contact pattern information (S340). For example, the control unit 140 may perform individual comparison of the number of component groups in each sensor section, the positions of the component groups, the intervals between the component groups, the number of sensor components in each component group, and/or the pressure detection data of each sensor component.
The control unit 140 may also determine whether the created contact pattern information is within a range related to the retrieved contact pattern information (S350). When the created information corresponds exactly to the retrieved information, the control unit 140 may determine that the created information is within the range. If an allowed margin was previously assigned to the contact pattern information stored in the memory unit 120, created information that falls within that margin may likewise be determined to be within the range. Allowed margins may be assigned individually to, for example, the number of component groups in each sensor portion, the positions of the component groups, the intervals between the component groups, the number of sensor components in each component group, and/or the pressure detection data of each sensor component.
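The field-by-field comparison of steps S340-S350 can be sketched as below: each field of the created pattern is checked against the stored pattern within an allowed margin. The field names and margin values here are illustrative assumptions, not values from the patent.

```python
# Stored pattern and margins for a hypothetical left-hand grip (Fig. 6A-like).
STORED_LEFT_HAND_GRIP = {
    "left_group_count": 1, "right_group_count": 4,
    "left_group_sizes": [9], "right_group_sizes": [2, 2, 2, 2],
}
ALLOWED_MARGINS = {
    "left_group_count": 0, "right_group_count": 0,
    "left_group_sizes": 2, "right_group_sizes": 1,
}

def within_range(created, stored, margins):
    """Return True if every created field matches the stored field
    to within its allowed margin (step S350)."""
    for key, stored_value in stored.items():
        margin = margins[key]
        created_value = created[key]
        if isinstance(stored_value, list):
            if len(created_value) != len(stored_value):
                return False
            if any(abs(c - s) > margin
                   for c, s in zip(created_value, stored_value)):
                return False
        elif abs(created_value - stored_value) > margin:
            return False
    return True

created = {"left_group_count": 1, "right_group_count": 4,
           "left_group_sizes": [8], "right_group_sizes": [2, 2, 1, 2]}
print(within_range(created, STORED_LEFT_HAND_GRIP, ALLOWED_MARGINS))  # True
```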
If the created contact pattern information is within the range of the retrieved contact pattern information, the control unit 140 may determine the hand performing the operation corresponding to the created contact pattern information (S360). The memory unit 120 may store information indicating which operating hand corresponds to each set of contact pattern information, and the determined hand may be the left hand or the right hand.
If the created contact pattern information does not belong to the range of the retrieved contact pattern information, the control unit 140 may determine that the operating hands are both hands (S370). After determining the hand performing the operation, the control unit 140 may return to the previous step S240 of displaying the GUI according to the hand performing the operation.
Fig. 4 is a flowchart illustrating another example of the detailed process of step S230 illustrated in fig. 2 according to an exemplary embodiment of the present invention.
Referring to fig. 4, the control unit 140 may generate at least one component group (S410), which may be a set of sensor components sequentially arranged among the contact detecting sensor components. Then, the control unit 140 may calculate the number of sensor components included in each component group (S420). For example, as exemplarily shown in fig. 6A, one component group in the left sensor section may have 9 sensor components, and each of four component groups in the right sensor section may have two sensor components.
After calculating the number of sensor components in each component group, the control unit 140 may determine which component group has the most contact-detecting sensor components and which sensor portion contains it (S430). That is, since the largest component group may be in either the left or the right sensor portion, the control unit 140 may determine at step S430 whether the sensor portion containing it (the largest sensor portion) is the left or the right one. For example, referring to fig. 6A, the control unit 140 may determine that the largest component group has nine sensor components and is located in the left sensor portion. In another example, shown in fig. 6B, the largest component group may have nine sensor components and be located in the right sensor portion, so the control unit 140 may determine that the largest sensor portion is the right sensor portion. Similarly, the largest sensor portion may be the left sensor portion in fig. 7A and the right sensor portion in fig. 7B.
If the largest sensor part is the left sensor part, the control unit 140 may further determine whether the left sensor part has an additional component group (S440). An additional component group may refer to one or more component groups that are located in the largest sensor portion but are not the largest sensor component group. In fig. 6A, for example, the left sensor section, which is the largest sensor section, may have the largest component group, but no additional component group. However, in fig. 7A, the left sensor portion having the largest group of components may have one additional group of components including three sensor components from the 3 rd sensor component to the 5 th sensor component.
If there is no additional component group, as shown in fig. 6A, the control unit 140 may determine that the hand performing the operation is the left hand (S450). The largest component group may be considered to be in contact with the palm of the left hand, and the absence of an additional component group may indicate that the thumb of the left hand is not in contact with the tactile sensor unit 110. In this case, the control unit 140 may determine that the user operates the touch screen 130 with the thumb of the left hand; that is, the user holds the device 100 and touches the touch screen 130 with the same left hand. The control unit 140 may therefore determine that the hand performing the operation is the left hand.
Similar steps may be performed if the hand performing the operation is determined to be the right hand. For example, if the largest sensor part is the right sensor part, the control unit 140 may determine whether the right sensor part has an additional component group (S460). If the right sensor part does not have the additional component group as shown in fig. 6B, the control unit 140 may determine that the hand performing the operation is the right hand (S470).
If the largest sensor portion does have an additional component group, the control unit 140 may determine that the operation is likely performed with both hands (S480). The presence of the additional component group may indicate that the thumb of the hand holding the device is in contact with the tactile sensor unit 110, so the user may operate the touch screen 130 with the hand that is not holding the device. Therefore, the control unit 140 may determine that the hands performing the operation are both hands.
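The Fig. 4 decision procedure (S410-S480) can be summarized in a short sketch. This is one illustrative reading of the flowchart, not the patent's code: the largest component group is taken as the holding palm, and an additional group on the same side is taken as the thumb, implying two-handed operation.

```python
def determine_operating_hand(left_group_sizes, right_group_sizes):
    """Fig. 4 sketch: find the side holding the largest component group
    (the palm); an additional group on that side means the thumb also
    touches the side sensor, so the other hand operates the screen."""
    sides = [("left", size) for size in left_group_sizes] + \
            [("right", size) for size in right_group_sizes]
    largest_side, _ = max(sides, key=lambda pair: pair[1])
    groups_on_largest_side = (left_group_sizes if largest_side == "left"
                              else right_group_sizes)
    if len(groups_on_largest_side) > 1:  # additional group: thumb contact
        return "both"
    return largest_side

print(determine_operating_hand([9], [2, 2, 2, 2]))     # Fig. 6A: left
print(determine_operating_hand([2, 2, 2, 2], [9]))     # Fig. 6B: right
print(determine_operating_hand([9, 3], [2, 2, 2, 2]))  # Fig. 7A: both
```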
Fig. 5 is a flowchart illustrating another example of the detailed process of step S230 illustrated in fig. 2 according to an exemplary embodiment of the present invention.
Referring to fig. 5, the control unit 140 may generate at least one component group (S510), which may be a set of sensor components sequentially arranged among the contact detecting sensor components. The control unit 140 may calculate the number of component groups in each sensor part (S520). In fig. 6A, 6B, 7A, and 7B, the sensor portions may be left and right sensor portions. In some cases, the control unit 140 may simultaneously count the number of component groups in the left and right sensor sections.
For example, in fig. 6A, the number of component groups of the left sensor section may be 1, and the number of component groups of the right sensor section may be 4. In fig. 6B, the number of component groups of the left sensor section may be 4, and the number of component groups of the right sensor section may be 1. In fig. 7A, the number of component groups of the left sensor section is 2, and the number of component groups of the right sensor section is 4. In fig. 7B, the number of component groups of the left sensor section may be 4, and the number of component groups of the right sensor section may be 2.
The control unit 140 may determine whether the number of component groups in the left sensor portion is three or more and whether the number of component groups in the right sensor portion is one or less (S530). If so, the control unit 140 may determine that the hand performing the operation is the user's right hand (S540). Three or more component groups in the left sensor portion may indicate that at least three fingers other than the thumb are in contact with the left sensor portion. One component group or none in the right sensor portion may indicate that the palm of the user's right hand is in contact with the right sensor portion while the thumb is not. In this case, the control unit 140 may determine that the user operates the touch screen with the right thumb; that is, the user's right hand both holds the device 100 and touches the touch screen 130. Accordingly, the control unit 140 may determine that the hand performing the operation is the right hand.
Similarly, to determine whether the hand performing the operation is the left hand, the control unit 140 may determine whether the number of component groups in the right sensor portion is three or more and the number of component groups in the left sensor portion is one or less (S550). If so, the control unit 140 may determine that the hand performing the operation is the left hand.
If the answers at steps S530 and S550 are both no, the control unit 140 may determine that the operating hands are both hands (S570).
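The Fig. 5 decision (S510-S570) uses only the number of component groups on each side. The sketch below is an illustrative reading of that flow under the thresholds stated above, not the patent's exact implementation.

```python
def operating_hand_from_group_counts(left_count, right_count):
    """Fig. 5 sketch (S530-S570): classify the operating hand from the
    number of component groups detected on each side sensor portion."""
    if left_count >= 3 and right_count <= 1:
        return "right"  # right palm holds; right thumb operates the screen
    if right_count >= 3 and left_count <= 1:
        return "left"   # left palm holds; left thumb operates the screen
    return "both"

print(operating_hand_from_group_counts(1, 4))  # Fig. 6A: left
print(operating_hand_from_group_counts(4, 1))  # Fig. 6B: right
print(operating_hand_from_group_counts(2, 4))  # Fig. 7A: both
```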
Fig. 8 illustrates an example of a GUI according to an exemplary embodiment of the present invention.
Fig. 8 shows an exemplary embodiment of a display screen 810 with menu icons when the left hand is determined to be the operating hand. The control unit 140 may arrange the menu icons from the upper left corner to the lower right corner of the display screen 810 to correspond to the movement path of the left thumb. The user may then select an icon by touching the display screen 810 with the left thumb to perform the function corresponding to that icon. Since the menu icons are arranged along the movement path of the left thumb, the icons may not be obscured by the thumb, and undesired touches of the icons may be prevented.
FIG. 8 also shows an exemplary embodiment of a display screen 820 when the right hand is determined to be the operating hand. In this case, the control unit 140 may arrange the menu icons along the movement path of the right thumb. In other cases, as shown in the display screen 830 in fig. 8, when both hands are determined to be the operating hands, the control unit 140 may maintain a normal GUI, which may vary according to the user's intention.
Fig. 9 illustrates another example of a GUI according to an exemplary embodiment of the present invention.
FIG. 9 illustrates an exemplary embodiment of a display screen 910 having a scroll bar when the left hand is determined to be the operating hand. The control unit 140 may arrange the scroll bar along the left side of the display screen 910 to correspond to the movement path of the left thumb. Thus, the user can move the scroll bar up or down by dragging it with the left thumb. Since the scroll bar is arranged along the left side, the displayed content may not be obscured by the thumb, and undesired touches of the displayed content may be prevented.
FIG. 9 also shows an exemplary embodiment of a display screen 920 having a scroll bar when the right hand is determined to be the operating hand. In this case, the control unit 140 may arrange the scroll bar along the right side of the display screen 920 to correspond to the movement path of the right thumb. Thus, the user can drag the scroll bar with the right thumb without obscuring or unintentionally touching the displayed content. In other cases, as shown in the display screen 930 of fig. 9, when both hands are determined to be the operating hands, the control unit 140 may maintain a normal GUI.
Fig. 11 is a flowchart illustrating a method of displaying a GUI based on a hand performing an operation according to an exemplary embodiment of the present invention. For example, when the user inputs a command to the apparatus 100 using the tactile sensor unit 110, the method described with reference to fig. 11 may be applied.
Referring to fig. 11, the tactile sensor unit 110 may detect a user's contact (S1110). The user's contact may correspond to the user's holding of the device 100. For example, a user may hold device 100 with one hand (as shown in fig. 12A and 12B) or with both hands (as shown in fig. 13A and 13B). When the user's contact is detected, the tactile sensor unit 110 may transmit a contact detection signal including information on the position and pressure of the contact to the control unit 140.
The control unit 140 may receive a contact detection signal from the tactile sensor unit 110 (S1120). Next, the control unit 140 may determine the user's contact pattern from the contact detection signal (S1130). The contact pattern may be determined based on the information about the position and pressure of the contact. In some cases, the memory unit 120 may store a list of the user's grip forms related to contact positions and pressures, and the control unit 140 may retrieve the specific grip form corresponding to the received position and pressure information from the memory unit 120. For example, if two contact detection signals are received from the left sensor portion and four contact detection signals are received from the right sensor portion, as shown in fig. 12A, the control unit 140 may determine that the user's left hand holds the device 100.
Then, the control unit 140 may instruct the display unit 132 to display the GUI at a specific position on the display unit 132 according to the user's contact position (S1140). Specifically, the control unit 140 may first identify the currently executed application (before displaying the GUI) and then may select GUI elements corresponding to that application. For example, when an idle-screen application is running, the control unit 140 may select menu icons as the GUI elements for the idle screen. If the camera application is running instead, the control unit 140 may select an icon for taking a picture and a scroll bar for zooming in/out. After selecting the GUI elements, the control unit 140 may determine a GUI arrangement mode based on the currently executed application and the user's grip form. For example, referring to fig. 12A and 12B, the control unit 140 may recognize the idle-screen application as the currently executed application and may determine that the user's left hand holds the device 100. The control unit 140 may then determine the GUI arrangement mode so that the menu icons are disposed near the contact positions of at least one of the four fingers (other than the thumb) of the user's left hand.
After determining the GUI arrangement mode, the control unit 140 may command the display unit 132 to display GUI elements based on the GUI arrangement mode. That is, a previously selected GUI element may be displayed on the display unit 132 according to the GUI arrangement mode.
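Step S1140, as described above, could look roughly like the sketch below: select GUI elements for the running application and anchor them near the detected contact positions. The application names, element names, coordinates, and pixel offset are all assumptions made for this illustration.

```python
# Illustrative element sets per application, following the examples in the text.
ELEMENTS_BY_APP = {
    "idle_screen": ["menu_icon_1", "menu_icon_2", "menu_icon_3"],
    "camera": ["shoot_icon", "zoom_scroll_bar"],
    "mp3": ["rewind", "play_pause", "stop", "volume_bar"],
}

def arrange_gui(app, finger_contacts, offset=30):
    """Pair each element of the running app with a screen position just
    inward of a detected finger contact (x, y)."""
    placements = []
    for element, (x, y) in zip(ELEMENTS_BY_APP.get(app, []), finger_contacts):
        placements.append((element, (x - offset, y)))  # move laterally inward
    return placements

# Left hand holds the device: three finger contacts along the right edge.
layout = arrange_gui("idle_screen", [(470, 120), (470, 200), (470, 280)])
print(layout)
```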
Fig. 12A and 12B illustrate two examples of screens displaying menu icons in an idle-screen application according to an exemplary embodiment of the present invention. As shown in fig. 12A and 12B, three menu icons may be located laterally inward from the three finger contact positions. The memory unit 120 may store a usage-frequency ranking of menus and their icons, and the control unit 140 may arrange the menu icons in order of usage frequency. For example, when the user holds the device 100 during execution of the idle-screen application, the control unit 140 may retrieve the usage-frequency ranking from the memory unit 120 and may instruct the display unit 132 to display the menu icons according to the retrieved ranking. The icons displayed on the display unit 132 may change according to the user's preference.
The control unit 140 may also change the position of a GUI element according to the pressure of the user's contact. Referring to fig. 12A and 12B, when the user increases the contact pressure while holding the device 100 (e.g., beyond a predetermined pressure threshold), the menu icons displayed on the display unit 132 may move to the right side of the screen, i.e., toward the pressing fingers. In addition, when the corresponding icon reaches the right side of the display unit 132, the control unit 140 may execute a specific application. In some cases, the control unit 140 may decide whether to execute the application based on the contact pressure of the user's finger and/or on whether the icon has reached the right side of the display unit 132.
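The pressure-driven behavior just described can be sketched as follows: while the contact pressure exceeds a threshold, the icon moves toward the pressing finger, and the bound function runs once the icon reaches the screen edge. The screen width, threshold value, step size, and function names are assumptions for illustration only.

```python
SCREEN_WIDTH = 480        # assumed display width in pixels
PRESSURE_THRESHOLD = 0.6  # assumed normalized pressure threshold
STEP = 40                 # assumed icon movement per update, in pixels

def update_icon(icon_x, pressure, execute):
    """Move the icon toward the pressing finger while pressure exceeds
    the threshold; run the bound function once the icon reaches the edge."""
    if pressure > PRESSURE_THRESHOLD and icon_x < SCREEN_WIDTH:
        icon_x = min(icon_x + STEP, SCREEN_WIDTH)
        if icon_x == SCREEN_WIDTH:
            execute()  # e.g., launch the application bound to the icon
    return icon_x

launched = []
x = 400
for _ in range(3):  # three consecutive high-pressure readings
    x = update_icon(x, pressure=0.8, execute=lambda: launched.append(True))
print(x, len(launched))  # 480 1
```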
The control unit 140 may also change the display size of the GUI element according to the pressure of the user contact. For example, when the user increases the contact pressure, the size of the menu icon displayed on the display unit 132 may be increased or decreased. The increase in the contact pressure may highlight the menu icon displayed on the display unit 132.
Fig. 13A and 13B illustrate two examples of screens on which function icons are displayed in a camera application according to an exemplary embodiment of the present invention. Referring to fig. 13A and 13B, a user may hold device 100 with the thumb and index finger of both hands. The icon for photographing may be located at a position close to the index finger of the right hand, and the scroll bar for zooming in and out may be located at a position in the longitudinal direction from the thumb of the right hand. If the user increases the contact pressure by the right index finger, the icon for photographing moves in the direction of the right index finger. When the icon reaches the upper side of the display unit 132, a photographing function may be performed. In addition, the user may increase the contact pressure by the right thumb to control zooming in/out.
Fig. 14A and 14B show two examples of screens on which function icons are displayed in the MP3 application. Referring to fig. 14A and 14B, the user may hold the device 100 with the left hand. The function icons may be displayed according to the contact position of the fingers of the left hand other than the thumb, whereas the volume bar may be displayed according to the contact position of the thumb. The displayed function icons may be arranged according to a predetermined GUI arrangement pattern. While holding the device 100, the user may control the execution of the MP3 application by increasing the contact pressure or performing an action such as a tap. For example, the position, size, and/or presentation effect of the icon may vary depending on the user's contact.
Returning to fig. 11, after displaying the GUI, the control unit 140 may determine whether the contact position of the user has changed (S1150). Specifically, when the user changes the grip form during gripping the device 100, the tactile sensor unit 110 may detect a change in the user's contact and may generate a new contact detection signal. Then, the control unit 140 may receive a new contact detection signal from the tactile sensor unit 110, may determine the contact pattern of the user again, and may modify the display of the GUI according to the new contact pattern.
Referring to fig. 12A and 12B, the user's contact in fig. 12B differs from that in fig. 12A. For example, in fig. 12B, the positions of the fingers may have moved downward. The control unit 140 may receive a new contact detection signal from the tactile sensor unit 110 and may determine a new contact pattern based on the new information regarding the position and pressure of the contact. Then, the control unit 140 may command the display unit 132 to change the displayed GUI according to the new contact pattern.
Comparing fig. 13A and 13B, the user's right index finger may have moved (e.g., to the left) in fig. 13B. As shown in fig. 13B, the control unit 140 may receive a new contact detection signal, determine a new contact pattern, and move the photographing icon toward the current contact position of the index finger.
Referring to fig. 14A and 14B, the number of contacts and the positions of the contacts may be changed. For example, the four contacts on the right side in fig. 14A may move downward and may be reduced to 3 contacts in fig. 14B. Additionally, the left side contact may move downward. The volume control bar may also move down the left side. In addition, the back icon, the play/pause icon, and the stop icon may move down along the right side, but the forward icon corresponding to a finger (e.g., little finger) of the left hand may be removed from the display unit 132.
As described above, exemplary embodiments of the present invention disclose a method and apparatus for displaying and modifying a GUI according to the position and pressure of the user's contact. Accordingly, exemplary embodiments of the present invention can prevent GUI elements from being obscured or unintentionally touched while the user operates the device.
It will be apparent to those skilled in the art that various modifications and variations can be made in the present invention without departing from the spirit or scope of the invention. Thus, it is intended that the present invention cover the modifications and variations of this invention provided they come within the scope of the appended claims and their equivalents.
Claims (12)
1. A method of displaying a graphical user interface on a display unit of a device, the method comprising:
detecting a first contact input of a sensor unit arranged on at least one side of the device;
determining at least one graphical user interface element based on an application being executed by the device;
displaying the at least one graphical user interface element on a display unit;
detecting, via the sensor unit, a second contact input corresponding to the first contact input, wherein the second contact input is a pressure of the first contact input;
performing a function based on the displayed graphical user interface element according to the second contact input,
wherein the step of executing the function comprises:
the function is performed in response to the displayed graphical user interface element moving to a predetermined position in accordance with the second contact input.
2. The method of claim 1, wherein the position of the graphical user interface element displayed on the display unit is determined according to the position of the first contact input.
3. The method of claim 1, wherein the step of performing a function comprises: an application corresponding to the displayed graphical user interface element is executed.
4. The method of claim 1, wherein the step of performing a function comprises: at least one display characteristic of the displayed graphical user interface element is changed.
5. The method of claim 1, wherein the first contact input is a contact to a tactile sensor unit located on at least one side of the device.
6. The method of claim 5, wherein the step of determining a graphical user interface element comprises: a graphical user interface arrangement mode is determined based on the executed application and the contact mode of the first contact input.
7. An apparatus for displaying a graphical user interface on a display unit, the apparatus comprising:
a sensor unit disposed on at least one side of the device and configured to detect a first contact input and a second contact input corresponding to the first contact input, wherein the second contact input is a pressure of the first contact input;
a control unit configured to determine at least one graphical user interface element based on an application being executed on the device, display the determined graphical user interface element on the display unit and perform a function based on the graphical user interface element displayed on the display unit according to a second contact input,
wherein the control unit is further configured to perform a function in response to the displayed graphical user interface element moving to a predetermined position according to the second contact input.
8. The device of claim 7, wherein a position of a graphical user interface element displayed on the display unit is determined according to a position of the first contact input.
9. The apparatus of claim 7, wherein the control unit executes an application corresponding to the displayed graphical user interface element.
10. The apparatus of claim 7, wherein the control unit changes at least one display characteristic of the displayed graphical user interface element.
11. The device of claim 7, wherein the sensor unit comprises a tactile sensor unit and the first contact input is a contact to the tactile sensor unit located on at least one side of the device.
12. The apparatus of claim 11, wherein the control unit determines the graphical user interface arrangement mode based on the executed application and a contact mode of the first contact input.
Applications Claiming Priority (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR10-2008-0097591 | 2008-10-06 | ||
KR20080097591 | 2008-10-06 | ||
KR1020090012687A KR20100039194A (en) | 2008-10-06 | 2009-02-17 | Method for displaying graphic user interface according to user's touch pattern and apparatus having the same |
KR10-2009-0012687 | 2009-02-17 | ||
CN200910169036A CN101714055A (en) | 2008-10-06 | 2009-09-14 | Method and apparatus for displaying graphical user interface depending on a user's contact pattern |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN200910169036A Division CN101714055A (en) | 2008-10-06 | 2009-09-14 | Method and apparatus for displaying graphical user interface depending on a user's contact pattern |
Publications (2)
Publication Number | Publication Date |
---|---|
CN106909304A CN106909304A (en) | 2017-06-30 |
CN106909304B true CN106909304B (en) | 2020-08-14 |
Family
ID=42215793
Family Applications (3)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201710119581.4A Active CN106909304B (en) | 2008-10-06 | 2009-09-14 | Method and apparatus for displaying graphical user interface |
CN200910169036A Pending CN101714055A (en) | 2008-10-06 | 2009-09-14 | Method and apparatus for displaying graphical user interface depending on a user's contact pattern |
CN201710119962.2A Active CN106909305B (en) | 2008-10-06 | 2009-09-14 | Method and apparatus for displaying graphical user interface |
Family Applications After (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN200910169036A Pending CN101714055A (en) | 2008-10-06 | 2009-09-14 | Method and apparatus for displaying graphical user interface depending on a user's contact pattern |
CN201710119962.2A Active CN106909305B (en) | 2008-10-06 | 2009-09-14 | Method and apparatus for displaying graphical user interface |
Country Status (3)
Country | Link |
---|---|
KR (1) | KR20100039194A (en) |
CN (3) | CN106909304B (en) |
ES (1) | ES2776103T3 (en) |
Families Citing this family (105)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2012019350A1 (en) * | 2010-08-12 | 2012-02-16 | Google Inc. | Finger identification on a touchscreen |
CN102375652A (en) * | 2010-08-16 | 2012-03-14 | 中国移动通信集团公司 | Mobile terminal user interface regulation system and method |
CN102402275B (en) * | 2010-09-13 | 2017-05-24 | 联想(北京)有限公司 | Portable electronic equipment and holding gesture detection method |
WO2012049942A1 (en) * | 2010-10-13 | 2012-04-19 | Necカシオモバイルコミュニケーションズ株式会社 | Mobile terminal device and display method for touch panel in mobile terminal device |
CN102479035A (en) * | 2010-11-23 | 2012-05-30 | 汉王科技股份有限公司 | Electronic device with touch screen, and method for displaying left or right hand control interface |
JP2012168932A (en) * | 2011-02-10 | 2012-09-06 | Sony Computer Entertainment Inc | Input device, information processing device and input value acquisition method |
CN102131003A (en) * | 2011-04-06 | 2011-07-20 | 罗蒙明 | Method for judging finger key pressing on virtual keyboard of mobile phone with touch screen |
CN102790816A (en) * | 2011-05-16 | 2012-11-21 | 中兴通讯股份有限公司 | Processing method and device of pushbutton function |
CN102810039A (en) * | 2011-05-31 | 2012-12-05 | 中兴通讯股份有限公司 | Left or right hand adapting virtual keyboard display method and terminal |
CN102841723B (en) * | 2011-06-20 | 2016-08-10 | 联想(北京)有限公司 | Portable terminal and display changeover method thereof |
JP5453351B2 (en) * | 2011-06-24 | 2014-03-26 | 株式会社Nttドコモ | Mobile information terminal, operation state determination method, program |
JP5588931B2 (en) * | 2011-06-29 | 2014-09-10 | 株式会社Nttドコモ | Mobile information terminal, arrangement area acquisition method, program |
CN102299996A (en) * | 2011-08-19 | 2011-12-28 | 华为终端有限公司 | Handheld device operating mode distinguishing method and handheld device |
JP5911961B2 (en) * | 2011-09-30 | 2016-04-27 | インテル コーポレイション | Mobile devices that eliminate unintentional touch sensor contact |
KR101908947B1 (en) | 2011-11-23 | 2018-10-17 | 삼성전자주식회사 | Method and apparatus for peripheral connection |
KR101866272B1 (en) * | 2011-12-15 | 2018-06-12 | 삼성전자주식회사 | Apparatas and method of user based using for grip sensor in a portable terminal |
CN102722247A (en) * | 2012-03-09 | 2012-10-10 | 张伟明 | Operation and control component, information processing system using same and information processing method thereof |
CN103324423B (en) * | 2012-03-21 | 2018-11-13 | 北京三星通信技术研究有限公司 | A kind of terminal and its method for displaying user interface |
KR101979666B1 (en) | 2012-05-15 | 2019-05-17 | 삼성전자 주식회사 | Operation Method For plural Touch Panel And Portable Device supporting the same |
CN102662603A (en) * | 2012-05-18 | 2012-09-12 | 广州市渡明信息技术有限公司 | Input method display method and input method display system for mobile phone with touch screen |
KR101995486B1 (en) * | 2012-06-26 | 2019-07-02 | 엘지전자 주식회사 | Mobile terminal and control method thereof |
CN102890558B (en) * | 2012-10-26 | 2015-08-19 | 北京金和软件股份有限公司 | The method of mobile hand-held device handheld motion state is detected based on sensor |
CN103809866B (en) * | 2012-11-13 | 2018-07-06 | 联想(北京)有限公司 | A kind of operation mode switching method and electronic equipment |
CN103118166B (en) * | 2012-11-27 | 2014-11-12 | 广东欧珀移动通信有限公司 | Method of realizing single hand operation of mobile phone based on pressure sensing |
US9591339B1 (en) | 2012-11-27 | 2017-03-07 | Apple Inc. | Agnostic media delivery system |
US9774917B1 (en) | 2012-12-10 | 2017-09-26 | Apple Inc. | Channel bar user interface |
CN103870140B (en) * | 2012-12-13 | 2018-01-23 | 联想(北京)有限公司 | A kind of object processing method and device |
US10200761B1 (en) | 2012-12-13 | 2019-02-05 | Apple Inc. | TV side bar user interface |
US9532111B1 (en) | 2012-12-18 | 2016-12-27 | Apple Inc. | Devices and method for providing remote control hints on a display |
CN103576850A (en) * | 2012-12-26 | 2014-02-12 | Shenzhen Chuangrongfa Electronics Co., Ltd. | Method and system for judging holding mode of handheld device
CN103902141A (en) * | 2012-12-27 | 2014-07-02 | Beijing Funate Innovation Technology Co., Ltd. | Device and method for achieving dynamic arrangement of desktop functional icons
US20140184519A1 (en) * | 2012-12-28 | 2014-07-03 | Hayat Benchenaa | Adapting user interface based on handedness of use of mobile computing device |
US10521188B1 (en) | 2012-12-31 | 2019-12-31 | Apple Inc. | Multi-user TV user interface |
US9904394B2 (en) | 2013-03-13 | 2018-02-27 | Immersion Corporation | Method and devices for displaying graphical user interfaces based on user contact
JP5995171B2 (en) * | 2013-03-13 | 2016-09-21 | Sharp Corporation | Electronic device, information processing method, and information processing program
CN104360813B (en) * | 2013-04-12 | 2016-03-30 | Nubia Technology Co., Ltd. | Display device and information processing method thereof
CN104216602B (en) * | 2013-05-31 | 2017-10-20 | International Business Machines Corporation | Method and system for controlling a slider
KR102139110B1 (en) * | 2013-06-20 | 2020-07-30 | Samsung Electronics Co., Ltd. | Electronic device and method for controlling using grip sensing in the electronic device
KR102138505B1 (en) * | 2013-07-10 | 2020-07-28 | LG Electronics Inc. | Mobile terminal and method for controlling the same
US9134818B2 (en) * | 2013-07-12 | 2015-09-15 | Facebook, Inc. | Isolating mobile device electrode |
US10162416B2 (en) * | 2013-09-06 | 2018-12-25 | Immersion Corporation | Dynamic haptic conversion system |
CN104714731B (en) * | 2013-12-12 | 2019-10-11 | Nanjing ZTE Software Co., Ltd. | Display method and device for a terminal interface
US20150192989A1 (en) * | 2014-01-07 | 2015-07-09 | Samsung Electronics Co., Ltd. | Electronic device and method of controlling electronic device |
CN103795949A (en) * | 2014-01-14 | 2014-05-14 | Sichuan Changhong Electric Co., Ltd. | Control terminal, device terminal and system for adjusting volume of device terminal
KR102155091B1 (en) * | 2014-01-22 | 2020-09-11 | LG Electronics Inc. | Mobile terminal and method for controlling the same
CN104850339B (en) * | 2014-02-19 | 2018-06-01 | Lenovo (Beijing) Co., Ltd. | Information processing method and electronic device
CN104915073B (en) * | 2014-03-14 | 2018-06-01 | FocalTech Systems Co., Ltd. | Hand-held touch device
US9239648B2 (en) * | 2014-03-17 | 2016-01-19 | Google Inc. | Determining user handedness and orientation using a touchscreen device |
US9727161B2 (en) * | 2014-06-12 | 2017-08-08 | Microsoft Technology Licensing, Llc | Sensor correlation for pen and touch-sensitive computing device interaction |
JP6482578B2 (en) | 2014-06-24 | 2019-03-13 | Apple Inc. | Column interface for navigating in the user interface
CN117573019A (en) * | 2014-06-24 | 2024-02-20 | Apple Inc. | Input device and user interface interactions
CN105468269A (en) * | 2014-08-15 | 2016-04-06 | Shenzhen ZTE Microelectronics Technology Co., Ltd. | Mobile terminal capable of automatically identifying left-hand or right-hand holding, and implementation method thereof
CN105468245B (en) * | 2014-08-22 | 2020-05-01 | ZTE Corporation | Terminal and display method of terminal operation interface
KR102291565B1 (en) | 2014-12-03 | 2021-08-19 | Samsung Display Co., Ltd. | Display device and driving method for display device using the same
CN104461322A (en) * | 2014-12-30 | 2015-03-25 | ThunderSoft Co., Ltd. | Display method and system for user interface of handheld device
CN104615368A (en) * | 2015-01-21 | 2015-05-13 | Shanghai Huatun Technology Co., Ltd. | Following switching method of keyboard interface
CN104571919A (en) * | 2015-01-26 | 2015-04-29 | Shenzhen ZTE Mobile Communications Co., Ltd. | Terminal screen display method and device
CN104679427B (en) * | 2015-01-29 | 2017-03-15 | Nubia Technology Co., Ltd. | Terminal split-screen display method and system
KR101686629B1 (en) * | 2015-01-30 | 2016-12-14 | Korea Institute of Science and Technology | Method for determining location in virtual space indicated by user's input regarding information on pressure and apparatus and computer-readable recording medium using the same
CN105988692A (en) * | 2015-02-02 | 2016-10-05 | ZTE Corporation | Handheld electronic equipment, and method and device for controlling handheld electronic equipment
KR102358110B1 (en) | 2015-03-05 | 2022-02-07 | Samsung Display Co., Ltd. | Display apparatus
CN104731501B (en) * | 2015-03-25 | 2016-03-23 | Nubia Technology Co., Ltd. | Control chart calibration method and mobile terminal
CN104735256B (en) * | 2015-03-27 | 2016-05-18 | Nubia Technology Co., Ltd. | Holding mode determination method and device for a mobile terminal
CN104834463A (en) * | 2015-03-31 | 2015-08-12 | Nubia Technology Co., Ltd. | Holding recognition method and device of mobile terminal
KR102384284B1 (en) * | 2015-04-01 | 2022-04-08 | Samsung Electronics Co., Ltd. | Apparatus and method for controlling volume using touch screen
CN104765541A (en) * | 2015-04-10 | 2015-07-08 | Nanjing University of Science and Technology | Method and system for identifying whether left hand or right hand operates mobile phone
CN104898959B (en) * | 2015-04-30 | 2018-06-05 | Nubia Technology Co., Ltd. | Method and apparatus for adjusting virtual button position
CN104866136B (en) * | 2015-05-11 | 2019-02-15 | Nubia Technology Co., Ltd. | Method and device for determining terminal operating mode
KR102422181B1 (en) * | 2015-06-02 | 2022-07-18 | Samsung Electronics Co., Ltd. | Method for controlling a display of an electronic device and the electronic device thereof
CN104915143B (en) * | 2015-06-19 | 2019-01-22 | Nubia Technology Co., Ltd. | Control method and terminal for a rimless mobile terminal
US10157410B2 (en) * | 2015-07-14 | 2018-12-18 | Ebay Inc. | Enhanced shopping actions on a mobile device
CN105227768A (en) * | 2015-09-18 | 2016-01-06 | Nubia Technology Co., Ltd. | Application (APP) display system and method
CN105183235B (en) * | 2015-10-19 | 2018-02-06 | Shanghai Phicomm Communication Co., Ltd. | Method for preventing false touches at the edge of a touch panel
CN105224181B (en) * | 2015-10-20 | 2018-05-25 | Meizu Technology (China) Co., Ltd. | Sidebar display method and device
CN106610746A (en) * | 2015-10-26 | 2017-05-03 | Qingdao Hisense Mobile Communications Technology Co., Ltd. | Mobile terminal and control method thereof
CN105573622A (en) * | 2015-12-15 | 2016-05-11 | Guangdong Oppo Mobile Telecommunications Corp., Ltd. | Single-hand control method and device of user interface and terminal device
KR101876020B1 (en) * | 2016-05-10 | 2018-07-06 | Industry-Academic Cooperation Foundation, Hongik University Sejong Campus | Cursor scrolling control method using a 3D touch of a mobile device
DK201670581A1 (en) | 2016-06-12 | 2018-01-08 | Apple Inc | Device-level authorization for viewing content |
DK201670582A1 (en) | 2016-06-12 | 2018-01-02 | Apple Inc | Identifying applications on which content is available |
CN106406656B (en) * | 2016-08-30 | 2019-07-26 | Vivo Mobile Communication Co., Ltd. | Control method for an application toolbar and mobile terminal
US11966560B2 (en) | 2016-10-26 | 2024-04-23 | Apple Inc. | User interfaces for browsing content from multiple content applications on an electronic device |
KR102659062B1 (en) * | 2016-11-29 | 2024-04-19 | Samsung Electronics Co., Ltd. | Device for displaying user interface based on sensing signal of grip sensor
CN106648329A (en) * | 2016-12-30 | 2017-05-10 | Vivo Mobile Communication Co., Ltd. | Application icon display method and mobile terminal
US10635255B2 (en) | 2017-04-18 | 2020-04-28 | Google Llc | Electronic device response to force-sensitive interface |
CN109710099A (en) * | 2017-10-26 | 2019-05-03 | Nanchang OFILM Biometric Identification Technology Co., Ltd. | Electronic device
WO2019113895A1 (en) * | 2017-12-14 | 2019-06-20 | Shenzhen Royole Technologies Co., Ltd. | Control method and electronic device
DK201870354A1 (en) | 2018-06-03 | 2019-12-20 | Apple Inc. | Setup procedures for an electronic device |
US10838541B2 (en) * | 2018-09-03 | 2020-11-17 | Htc Corporation | Method for operating handheld device, handheld device and computer-readable recording medium thereof |
KR102539579B1 (en) | 2018-12-18 | 2023-06-05 | Samsung Electronics Co., Ltd. | Electronic device for adaptively changing display area of information and operation method thereof
CN113906419A (en) | 2019-03-24 | 2022-01-07 | 苹果公司 | User interface for media browsing application |
US11683565B2 (en) | 2019-03-24 | 2023-06-20 | Apple Inc. | User interfaces for interacting with channels that provide content that plays in a media browsing application |
US20200301567A1 (en) | 2019-03-24 | 2020-09-24 | Apple Inc. | User interfaces for viewing and accessing content on an electronic device |
US11445263B2 (en) | 2019-03-24 | 2022-09-13 | Apple Inc. | User interfaces including selectable representations of content items |
US11863837B2 (en) | 2019-05-31 | 2024-01-02 | Apple Inc. | Notification of augmented reality content on an electronic device |
EP3977245A1 (en) | 2019-05-31 | 2022-04-06 | Apple Inc. | User interfaces for a podcast browsing and playback application |
CN112486346B (en) * | 2019-09-12 | 2023-05-30 | Beijing Xiaomi Mobile Software Co., Ltd. | Key mode setting method, device and storage medium
JP7279622B2 (en) * | 2019-11-22 | 2023-05-23 | Toyota Motor Corporation | Display device and display program
US20230142200A1 (en) * | 2020-02-10 | 2023-05-11 | Nec Corporation | Non-transitory storage medium, processing method for portable terminal, and portable terminal |
US11843838B2 (en) | 2020-03-24 | 2023-12-12 | Apple Inc. | User interfaces for accessing episodes of a content series |
US11899895B2 (en) | 2020-06-21 | 2024-02-13 | Apple Inc. | User interfaces for setting up an electronic device |
CN112543362A (en) * | 2020-11-02 | 2021-03-23 | Dangqu Network Technology (Hangzhou) Co., Ltd. | Display interface switching method, remote controller, television system and electronic equipment
KR20220064162A (en) * | 2020-11-11 | 2022-05-18 | Samsung Electronics Co., Ltd. | An electronic device including a stretchable display
US11720229B2 (en) | 2020-12-07 | 2023-08-08 | Apple Inc. | User interfaces for browsing and presenting content |
US11934640B2 (en) | 2021-01-29 | 2024-03-19 | Apple Inc. | User interfaces for record labels |
CN113867594A (en) * | 2021-10-21 | 2021-12-31 | Yuanxin Information Technology Group Co., Ltd. | Information input panel switching method and device, electronic equipment and storage medium
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN1397870A (en) * | 2001-07-17 | 2003-02-19 | Compal Electronics, Inc. | Touch display able to control amplification ratio by pressure
CN101133385A (en) * | 2005-03-04 | 2008-02-27 | Apple Inc. | Hand held electronic device with multiple touch sensing devices
CN101183292A (en) * | 2006-11-16 | 2008-05-21 | LG Electronics Inc. | Mobile terminal and screen display method thereof
Family Cites Families (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7800592B2 (en) * | 2005-03-04 | 2010-09-21 | Apple Inc. | Hand held electronic device with multiple touch sensing devices |
US20050134578A1 (en) * | 2001-07-13 | 2005-06-23 | Universal Electronics Inc. | System and methods for interacting with a control environment |
US7009599B2 (en) * | 2001-11-20 | 2006-03-07 | Nokia Corporation | Form factor for portable device |
US20030117376A1 (en) * | 2001-12-21 | 2003-06-26 | Elen Ghulam | Hand gesturing input device |
GB0201074D0 (en) * | 2002-01-18 | 2002-03-06 | 3G Lab Ltd | Graphic user interface for data processing device |
US7116314B2 (en) * | 2003-05-06 | 2006-10-03 | International Business Machines Corporation | Method for distribution wear for a touch entry display |
WO2005008444A2 (en) * | 2003-07-14 | 2005-01-27 | Matt Pallakoff | System and method for a portable multimedia client
DE602004013116T2 (en) * | 2004-01-20 | 2009-07-02 | Sony Deutschland Gmbh | Haptic key-controlled data entry |
KR100608576B1 (en) * | 2004-11-19 | 2006-08-03 | Samsung Electronics Co., Ltd. | Apparatus and method for controlling a portable electronic device
CN1901785B (en) * | 2005-07-22 | 2012-08-29 | Hongfujin Precision Industry (Shenzhen) Co., Ltd. | Display device and its display control method
CN100592247C (en) * | 2005-09-21 | 2010-02-24 | Hongfujin Precision Industry (Shenzhen) Co., Ltd. | Multi-gradation menu displaying device and display control method
CN1940834B (en) * | 2005-09-30 | 2014-10-29 | Hongfujin Precision Industry (Shenzhen) Co., Ltd. | Circular menu display device and its display controlling method
JP4699955B2 (en) * | 2006-07-21 | 2011-06-15 | Sharp Corporation | Information processing device
JP2008204402A (en) * | 2007-02-22 | 2008-09-04 | Eastman Kodak Co | User interface device |
- 2009
- 2009-02-17 KR KR1020090012687A patent/KR20100039194A/en active Search and Examination
- 2009-08-10 ES ES09167533T patent/ES2776103T3/en active Active
- 2009-09-14 CN CN201710119581.4A patent/CN106909304B/en active Active
- 2009-09-14 CN CN200910169036A patent/CN101714055A/en active Pending
- 2009-09-14 CN CN201710119962.2A patent/CN106909305B/en active Active
Also Published As
Publication number | Publication date |
---|---|
CN106909305A (en) | 2017-06-30 |
KR20100039194A (en) | 2010-04-15 |
CN101714055A (en) | 2010-05-26 |
CN106909304A (en) | 2017-06-30 |
ES2776103T3 (en) | 2020-07-29 |
CN106909305B (en) | 2020-10-27 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN106909304B (en) | Method and apparatus for displaying graphical user interface | |
EP2175344B1 (en) | Method and apparatus for displaying graphical user interface depending on a user's contact pattern | |
US10444989B2 (en) | Information processing apparatus, and input control method and program of information processing apparatus | |
US9035883B2 (en) | Systems and methods for modifying virtual keyboards on a user interface | |
CN102129311B (en) | Information processing device, operation input method, and operation input program
US20090109187A1 (en) | Information processing apparatus, launcher, activation control method and computer program product | |
US8159469B2 (en) | User interface for initiating activities in an electronic device | |
US20140123049A1 (en) | Keyboard with gesture-redundant keys removed | |
US10019154B2 (en) | Method, apparatus and computer program product for operating items with multiple fingers | |
CN103477306A (en) | Electronic apparatus, control setting method, and program | |
KR20110085189A (en) | Operation method of personal portable device having touch panel | |
EP4155891A1 (en) | Device control method and apparatus, and storage medium and electronic device | |
KR20110066025A (en) | Operation method of touch panel and touch panel driving chip
KR20090056469A (en) | Apparatus and method for reacting to touch on a touch screen | |
KR101678213B1 (en) | An apparatus for user interface by detecting increase or decrease of touch area and method thereof | |
KR101366170B1 (en) | User Interface for controlling state of menu | |
US20140085340A1 (en) | Method and electronic device for manipulating scale or rotation of graphic on display | |
KR20120095155A (en) | Operation method of personal portable device having touch panel | |
KR20110084042A (en) | Operation method of touch panel and touch panel driving chip
KR20110047421A (en) | Operation method of touch panel
KR20110057027A (en) | Operation method of touch panel
KR20120095130A (en) | Operation method of personal portable device having touch panel
KR20110057711A (en) | Operation method of touch panel
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication ||
SE01 | Entry into force of request for substantive examination ||
GR01 | Patent grant ||