KR101507595B1 - Method for activating function using gesture and mobile device thereof - Google Patents
Method for activating function using gesture and mobile device thereof Download PDFInfo
- Publication number
- KR101507595B1 (application KR20130103445A)
- Authority
- KR
- South Korea
- Prior art keywords
- function
- icon
- gesture
- unit
- vectors
- Prior art date
Links
Images
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
According to an embodiment of the present invention, a portable electronic device is provided, comprising: an icon generating unit that generates a guide icon having nine divided regions, with an individual function icon displayed in each of the first to eighth regions, that is, every region except the center region; a function setting unit that sets first to eighth vectors having direction components extending from the first to eighth regions toward the center region and maps the functions of the individual function icons to the first to eighth vectors; a display unit that displays the guide icon on a display screen; a sensing unit that receives a user's gesture and detects a gesture direction component; a vector matching unit that determines, among the first to eighth vectors, the vector whose direction component corresponds to the gesture direction component; a function execution unit that executes the function corresponding to the determined vector; a control unit that controls the icon generating unit, the function setting unit, the display unit, the sensing unit, the vector matching unit, and the function execution unit; and a memory that stores a program, executed by the control unit, for operating the icon generating unit, the function setting unit, the display unit, the sensing unit, the vector matching unit, and the function execution unit.
Description
The present invention relates to a method for executing a function using a gesture and a portable electronic device therefor. More particularly, the present invention relates to a method of displaying, on one side of a screen, a guide icon that provides guidance on gestures, receiving a gesture input, and executing the function corresponding to the input direction, and to a portable electronic device therefor.
Portable electronic devices such as smartphones and tablets provide a touch-based interface. In a touch-based interface, a user can input gestures directly on the touch screen, using the display itself as an input device without an additional input device such as a keyboard.
In the interface environment provided by modern portable electronic devices, a user must perform a tap gesture on an application icon to execute a specific application. If the application icon is located on the home screen (the main screen or desktop screen of the portable electronic device), the application can be executed by selecting the icon directly on the home screen. If the icon is not on the home screen, however, the user must first open the menu and then select the application icon there. In this case, two gestures are needed: selecting the menu button and selecting the application icon.
Therefore, one approach is to place many application icons on the home screen for quick execution, but when there are too many icons on the home screen, the simplicity of the screen and the recognizability of the icons suffer. In addition, the number of icons that can fit on one home screen is limited.
Similarly, to execute another application while one application is running, the user had to press the home button to return to the home screen, select the menu button of the home screen, and then select the application icon, for a total of three gestures.
It is an object of the present invention to enable execution of various functions through gestures and to display a gesture guide icon that provides the user with intuitive guidance on gestures.
According to an embodiment of the present invention, a portable electronic device is provided, comprising: an icon generating unit that generates a guide icon having nine divided regions, with an individual function icon displayed in each of the first to eighth regions, that is, every region except the center region; a function setting unit that sets first to eighth vectors having direction components extending from the first to eighth regions toward the center region and maps the functions of the individual function icons to the first to eighth vectors; a display unit that displays the guide icon on a display screen; a sensing unit that receives a user's gesture and detects a gesture direction component; a vector matching unit that determines, among the first to eighth vectors, the vector whose direction component corresponds to the gesture direction component; a function execution unit that executes the function corresponding to the determined vector; a control unit that controls the icon generating unit, the function setting unit, the display unit, the sensing unit, the vector matching unit, and the function execution unit; and a memory that stores a program, executed by the control unit, for operating the icon generating unit, the function setting unit, the display unit, the sensing unit, the vector matching unit, and the function execution unit.
In the present invention, the functions corresponding to the first to eighth vectors may be a screen switching function, a specific application execution function, or a function of switching to another guide icon.
In the present invention, when the gesture is input while the home screen is displayed on the display screen, the functions corresponding to the first to eighth vectors are functions for switching to the first to eighth home screens, and when the gesture is input while a screen other than the home screen is displayed, the functions corresponding to the first to eighth vectors are specific application execution functions.
In the present invention, when the guide icon is tapped, an enlarged guide icon for editing each individual function icon of the guide icon is displayed, and the position and function of the individual function icons displayed in the first to eighth areas of the enlarged guide icon may be changed according to a user gesture input on the enlarged guide icon.
In the present invention, the nine divided areas are obtained by dividing the guide icon into a 3-by-3 matrix shape, and the center area corresponds to the second row and second column of the matrix.
According to another embodiment of the present invention, there is provided a method of executing a function using a gesture, comprising the steps of: generating a guide icon having nine divided regions, with an individual function icon displayed in each of the first to eighth regions except the center region; setting first to eighth vectors having direction components that proceed from the first to eighth regions toward the center region, and mapping the functions of the individual function icons to the first to eighth vectors; displaying the guide icon on a display screen; receiving a user's gesture and detecting a gesture direction component; determining, among the first to eighth vectors, the vector whose direction component corresponds to the gesture direction component; and executing the function corresponding to the determined vector.
According to the present invention, a user can obtain information about a function to be executed according to a gesture type by referring to a gesture guide icon.
In addition, according to the present invention, various functions can be performed using a gesture beyond the spatial limitation of the screen.
FIG. 1 is a block diagram illustrating a configuration of a portable electronic device according to an embodiment of the present invention.
FIG. 2 is a flowchart illustrating a method of executing a function using a gesture according to an embodiment of the present invention.
FIG. 3 is a diagram illustrating guide icons generated according to an embodiment of the present invention.
FIG. 4 is a diagram illustrating an example of direction vectors according to an embodiment of the present invention.
FIG. 5 is an example of a screen for executing an application with a gesture according to an embodiment of the present invention.
FIG. 6 is an example of switching screens with a gesture according to an embodiment of the present invention.
FIG. 7 is a diagram illustrating an example of switching among a plurality of guide icons according to an embodiment of the present invention.
FIG. 8 is an example of a screen for changing the settings of the guide icon according to an embodiment of the present invention.
The terms used in this specification will be briefly described, and then the present invention will be described in detail.
The terms used in the present invention are, as far as possible, general terms currently in wide use, selected in consideration of their function herein; this may, however, vary according to the intention of those skilled in the art or precedent. Also, in certain cases, there may be a term arbitrarily selected by the applicant, in which case its meaning will be described in detail in the corresponding description. Therefore, the terms used in the present invention should be defined based on their meaning and the entire contents of the present invention, not simply on their names.
When an element is referred to as "including" a component throughout the specification, this means that, unless specifically stated otherwise, the element may further include other components rather than excluding them. Also, the terms "part," "module," and the like described in the specification mean units for processing at least one function or operation, which may be implemented in hardware, software, or a combination of hardware and software.
Throughout the specification, "gesture" means a hand motion that a user makes to control the portable electronic device. For example, gestures described herein may include tap, touch & hold, double tap, drag, panning, flick, drag and drop, and the like.
The "tab" represents an operation in which the user touches the screen very quickly using a finger or a stylus. That is, the time difference between the touch-in time, which is the time when the finger or the touch tool touches the screen, and the touch-out time, which is the time when the finger or the touch tool falls off the screen, is very short.
"Touch & Hold" represents an operation in which a user touches a screen using a finger or a stylus and then maintains a touch input over a critical time. That is, the time difference between the touch-in point and the touch-out point is equal to or greater than the threshold time. In order to allow the user to recognize whether the touch input is a tap or a touch & hold, a feedback signal may be provided visually or audibly when the touch input is maintained for a predetermined time or more.
"Drag" means an operation of moving a finger or a touch tool to another position on the screen while the user holds the touch after touching the finger or the touch tool with the screen. The object is moved due to the drag operation or a panning operation to be described later is performed.
"Panning" indicates a case where a user performs a drag operation without selecting an object. Since panning does not select a specific object, the object is not moved within the page, but the page itself moves within the screen, or the group of objects moves within the page.
Hereinafter, embodiments of the present invention will be described in detail with reference to the accompanying drawings so that those skilled in the art can easily carry out the present invention. The present invention may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein. In order to clearly illustrate the present invention, parts not related to the description are omitted, and similar parts are denoted by like reference characters throughout the specification.
FIG. 1 is a block diagram illustrating a configuration of a portable electronic device according to an embodiment of the present invention.
The portable
1, the portable
Hereinafter, the components will be described in order.
The
The guide icon of the present invention has nine divided areas, and an individual function icon may be displayed in each of the first to eighth areas, that is, every area except the center area. In one embodiment of the present invention, the nine divided areas may be obtained by dividing the icon into a 3x3 matrix shape, and the center area may correspond to the second row and second column of the matrix. The first to eighth areas may also be assigned in clockwise order starting from the area at the upper left of the center area.
FIG. 3 is a diagram illustrating guide icons generated according to an embodiment of the present invention.
Referring to FIG. 3, the portable
As illustrated in FIG. 3, the
In addition to the example of FIG. 3, the shape of the
The
In the case where the guide icon has nine divided areas in a 3-by-3 matrix shape, the direction components of the first to eighth vectors are the up-to-down direction, the upper-left-to-lower-right direction, the upper-right-to-lower-left direction, the right-to-left direction, and the directions opposite to each of these.
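A minimal sketch of this layout, assuming the clockwise numbering described above, derives each vector's direction component as the offset from a region's cell to the center cell of the 3x3 matrix:

```python
# The eight regions of the 3x3 guide icon, numbered clockwise starting
# from the upper-left cell (an assumption consistent with the description);
# each value is a (row, col) cell of the matrix, and the center is (1, 1).
REGION_CELLS = {
    1: (0, 0), 2: (0, 1), 3: (0, 2), 4: (1, 2),
    5: (2, 2), 6: (2, 1), 7: (2, 0), 8: (1, 0),
}
CENTER = (1, 1)

def region_vector(region: int) -> tuple:
    """Direction component (drow, dcol) from a region toward the center."""
    row, col = REGION_CELLS[region]
    return (CENTER[0] - row, CENTER[1] - col)

print(region_vector(1))  # upper-left region points down-right: (1, 1)
print(region_vector(2))  # top region points straight down: (1, 0)
```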
FIG. 4 is a diagram illustrating an example of direction vectors according to an embodiment of the present invention.
FIG. 4 shows an example of the first to eighth vectors set when the first to eighth areas are allocated in a clockwise direction, as shown in FIG. 3. In the example of FIG. 4, the first vector 14-1 proceeds from the first region 12-1 to the central region.
The
The
In the present invention, when the
In this specification, the term "real-touch" refers to a case where a pointer actually touches the screen, and "proximity-touch" refers to a case where the pointer does not actually touch the screen but approaches within a predetermined distance of it. In this specification, a pointer is a tool for touching, or hovering near, a specific portion of the displayed screen, for example a stylus pen or a finger.
The
The
In addition, a proximity sensor is an example of a sensor for sensing touches on the touch screen. A proximity sensor detects the presence or absence of an object approaching a predetermined detection surface, or an object in the vicinity, without mechanical contact, using the force of an electromagnetic field or infrared rays. A proximity sensor therefore has a considerably longer lifespan than a contact-type sensor, and its utility is also considerably higher.
Examples of proximity sensors include a transmissive photoelectric sensor, a direct-reflective photoelectric sensor, a mirror-reflective photoelectric sensor, a high-frequency oscillation proximity sensor, a capacitive proximity sensor, a magnetic proximity sensor, and an infrared proximity sensor.
The
In an embodiment of the present invention as shown in FIG. 4, the direction components of the first to eighth vectors may differ from one another by 45 degrees. In this case, if the angle of the gesture direction component lies within the range from -22.5 degrees to +22.5 degrees of the angle of the direction component of the second vector 14-2, the vector having the direction component corresponding to the gesture direction component can be determined to be the second vector 14-2.
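This matching rule, a 45-degree spacing with a tolerance window of plus or minus 22.5 degrees around each vector, can be sketched as follows; the absolute orientation assigned to the first vector is an assumption, since only the spacing is specified:

```python
import math

# Hypothetical angles for the eight vectors, spaced 45 degrees apart;
# which vector sits at 0 degrees is an illustrative assumption.
VECTOR_ANGLES = {k: (k - 1) * 45.0 for k in range(1, 9)}

def match_vector(start, end, tolerance_deg=22.5):
    """Map a gesture (start point -> end point, screen coordinates) to the
    vector whose direction lies within +/- tolerance_deg, or None."""
    dx, dy = end[0] - start[0], end[1] - start[1]
    if dx == 0 and dy == 0:
        return None  # no direction component to match
    gesture_deg = math.degrees(math.atan2(dy, dx)) % 360.0
    for k, angle in VECTOR_ANGLES.items():
        # smallest angular distance between the two directions
        diff = abs((gesture_deg - angle + 180.0) % 360.0 - 180.0)
        if diff <= tolerance_deg:
            return k
    return None

print(match_vector((0, 0), (10, 10)))  # 45-degree gesture matches vector 2
```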
The
In the embodiment of FIG. 4, the function of the individual function icon 13-2 corresponding to the second vector 14-2 is 'execution of the Internet browser application'. When the vector having the direction component corresponding to the gesture direction component is the second vector 14-2, the Internet browser application is accordingly executed.
The
The broadcast receiving module receives broadcast signals and / or broadcast-related information from an external broadcast management server through a broadcast channel. The broadcast channel may include a satellite channel and a terrestrial channel.
The mobile communication module transmits and receives radio signals to and from at least one of a base station, an external terminal, and a server on a mobile communication network. Here, the wireless signal may include various types of data depending on a voice call signal, a video call signal, or a text / multimedia message transmission / reception.
The wireless Internet module refers to a module for wireless Internet access, and the wireless Internet module can be built in or externally. Also, the wired Internet module means a module for a wired Internet connection.
The short-range communication module is a module for short-range communication. Short-range communication technologies may include Bluetooth, Radio Frequency Identification (RFID), Infrared Data Association (IrDA), Ultra Wideband (UWB), ZigBee, and Wi-Fi Direct (WFD).
The location information module is also a module for identifying or obtaining the location of the portable electronic device.
The
The
The
In addition, the
Meanwhile, the
Meanwhile, the
FIG. 2 is a flowchart illustrating a method of executing a function using a gesture according to an embodiment of the present invention.
First, a guide icon is generated, having nine divided regions with an individual function icon displayed in each of the first to eighth regions, that is, every region except the center region (S1).
Next, the first to eighth vectors having directional components proceeding from the first to eighth areas of the guide icon to the center area are set (S2).
Next, the first to eighth vectors are associated with the function of each individual function icon displayed in the area of the guide icon (S3).
Next, the generated guide icon is displayed on the display screen (S4).
Next, the user's gesture is input, and a gesture direction component that proceeds from the start point to the end point of the gesture input is detected (S5).
Finally, a vector having a direction component corresponding to the gesture direction component among the direction components of the first to eighth vectors is determined, and the function of the individual function icon corresponding to the determined vector is executed (S6).
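Steps S1 to S6 above can be sketched end to end as follows; the vector layout and the mapped function names are illustrative assumptions:

```python
import math

def detect_direction(start, end):
    """S5: gesture direction component from start point to end point, in degrees."""
    dx, dy = end[0] - start[0], end[1] - start[1]
    return math.degrees(math.atan2(dy, dx)) % 360.0

def execute_by_gesture(start, end, vector_angles, vector_functions, tol_deg=22.5):
    """S6: determine the vector matching the gesture direction and execute
    the function mapped to it; returns the function's result or None."""
    gesture_deg = detect_direction(start, end)
    for k, angle in vector_angles.items():
        if abs((gesture_deg - angle + 180.0) % 360.0 - 180.0) <= tol_deg:
            return vector_functions[k]()
    return None

# S1-S3: eight vectors spaced 45 degrees apart, each mapped to a
# placeholder function standing in for an individual function icon.
angles = {k: (k - 1) * 45.0 for k in range(1, 9)}
functions = {k: (lambda k=k: "function_%d" % k) for k in range(1, 9)}

print(execute_by_gesture((0, 0), (7, 7), angles, functions))  # 45 degrees -> function_2
```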
FIG. 5 is an example of a screen for executing an application with a gesture according to an embodiment of the present invention.
First, the user can intuitively recognize what function will be performed according to the direction of the gesture by referring to the guide icon.
Referring to FIG. 5, the user can input a gesture having a vector component proceeding from the upper right toward the lower left of the screen. In FIG. 5, since the individual function icon corresponding to the third vector 14-3 is the music playback application icon 13-3 located in the third region 12-3, the portable electronic device executes the music playback application, as shown on the right-hand screen.
In the embodiment of the present invention illustrated in FIG. 5, the user can easily execute various functions with only one gesture, without multiple touches, despite the limited display screen size.
FIG. 6 is an example of switching screens with a gesture according to an embodiment of the present invention.
6A is an example in which the
A conventional portable electronic device can have a plurality of home screens, but switching between them is possible only through left and right drag gestures, and the number of switches grows as the number of home screens increases. For example, moving from the first home screen to the fifth home screen requires a total of four left or right drag gestures.
On the other hand, according to one embodiment of the present invention, the home screens can be extended to as many as nine, and the user can move from the current home screen to any other home screen with a single gesture.
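The gesture-count comparison above can be sketched as follows (the function names are illustrative):

```python
def gestures_sequential(current: int, target: int) -> int:
    """Conventional paging: one left/right drag per intermediate home screen."""
    return abs(target - current)

def gestures_guide_icon(current: int, target: int) -> int:
    """Guide icon scheme: any of up to nine home screens is one gesture away."""
    return 0 if current == target else 1

print(gestures_sequential(1, 5))  # 4 drags conventionally
print(gestures_guide_icon(1, 5))  # 1 gesture with the guide icon
```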
FIG. 6B illustrates a method of switching a home screen through a gesture according to an embodiment of the present invention. In the portable
Referring to FIG. 6B, the
The user can see the shape of the individual function icon displayed on the
In the embodiment of FIG. 6B, if the currently displayed home screen is the
In FIG. 6, the direction component of a gesture proceeding from the upper left to the lower right corresponds to the direction component of the first vector 14-1, and the
The individual function icons 13-1 to 13-8 of the
In addition, the
The individual function icons located in the first to eighth regions 12-1 to 12-8 of the
FIG. 7 is a diagram illustrating an example of switching among a plurality of guide icons using a gesture according to an embodiment of the present invention.
As shown in FIG. 7, a plurality of guide icons may exist. Referring to FIG. 7, it can be seen that an individual function icon in the form of a
It is also noted that the individual function icons in the shape of the
Finally, it can be seen that an individual function icon in the form of a
FIG. 8 is an example of a screen for changing the settings of the guide icon according to an embodiment of the present invention.
The user can tap the
The user can tap one of the individual function icons 13-1 to 13-8, or one of the first to eighth areas 12-1 to 12-8, in the enlarged guide icon 10-a to change the function set for it. Functions that can be assigned to an individual function icon include an application execution function and a screen switching function. When the function assigned to an individual function icon is changed, the shape of the icon also changes to match the new function.
For example, the icon 13-3 located in the third area 12-3 is a music application icon, but the user can tap the third area 12-3 and select a message application from the displayed list, whereupon the individual function icon 13-3 is changed to the message application icon. It goes without saying that the function corresponding to the third vector 14-3 is then changed to execution of the message application.
Further, the user can touch and hold one of the individual function icons 13-1 to 13-8 for a predetermined time to change positions among the individual function icons 13-1 to 13-8. For example, to exchange the positions of the icon 13-1 located in the first area 12-1 and the icon 13-2 located in the second area 12-2, the user touches and holds the icon 13-1 in the first area 12-1 for the predetermined time or longer and then drags and drops it into the second region 12-2.
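The drag-and-drop repositioning described above amounts to exchanging the icons, and with them the mapped functions, between two areas; the icon names below are hypothetical:

```python
def swap_icons(region_icons: dict, a: int, b: int) -> None:
    """Drag-and-drop from area a to area b exchanges the two icons
    (and hence the functions mapped to the corresponding vectors)."""
    region_icons[a], region_icons[b] = region_icons[b], region_icons[a]

# Hypothetical assignment of icons to the first three areas.
icons = {1: "calendar", 2: "browser", 3: "music"}
swap_icons(icons, 1, 2)
print(icons[1], icons[2])  # browser calendar
```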
The method and the portable electronic device according to an embodiment of the present invention can execute various functions through various gestures, free of the spatial limitations of the screen, and provide a guide icon for gestures, enabling intuitive perception of the relationship between gestures and functions.
The method according to an embodiment of the present invention can be implemented in the form of program instructions that can be executed through various computer means and recorded on a computer-readable medium. The computer-readable medium may include program instructions, data files, data structures, and the like, alone or in combination. The program instructions recorded on the medium may be those specially designed and constructed for the present invention, or they may be known and available to those skilled in the art of computer software. Examples of computer-readable media include magnetic media such as hard disks, floppy disks, and magnetic tape; optical media such as CD-ROMs and DVDs; magneto-optical media such as floptical disks; and hardware devices specially configured to store and execute program instructions, such as ROM, RAM, and flash memory. Examples of program instructions include machine language code such as that produced by a compiler, as well as high-level language code that can be executed by a computer using an interpreter or the like.
While the present invention has been particularly shown and described with reference to exemplary embodiments thereof, it is to be understood that the invention is not limited to the disclosed exemplary embodiments; embodiments modified within an equivalent scope also belong to the scope of the invention's rights.
100: portable electronic device 110: icon generation unit
120: function setting unit 130: display unit
140: sensing unit 150: vector matching unit
160: function execution unit 170: communication unit
180: memory 190: control unit
200: External device
Claims (10)
A function setting unit for setting first to eighth vectors having directional components extending from the first to eighth regions to the central region and mapping the functions of the individual function icons to the first to eighth vectors;
A display unit for displaying the guide icon on a display screen;
A sensing unit for receiving a gesture of the user on an area different from a display screen area displaying the guide icon and detecting a gesture direction component;
A vector matching unit for determining a vector having a direction component corresponding to the gesture direction component among direction components of the first to eighth vectors;
A function execution unit for executing the function corresponding to the determined vector;
A control unit for controlling the icon generating unit, the function setting unit, the display unit, the sensing unit, the vector matching unit, and the function executing unit; And
A memory which is executed by the control unit and stores a program for operating the icon generating unit, the function setting unit, the display unit, the sensing unit, the vector matching unit, and the function executing unit;
Wherein the function corresponding to the first to eighth vectors is at least one of a screen switching function, a specific application executing function, and a switching function to another guide icon.
When the gesture is input while the home screen is displayed on the display screen, the function corresponding to the first to eighth vectors is a function for switching to the first to eighth home screens,
Wherein the function corresponding to the first to eighth vectors is a specific application execution function when the gesture is input in a state where a screen other than the home screen is displayed on the display screen.
When the guide icon is tapped, an enlarged guide icon for editing each individual function icon of the guide icon is displayed,
Wherein the individual function icons displayed in each of the first to eighth areas can be changed in position and function according to a gesture of a user input on the enlarged guide icon.
Setting first to eighth vectors having directional components that proceed from the first to eighth areas to the center area, and mapping the functions of the individual function icons to the first to eighth vectors;
Displaying the guide icon on a display screen;
Receiving a user's gesture on an area other than a display screen area displaying the guide icon and detecting a gesture direction component;
Determining a vector having a direction component corresponding to the gesture direction component among direction components of the first to eighth vectors; And
Executing the function corresponding to the determined vector;
The method comprising the steps of:
The functions corresponding to the first to eighth vectors include a screen switching function, a specific application execution function, or a switching function to a different guide icon.
When the gesture is input while the home screen is displayed on the display screen, the function corresponding to the first to eighth vectors is a function for switching to the first to eighth home screens,
Wherein the function corresponding to the first to eighth vectors is a specific application execution function, when the gesture is input while a screen other than the home screen is displayed on the display screen.
When the guide icon is tapped, an enlarged guide icon for editing each individual function icon of the guide icon is displayed,
Wherein the position and function of the individual function icon displayed in each of the first to eighth areas can be changed according to a gesture of the user input on the enlarged guide icon.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR20130103445A KR101507595B1 (en) | 2013-08-29 | 2013-08-29 | Method for activating function using gesture and mobile device thereof |
PCT/KR2013/011525 WO2015030313A1 (en) | 2013-08-29 | 2013-12-12 | Method for executing function using gesture, portable electronic apparatus therefor, and computer-readable recording medium having program recorded thereon therefor |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR20130103445A KR101507595B1 (en) | 2013-08-29 | 2013-08-29 | Method for activating function using gesture and mobile device thereof |
Publications (2)
Publication Number | Publication Date |
---|---|
KR20150025631A KR20150025631A (en) | 2015-03-11 |
KR101507595B1 true KR101507595B1 (en) | 2015-04-07 |
Family
ID=52586834
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
KR20130103445A KR101507595B1 (en) | 2013-08-29 | 2013-08-29 | Method for activating function using gesture and mobile device thereof |
Country Status (2)
Country | Link |
---|---|
KR (1) | KR101507595B1 (en) |
WO (1) | WO2015030313A1 (en) |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20110066981A1 (en) | 2009-09-16 | 2011-03-17 | International Business Machines Corporation | Placement of items in cascading radial menus |
WO2013119008A1 (en) | 2012-02-07 | 2013-08-15 | 부가벤쳐스 | Electronic apparatus and computer-implemented method |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20110102333A1 (en) * | 2009-10-30 | 2011-05-05 | Wayne Carl Westerman | Detection of Gesture Orientation on Repositionable Touch Surface |
KR20110116712A (en) * | 2010-04-20 | 2011-10-26 | 주식회사 인프라웨어 | Computer-readable media containing the program of ui method using gesture |
US20130019175A1 (en) * | 2011-07-14 | 2013-01-17 | Microsoft Corporation | Submenus for context based menu system |
-
2013
- 2013-08-29 KR KR20130103445A patent/KR101507595B1/en not_active IP Right Cessation
- 2013-12-12 WO PCT/KR2013/011525 patent/WO2015030313A1/en active Application Filing
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20110066981A1 (en) | 2009-09-16 | 2011-03-17 | International Business Machines Corporation | Placement of items in cascading radial menus |
WO2013119008A1 (en) | 2012-02-07 | 2013-08-15 | 부가벤쳐스 | Electronic apparatus and computer-implemented method |
Also Published As
Publication number | Publication date |
---|---|
WO2015030313A1 (en) | 2015-03-05 |
KR20150025631A (en) | 2015-03-11 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11886252B2 (en) | Foldable device and method of controlling the same | |
US10401964B2 (en) | Mobile terminal and method for controlling haptic feedback | |
CN108139778B (en) | Portable device and screen display method of portable device | |
AU2014200250B2 (en) | Method for providing haptic effect in portable terminal, machine-readable storage medium, and portable terminal | |
US9013368B1 (en) | Foldable mobile device and method of controlling the same | |
EP3629674B1 (en) | Mobile terminal and control method therefor | |
KR102127930B1 (en) | Mobile terminal and method for controlling the same | |
US8994678B2 (en) | Techniques for programmable button on bezel of mobile terminal | |
EP2669786A2 (en) | Method for displaying item in terminal and terminal using the same | |
EP2778884A2 (en) | Electronic device and method for controlling screen display using temperature and humidity | |
EP2741207B1 (en) | Method and system for providing information based on context, and computer-readable recording medium thereof | |
KR102168648B1 (en) | User terminal apparatus and control method thereof | |
US20160260408A1 (en) | Method for sharing screen with external display device by electronic device and electronic device | |
US11003328B2 (en) | Touch input method through edge screen, and electronic device | |
KR20150081012A (en) | user terminal apparatus and control method thereof | |
KR101929316B1 (en) | Method and apparatus for displaying keypad in terminal having touchscreen | |
US20150106706A1 (en) | Electronic device and method for controlling object display | |
KR20160004590A (en) | Method for display window in electronic device and the device thereof | |
US20150067570A1 (en) | Method and Apparatus for Enhancing User Interface in a Device with Touch Screen | |
KR20170053410A (en) | Apparatus and method for displaying a muliple screen in electronic device | |
KR20170007966A (en) | Method and apparatus for smart device manipulation utilizing sides of device | |
KR101507595B1 (en) | Method for activating function using gesture and mobile device thereof | |
KR102306535B1 (en) | Method for controlling device and the device | |
KR101165388B1 (en) | Method for controlling screen using different kind of input devices and terminal unit thereof | |
JP2022179592A (en) | Information display device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
A201 | Request for examination | ||
E902 | Notification of reason for refusal | ||
E701 | Decision to grant or registration of patent right | ||
LAPS | Lapse due to unpaid annual fee |