KR101507595B1 - Method for activating function using gesture and mobile device thereof - Google Patents

Method for activating function using gesture and mobile device thereof

Info

Publication number
KR101507595B1
Authority
KR
South Korea
Prior art keywords
function
icon
gesture
unit
vectors
Prior art date
Application number
KR20130103445A
Other languages
Korean (ko)
Other versions
KR20150025631A (en)
Inventor
유제민
Original Assignee
유제민
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 유제민 filed Critical 유제민
Priority to KR20130103445A priority Critical patent/KR101507595B1/en
Priority to PCT/KR2013/011525 priority patent/WO2015030313A1/en
Publication of KR20150025631A publication Critical patent/KR20150025631A/en
Application granted granted Critical
Publication of KR101507595B1 publication Critical patent/KR101507595B1/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

According to an embodiment of the present invention, a portable electronic device is provided, comprising: an icon generating unit that generates a guide icon having nine divided regions, with an individual function icon displayed in each of the first to eighth regions, that is, the nine divided regions other than the center region; a function setting unit that sets first to eighth vectors having direction components extending from the first to eighth regions toward the center region and maps the functions of the individual function icons to the first to eighth vectors; a display unit that displays the guide icon on a display screen; a sensing unit that receives a user's gesture and detects a gesture direction component; a vector matching unit that determines, among the direction components of the first to eighth vectors, the vector whose direction component corresponds to the gesture direction component; a function execution unit that executes the function corresponding to the determined vector; a control unit that controls the icon generating unit, the function setting unit, the display unit, the sensing unit, the vector matching unit, and the function execution unit; and a memory that stores a program, executed by the control unit, for operating the icon generating unit, the function setting unit, the display unit, the sensing unit, the vector matching unit, and the function execution unit.

Description

METHOD FOR ACTIVATING FUNCTION USING GESTURE AND MOBILE DEVICE THEREOF

The present invention relates to a method for executing a function using a gesture and a portable electronic device for the same. More particularly, the present invention relates to a method of displaying, on one side of a screen, a guide icon that provides a guide to gestures, receiving a gesture input, and executing the function corresponding to the input direction, and to a portable electronic device therefor.

Portable electronic devices such as smartphones and tablets provide a touch-based interface. In a touch-based interface, the display device itself serves as an input device, so a user can input gestures on the touch screen without an additional input device such as a keyboard.

In the interface environment provided by modern portable electronic devices, a user takes a tap gesture to select an application icon and execute a specific application. If the application icon is located on the home screen (the main screen of the portable electronic device, comparable to a desktop screen), the application can be executed by selecting the icon directly on the home screen. However, if the icon is not on the home screen, the user must first select the menu button and then select the application icon on the menu screen. In that case, two gestures are needed: selecting the menu button and selecting the application icon.

As a workaround, many application icons can be placed on the home screen for quick execution, but when there are too many icons on the home screen, the simplicity of the screen and the recognizability of the icons suffer. In addition, the number of icons that can be placed on a single home screen is limited.

Similarly, to execute another application while one application is running, a total of three gestures has been needed: pressing the home button to return to the home screen, selecting the menu button of the home screen, and selecting the application icon.

It is an object of the present invention to enable execution of various functions through a gesture and to display a gesture guide icon to provide a user with an intuitive guide to a gesture.

According to an embodiment of the present invention, a portable electronic device is provided, comprising: an icon generating unit that generates a guide icon having nine divided regions, with an individual function icon displayed in each of the first to eighth regions, that is, the nine divided regions other than the center region; a function setting unit that sets first to eighth vectors having direction components extending from the first to eighth regions toward the center region and maps the functions of the individual function icons to the first to eighth vectors; a display unit that displays the guide icon on a display screen; a sensing unit that receives a user's gesture and detects a gesture direction component; a vector matching unit that determines, among the direction components of the first to eighth vectors, the vector whose direction component corresponds to the gesture direction component; a function execution unit that executes the function corresponding to the determined vector; a control unit that controls the icon generating unit, the function setting unit, the display unit, the sensing unit, the vector matching unit, and the function execution unit; and a memory that stores a program, executed by the control unit, for operating the icon generating unit, the function setting unit, the display unit, the sensing unit, the vector matching unit, and the function execution unit.

In the present invention, the function corresponding to each of the first to eighth vectors may be a screen switching function, a specific application execution function, or a function of switching to another guide icon.

In the present invention, when the gesture is input while the home screen is displayed on the display screen, the functions corresponding to the first to eighth vectors are functions for switching to the first to eighth home screens. When the gesture is input while a screen other than the home screen is displayed, the functions corresponding to the first to eighth vectors are specific application execution functions.

In the present invention, when the guide icon is tapped, an enlarged guide icon for editing each individual function icon of the guide icon is displayed, and the position and function of the individual function icon displayed in each of the first to eighth areas may be changed according to a user gesture input on the enlarged guide icon.

In the present invention, the nine divided areas are obtained by dividing the guide icon into a 3 by 3 matrix shape, and the central area corresponds to the second row and second column of the matrix.

According to another embodiment of the present invention, there is provided a method of executing a function using a gesture, the method comprising: generating a guide icon having nine divided regions, with an individual function icon displayed in each of the first to eighth regions, that is, the nine divided regions other than the center region; setting first to eighth vectors having direction components extending from the first to eighth regions toward the center region, and mapping the functions of the individual function icons to the first to eighth vectors; displaying the guide icon on a display screen; receiving a user's gesture and detecting a gesture direction component; determining, among the direction components of the first to eighth vectors, the vector whose direction component corresponds to the gesture direction component; and executing the function corresponding to the determined vector.

According to the present invention, a user can obtain information about a function to be executed according to a gesture type by referring to a gesture guide icon.

In addition, according to the present invention, various functions can be performed using a gesture beyond the spatial limitation of the screen.

FIG. 1 is a block diagram illustrating a configuration of a portable electronic device according to an embodiment of the present invention.
FIG. 2 is a flowchart illustrating a method of executing a function using a gesture according to an embodiment of the present invention.
FIG. 3 is a diagram illustrating guide icons generated according to an embodiment of the present invention.
FIG. 4 is a diagram illustrating an example of direction vectors according to an embodiment of the present invention.
FIG. 5 is an example of a screen for executing an application with a gesture according to an embodiment of the present invention.
FIG. 6 is an example of a case where a screen is switched by a gesture according to an embodiment of the present invention.
FIG. 7 is a diagram showing an example of switching among a plurality of guide icons according to an embodiment of the present invention.
FIG. 8 is a screen for changing the setting of the guide icon according to an embodiment of the present invention.

The terms used in this specification will be briefly described and the present invention will be described in detail.

The terms used in this specification are general terms selected in consideration of their functions in the present invention, but their meanings may vary according to the intention of those skilled in the art or according to precedent. Also, in certain cases, there may be terms selected arbitrarily by the applicant, in which case their meanings will be described in detail in the corresponding part of the description. Therefore, the terms used in the present invention should be defined based on their meanings and the overall content of the present invention, not simply on their names.

When an element is said to "include" a component throughout the specification, this means that the element may further include other components rather than excluding them, unless specifically stated otherwise. Also, terms such as "part" and "module" described in the specification refer to units that process at least one function or operation, and these may be implemented in hardware, software, or a combination of hardware and software.

Throughout the specification, "gesture" means a hand movement that a user uses to control the portable electronic device. For example, gestures described herein may include a tap, touch & hold, double tap, drag, panning, flick, drag and drop, and the like.

The "tab" represents an operation in which the user touches the screen very quickly using a finger or a stylus. That is, the time difference between the touch-in time, which is the time when the finger or the touch tool touches the screen, and the touch-out time, which is the time when the finger or the touch tool falls off the screen, is very short.

"Touch & Hold" represents an operation in which a user touches a screen using a finger or a stylus and then maintains a touch input over a critical time. That is, the time difference between the touch-in point and the touch-out point is equal to or greater than the threshold time. In order to allow the user to recognize whether the touch input is a tap or a touch & hold, a feedback signal may be provided visually or audibly when the touch input is maintained for a predetermined time or more.

"Drag" means an operation of moving a finger or a touch tool to another position on the screen while the user holds the touch after touching the finger or the touch tool with the screen. The object is moved due to the drag operation or a panning operation to be described later is performed.

"Panning" indicates a case where a user performs a drag operation without selecting an object. Since panning does not select a specific object, the object is not moved within the page, but the page itself moves within the screen, or the group of objects moves within the page.

Hereinafter, embodiments of the present invention will be described in detail with reference to the accompanying drawings so that those skilled in the art can easily carry out the present invention. The present invention may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein. In order to clearly illustrate the present invention, parts not related to the description are omitted, and similar parts are denoted by like reference characters throughout the specification.

FIG. 1 is a block diagram illustrating a configuration of a portable electronic device according to an embodiment of the present invention.

The portable electronic device 100 may be implemented in various forms. For example, the portable electronic device 100 described herein may be a mobile phone, a smart phone, a notebook computer, a digital broadcasting terminal, a personal digital assistant (PDA), a portable multimedia player (PMP), a navigation device, or a tablet PC.

Referring to FIG. 1, the portable electronic device 100 includes an icon generating unit 110, a function setting unit 120, a display unit 130, a sensing unit 140, a vector matching unit 150, a function execution unit 160, a communication unit 170, a memory 180, and a control unit 190. However, not all of the illustrated components are required; the portable electronic device 100 may be implemented with more or fewer components than those shown.

Hereinafter, the components will be described in order.

The icon generating unit 110 generates a guide icon according to an embodiment of the present invention. According to an embodiment of the present invention, the user can make a drag gesture in a specific direction to execute a specific application or switch screens, and the guide icon serves to visually provide a guide for the direction of the drag gesture.

The guide icon of the present invention has nine divided areas, and an individual function icon may be displayed in each of the first to eighth areas, that is, the nine divided areas except for the center area. In one embodiment of the present invention, the nine divided areas may be obtained by dividing the icon into a 3x3 matrix shape, and the central area may correspond to the second row and second column of the matrix. The first to eighth areas may be assigned in the clockwise direction starting from the area at the upper left of the center area of the guide icon.

FIG. 3 is a diagram illustrating guide icons generated according to an embodiment of the present invention.

Referring to FIG. 3, the portable electronic device 100 includes a display screen 101 and a home key 102. A general icon 103 and a guide icon 10 according to an embodiment of the present invention may be displayed on the display screen 101. The user can load the home screen, the basic screen of the portable electronic device 100, onto the display screen 101 by pressing the home key 102.

As illustrated in FIG. 3, the guide icon 10 according to an embodiment of the present invention may have a circular shape and may include an area divided into nine regions in a 3 by 3 matrix shape. Here, the first to eighth regions 12-1 to 12-8, excluding the central region 11, are assigned clockwise starting from the upper left region 12-1 relative to the central region 11. Individual function icons 13-1 to 13-8 may be displayed in the first to eighth regions 12-1 to 12-8 and may be icons of individual applications such as a music player or a folder launcher. A basic figure shape can be displayed in the central area 11 of the guide icon 10.

Beyond the example of FIG. 3, the shape of the guide icon 10 of the present invention can of course be varied, and the individual function icons 13-1 to 13-8 in the guide icon 10 can be changed in various ways. Further, the guide icon 10 may always be displayed on the display screen 101, or may be hidden in specific situations depending on the user's settings.

The function setting unit 120 sets first to eighth vectors having direction components extending from the first to eighth areas of the guide icon generated by the icon generating unit 110 toward the center area. The function setting unit 120 then associates the functions of the individual function icons with the first to eighth vectors. For example, when the individual function icon 13-2 located in the second region 12-2 is an Internet browser icon, the function setting unit 120 associates the second vector 14-2, which extends from the second region 12-2 toward the central region 11, with the execution of the Internet browser application.

When the guide icon has nine divided areas in a 3 by 3 matrix shape, the direction components of the first to eighth vectors are the top-to-bottom direction, the upper-left-to-lower-right direction, the upper-right-to-lower-left direction, the right-to-left direction, and the opposites of these directions.

FIG. 4 is a diagram illustrating an example of direction vectors according to an embodiment of the present invention.

FIG. 4 shows an example of the first to eighth vectors set when the first to eighth areas are allocated in a clockwise direction, as shown in FIG. 3. In the example of FIG. 4, the first vector 14-1, which proceeds from the first region 12-1 to the central region 11, has a direction component of -45 degrees. The second vector 14-2, which extends from the second region 12-2 to the central region 11, has a direction component of -90 degrees, and the third to eighth vectors 14-3 to 14-8 may be set as vectors whose direction components each decrease by a further 45 degrees.

The function setting unit 120 associates the functions of the individual function icons displayed in the respective regions of the guide icon with the first to eighth vectors 14-1 to 14-8 described above. In the example of FIG. 4, when the Internet browser application icon 13-2 is displayed in the second area 12-2, the function corresponding to the second vector 14-2 may be the execution of the Internet browser application.
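
To make the angle assignment and function mapping above concrete, here is a minimal Kotlin sketch (not taken from the patent): region k, counted clockwise from the upper-left region, gets a direction component of -45 degrees times k, and each vector is paired with a function; the function names are placeholders.

data class GuideVector(val region: Int, val angleDeg: Double, val function: String)

// Builds the first to eighth vectors: -45, -90, ..., -360 degrees, one per region,
// each mapped to the function of the icon shown in that region.
fun buildGuideVectors(functions: List<String>): List<GuideVector> {
    require(functions.size == 8)
    return (1..8).map { k -> GuideVector(k, -45.0 * k, functions[k - 1]) }
}

fun main() {
    val vectors = buildGuideVectors(
        listOf("folder", "internet browser", "music player", "camera",
               "switch guide icon", "messages", "phone", "settings"))
    vectors.forEach { println("region ${it.region}: ${it.angleDeg} deg -> ${it.function}") }
}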

The display unit 130 displays information processed by the portable electronic device 100 on a display screen. In an embodiment of the present invention, the display unit 130 displays the guide icon 10 generated by the icon generating unit 110. The guide icon 10 of the present invention does not itself serve as a menu button; rather, it provides a guide for the gesture direction when the user makes a gesture. The display unit 130 may always display the guide icon 10 on the display screen 101, whether the home screen or an application execution screen is shown. Alternatively, the guide icon may be displayed only when the display screen 101 shows the home screen.

In the present invention, when the display unit 130 and a touch pad form a layered structure constituting a touch screen, the display unit 130 may be used as an input device in addition to an output device. The display unit 130 may include at least one of a liquid crystal display, a thin film transistor-liquid crystal display, an organic light-emitting diode display, a flexible display, and a three-dimensional (3D) display. There may be two or more display units 130 depending on the implementation of the portable electronic device 100. The touch screen can be configured to detect not only the touch input position and area but also the touch input pressure. In addition, the touch screen can be configured to detect not only a real touch but also a proximity touch.

In this specification, the term "real touch" refers to a case where a pointer actually touches the screen, and "proximity touch" refers to a case where the pointer does not actually touch the screen but approaches within a predetermined distance of the screen. In this specification, a pointer is a tool for touching, or hovering over, a specific portion of the displayed screen; examples include a stylus pen and a finger.

The sensing unit 140 receives the user's gesture from the touch screen and detects the direction component of the gesture. In an embodiment of the present invention, the user's gesture from which the sensing unit 140 detects a direction component may be a drag gesture made as a continuous touch from a start point to an end point. The direction component of the gesture detected by the sensing unit 140 may be the direction component of a vector extending from the start point to the end point of the gesture.
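
A minimal Kotlin sketch of this direction detection follows: the gesture angle is taken as the angle of the vector from the drag start point to the end point. The sign convention (negating the y difference because screen coordinates grow downward) is an assumption chosen so that the result matches the -45 and -90 degree examples used in the text.

import kotlin.math.atan2

// Returns the gesture direction component, in degrees, of the vector from the
// gesture's start point to its end point.
fun gestureAngleDeg(startX: Double, startY: Double, endX: Double, endY: Double): Double =
    Math.toDegrees(atan2(-(endY - startY), endX - startX))  // screen y axis points down

fun main() {
    // A drag from the upper right toward the lower left of the screen:
    println(gestureAngleDeg(300.0, 100.0, 100.0, 300.0))  // -135.0
}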

The sensing unit 140 of the present invention may include various sensors for sensing a touch or a proximity touch on the touch screen. An example of a sensor for sensing a touch on the touch screen is a tactile sensor. A tactile sensor detects the contact of a specific object to a degree equal to or greater than what a person can feel. The tactile sensor can sense various information such as the roughness of the contact surface, the rigidity of the contacting object, and the temperature of the contact point.

A proximity sensor is an example of a sensor for sensing a proximity touch on the touch screen. The proximity sensor detects the presence or absence of an object approaching a predetermined detection surface, or an object in the vicinity, without mechanical contact, using the force of an electromagnetic field or infrared rays. Therefore, the proximity sensor has a considerably longer life than a contact-type sensor, and its utility is also considerably high.

Examples of proximity sensors include a transmission type photoelectric sensor, a direct reflection type photoelectric sensor, a mirror reflection type photoelectric sensor, a high frequency oscillation type proximity sensor, a capacitive proximity sensor, a magnetic proximity sensor, and an infrared proximity sensor.

The vector matching unit 150 determines a vector having a direction component corresponding to the gesture direction component detected by the sensing unit 140 among the direction components of the first to eighth vectors set by the function setting unit 120.

In an embodiment of the present invention as shown in FIG. 4, the direction components of the first to eighth vectors may differ from each other by 45 degrees. In this case, the vector matching unit 150 determines which vector's direction component the user's gesture is closest to. For example, in the example of FIG. 4, if the angle of the gesture direction component input by the user is within ±22.5 degrees of the angle of the direction component of the second vector 14-2, the vector having the direction component corresponding to the gesture direction component can be determined to be the second vector 14-2.
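
The matching step can be sketched in Kotlin as a nearest-angle search, which is equivalent to the ±22.5 degree window described above when the eight vectors are 45 degrees apart; the angle normalization and the names are assumptions of this sketch.

import kotlin.math.abs

// Smallest angular difference between two directions, in degrees (0..180).
fun angularDistance(a: Double, b: Double): Double {
    val d = abs(a - b) % 360.0
    return if (d > 180.0) 360.0 - d else d
}

// Returns the 1-based index of the vector whose direction component is closest
// to the gesture direction component.
fun matchVector(gestureAngleDeg: Double, vectorAngles: List<Double>): Int =
    vectorAngles.indices.minByOrNull { angularDistance(gestureAngleDeg, vectorAngles[it]) }!! + 1

fun main() {
    val angles = (1..8).map { -45.0 * it }   // -45, -90, ..., -360 degrees
    println(matchVector(-100.0, angles))     // 2: within 22.5 degrees of the second vector (-90)
    println(matchVector(-130.0, angles))     // 3: closest to the third vector (-135)
}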

The function execution unit 160 executes the function of the individual function icon corresponding to the vector determined by the vector matching unit 150. In effect, the function execution unit 160 may serve as a processor that determines and executes the function to be performed by the portable electronic device 100.

In the embodiment of FIG. 4, the function of the individual function icon 13-2 corresponding to the second vector 14-2 is execution of the Internet browser application, so when the vector having the direction component corresponding to the gesture direction component is the second vector 14-2, the function execution unit 160 executes the Internet browser application.

The communication unit 170 may include one or more components that allow the portable electronic device 100 and the external device 200 to communicate. For example, the communication unit 170 may include a broadcast receiving module, a mobile communication module, a wireless Internet module, a wired Internet module, a short distance communication module, a location information module, and the like.

The broadcast receiving module receives broadcast signals and / or broadcast-related information from an external broadcast management server through a broadcast channel. The broadcast channel may include a satellite channel and a terrestrial channel.

The mobile communication module transmits and receives radio signals to and from at least one of a base station, an external terminal, and a server on a mobile communication network. Here, the radio signals may include various types of data according to the transmission and reception of voice call signals, video call signals, or text/multimedia messages.

The wireless Internet module refers to a module for wireless Internet access, and the wireless Internet module can be built in or externally. Also, the wired Internet module means a module for a wired Internet connection.

The short-range communication module is a module for short-range communication. Short-range communication technologies may include Bluetooth, Radio Frequency Identification (RFID), Infrared Data Association (IrDA), Ultra Wideband (UWB), ZigBee, and Wi-Fi Direct (WFD).

The location information module is a module for identifying or obtaining the location of the portable electronic device 100; one example is a GPS (Global Positioning System) module. The GPS module receives position information from a plurality of satellites. Here, the location information may include coordinate information expressed as latitude and longitude.

The memory 180 may store programs for the processing and control performed by the control unit 190, and may also perform a function of storing input/output data (e.g., captured data, a phonebook, messages, and still images).

The memory 180 may include at least one type of storage medium among a flash memory type, a hard disk type, a multimedia card micro type, a card type memory (e.g., SD or XD memory), RAM (Random Access Memory), SRAM (Static Random Access Memory), ROM (Read Only Memory), EEPROM (Electrically Erasable Programmable Read-Only Memory), PROM (Programmable Read-Only Memory), and an optical disc. In addition, the portable electronic device 100 may operate web storage that performs the storage function of the memory 180 over the Internet.

The control unit 190 typically controls the overall operation of the portable electronic device 100. For example, the control unit 190 performs control and processing related to gesture recognition, voice calls, data communication, video calls, and the like. That is, the control unit 190 can control the icon generating unit 110, the function setting unit 120, the display unit 130, the sensing unit 140, the vector matching unit 150, the function execution unit 160, the communication unit 170, and the memory 180 as a whole.

In addition, the control unit 190 may include a multimedia module for multimedia playback. The multimedia module may be implemented within the control unit 190 or may be implemented separately from the control unit 190.

Meanwhile, the external device 200 according to an exemplary embodiment of the present invention refers to a device capable of communicating with the portable electronic device 100. Examples of the external device 200 include a mobile phone, a smart phone, a notebook computer, a digital broadcasting terminal, a PDA (Personal Digital Assistant), a PMP (Portable Multimedia Player), a navigation device, and a digital CE (Consumer Electronics) device. Digital CE devices may include a DTV (Digital Television), an IPTV (Internet Protocol Television) device, a refrigerator having a display, an air conditioner, and a printer.

Meanwhile, the external device 200 according to an embodiment of the present invention may include a server device (e.g., a social networking service server).

FIG. 2 is a flowchart illustrating a method of executing a function using a gesture according to an embodiment of the present invention.

First, a guide icon is generated that has nine divided regions, with an individual function icon displayed in each of the first to eighth regions, that is, the nine divided regions except for the central region (S1).

Next, the first to eighth vectors having directional components proceeding from the first to eighth areas of the guide icon to the center area are set (S2).

Next, the first to eighth vectors are associated with the function of each individual function icon displayed in the area of the guide icon (S3).

Next, the generated guide icon is displayed on the display screen (S4).

Next, the user's gesture is input, and a gesture direction component that proceeds from the start point to the end point of the gesture input is detected (S5).

Finally, a vector having a direction component corresponding to the gesture direction component among the direction components of the first to eighth vectors is determined, and the function of the individual function icon corresponding to the determined vector is executed (S6).
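
The flow of steps S1 to S6 can be strung together in a single toy Kotlin sketch; it is a self-contained model with assumed names and placeholder functions, not the patent's implementation.

import kotlin.math.abs
import kotlin.math.atan2

fun main() {
    // S1-S3: guide icon regions 1..8, vectors 45 degrees apart, placeholder functions.
    val functions = listOf("folder", "browser", "music", "camera",
                           "guide icon 2", "messages", "phone", "settings")
    val vectorAngles = (1..8).map { -45.0 * it }

    println("guide icon displayed")  // S4: stand-in for rendering the icon

    // S5: gesture direction from start point to end point (screen y axis points down).
    val (startX, startY, endX, endY) = listOf(300.0, 100.0, 100.0, 300.0)
    val gestureAngle = Math.toDegrees(atan2(-(endY - startY), endX - startX))

    // S6: match the gesture to the nearest vector and execute the mapped function.
    fun dist(a: Double, b: Double): Double {
        val d = abs(a - b) % 360.0
        return if (d > 180.0) 360.0 - d else d
    }
    val index = vectorAngles.indices.minByOrNull { dist(gestureAngle, vectorAngles[it]) }!!
    println("gesture at $gestureAngle deg -> executing '${functions[index]}'")
}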

FIG. 5 is an example of a screen for executing an application with a gesture according to an embodiment of the present invention.

First, through the guide icon 10 displayed at the upper left of FIG. 5, the user can intuitively recognize what function will be performed according to the direction of the gesture. For example, the user can execute a music application by inputting a gesture directed from the upper right to the lower left as indicated by the guide icon 10, or execute an Internet browser application by inputting a gesture directed from the top to the bottom.

Referring to FIG. 5, the user can input, through a touch, a gesture having a vector component that proceeds from the upper right corner of the display screen 101 toward the lower left. As described above, the sensing unit 140 detects the gesture direction component from the user's gesture input, and the vector matching unit 150 determines, among the direction components of the first to eighth vectors 14-1 to 14-8, the vector whose direction component corresponds to the gesture direction component. In the example of FIG. 5, the sensing unit 140 detects a vector directed from the start point to the end point of the user's gesture, and the vector matching unit 150 can determine that the third vector 14-3 is the vector corresponding to the gesture direction component.

In FIG. 5, since the individual function icon corresponding to the third vector 14-3 is the music playback application icon 13-3 located in the third region 12-3, the function execution unit 160 executes the music playback application as shown on the right-hand screen.

In the embodiment of the present invention illustrated in FIG. 5, the user can easily execute various functions with only one gesture, without multiple touches, despite the limited size of the display screen.

FIG. 6 is an example of a case where a screen is switched by a gesture according to an embodiment of the present invention.

FIG. 6A is an example in which the home screens 105 and 105-1 to 105-8 applicable in the present invention are displayed on the display screen 101 in reduced form. Referring to FIG. 6A, the present invention may provide nine home screens 105 and 105-1 to 105-8 arranged in a 3 by 3 matrix. The display screen 101 shown in FIG. 6A may be a home screen full-view mode screen, executed by a double touch with two fingers, for viewing the configuration of the home screens. In the embodiment of FIG. 6A, the home screen currently shown on the display screen 101 may be the home screen 105 located at the center.

A conventional portable electronic device can have a plurality of home screens, but switching between them is possible only through left and right drags, and the number of switches required increases as the number of home screens increases. For example, to move from the first home screen to the fifth home screen, a total of four left or right drag gestures are required.

On the other hand, according to one embodiment of the present invention, the home screens can be extended to as many as nine, and movement from the current home screen to any other home screen can be performed with a single gesture.

FIG. 6B illustrates a method of switching home screens through a gesture according to an embodiment of the present invention. In the portable electronic device 100 of FIG. 6B, the left side shows a state in which the central home screen 105 is displayed on the display screen 101, and the right side shows a state in which the upper-left home screen 105-1 is displayed on the display screen 101.

Referring to FIG. 6B, unlike the guide icon 10 described above, the guide icon 10 here displays a rectangular screen shape as the shape of each individual function icon 13-1 to 13-8. Here, although the shapes of the individual function icons are all the same, the function setting unit 120 can match the individual function icons 13-1 to 13-8 and the vectors 14-1 to 14-8 with functions for switching to different home screens. For example, the first vector 14-1 may correspond to a function of switching from the current home screen 105 to the upper-left home screen 105-1. Likewise, the third vector 14-3 may correspond to a function of switching from the current home screen 105 to the upper-right home screen 105-3.

By looking at the shapes of the individual function icons displayed on the guide icon 10, the user can tell that the home screen can be switched with the current gesture.

In the embodiment of FIG. 6B, if the currently displayed home screen is the central home screen 105 of FIG. 6A, a user who wants to switch the current home screen to the upper-left home screen 105-1 can, as shown in FIG. 6B, make a gesture that proceeds from the upper left to the lower right.

In FIG. 6, the direction component of a gesture proceeding from the upper left to the lower right corresponds to the direction component of the first vector 14-1, so the function that the function setting unit 120 has mapped to the first vector 14-1, namely switching from the current home screen 105 to the upper-left home screen 105-1, is executed as shown in the figure on the right. In the same way, when a gesture corresponding to the direction component of the second vector 14-2 is input, a switch from the current home screen 105 to the top home screen 105-2 can be executed. When the home screen is switched, a screen transition effect such as roll-in or roll-up can be applied in step with the user's gesture to smooth the transition.
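
The home-screen mapping described above can be sketched as a small lookup: the k-th region of the guide icon (clockwise from the upper left, mirroring the 3 by 3 layout of home screens around the central screen 105) switches to home screen 105-k. The grid coordinates in this Kotlin sketch are an illustrative assumption.

// Offset of each region's target home screen relative to the central screen,
// as (row offset, column offset) in the 3x3 layout of FIG. 6A.
val regionToGridOffset = mapOf(
    1 to Pair(-1, -1), 2 to Pair(-1, 0), 3 to Pair(-1, 1),  // upper-left, top, upper-right
    4 to Pair(0, 1),   5 to Pair(1, 1),  6 to Pair(1, 0),   // right, lower-right, bottom
    7 to Pair(1, -1),  8 to Pair(0, -1)                     // lower-left, left
)

// The central home screen 105 sits at grid position (1, 1).
fun targetHomeScreen(matchedRegion: Int): String {
    val (dRow, dCol) = regionToGridOffset.getValue(matchedRegion)
    return "home screen 105-$matchedRegion at grid (${1 + dRow}, ${1 + dCol})"
}

fun main() {
    println(targetHomeScreen(1))  // upper-left screen, as in the FIG. 6B example
    println(targetHomeScreen(3))  // upper-right screen
}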

The individual function icons 13-1 to 13-8 of the guide icon 10 of the present invention can be freely changed according to the user's settings. It is also possible for an individual function icon for application execution and a screen switching icon to be displayed together in one guide icon 10.

In addition, the guide icon 10 of FIG. 5 and the guide icon 10 of FIG. 6 may be switched or changed and displayed according to user settings. In particular, when the home screen is displayed on the display screen 101, the screen-switching guide icon 10 of FIG. 6 may be displayed and its corresponding functions executed, whereas when a menu screen or an application screen other than the home screen is displayed, the application-execution guide icon 10 of FIG. 5 may be displayed and its corresponding functions executed. That is, using the gesture-based function execution method of the present invention, the user can switch screens when the display screen 101 shows the home screen, and can execute another application when an application screen is shown.

The individual function icons located in the first to eighth regions 12-1 to 12-8 of the guide icon 10 may indicate the function of switching to another guide icon 10.

FIG. 7 is a diagram illustrating an example of switching among a plurality of guide icons using a gesture according to an embodiment of the present invention.

As shown in FIG. 7, a plurality of guide icons may exist. Referring to FIG. 7, an individual function icon in the form of a guide icon 10 is displayed in the fifth region 12-5 of the first guide icon 10-1. In this case, the function setting unit 120 can set the function corresponding to the fifth vector 14-5 of the first guide icon 10-1 to the execution of the second guide icon 10-2. That is, while the first guide icon 10-1 is displayed, the user can make a gesture in the direction of the fifth vector 14-5 to bring up the second guide icon 10-2. In this case, the guide icon 10 displayed on the display screen 101 changes from the first guide icon 10-1 to the second guide icon 10-2.

Individual function icons in the shape of the guide icon 10 are also displayed in the first area 12-1 and the seventh area 12-7 of the second guide icon 10-2. In this case, the function setting unit 120 can set the function corresponding to the first vector 14-1 of the second guide icon 10-2 to the execution of the first guide icon 10-1, and the function corresponding to the seventh vector 14-7 to the execution of the third guide icon 10-3. While the second guide icon 10-2 is displayed, the user can display the first guide icon 10-1 on the display screen 101 by making a gesture in the direction of the first vector 14-1, or display the third guide icon 10-3 by making a gesture in the direction of the seventh vector 14-7.

Finally, it can be seen that an individual function icon in the form of a guide icon 10 is displayed in the third region 12-3 of the third guide icon 10-3. In this case, the function setting unit 120 may set the function corresponding to the third vector 14-3 of the third guide icon 10-3 to the execution of the second guide icon 10-2.

FIG. 8 is a screen for changing the setting of the guide icon according to an embodiment of the present invention.

As shown in the left part of FIG. 8, the user can tap the guide icon 10 to change the positions and functions of the individual function icons 13-1 to 13-8 of the guide icon 10. When the guide icon 10 is tapped, an enlarged guide icon 10a is displayed as shown on the right side of FIG. 8.

The user can tap one of the individual function icons 13-1 to 13-8, or one of the first to eighth areas 12-1 to 12-8, in the enlarged guide icon 10a to change the function set for it. Functions that can be assigned to an individual function icon include an application execution function and a screen switching function. When the function assigned to an individual function icon is changed, the shape of the individual function icon also changes to the shape corresponding to the new function.

For example, although the icon 13-3 located in the third area 12-3 is a music application icon, the user can select a message application from the list displayed by tapping the third area 12-3, thereby changing the individual function icon 13-3 to the message application icon. In this case, it goes without saying that the function corresponding to the third vector 14-3 is changed to the execution of the message application.

Further, the user can touch and hold one of the individual function icons 13-1 to 13-8 for a predetermined time to swap the positions of individual function icons. For example, to swap the position of the icon 13-1 located in the first area 12-1 with that of the icon 13-2 located in the second area 12-2, the user touches and holds the icon 13-1 in the first area 12-1 for the predetermined time or longer and then drags and drops the icon 13-1 into the second region 12-2, whereby the positions of the two icons are exchanged.
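
A minimal Kotlin sketch of this position swap follows (names assumed): a touch & hold on one region's icon followed by a drop on another region simply exchanges the two icons, and with them the functions mapped to the corresponding vectors.

// Swaps the individual function icons of two regions (1-based region numbers).
fun swapIcons(icons: MutableList<String>, fromRegion: Int, toRegion: Int) {
    val held = icons[fromRegion - 1]          // icon picked up by touch & hold
    icons[fromRegion - 1] = icons[toRegion - 1]
    icons[toRegion - 1] = held                // dropped icon takes the held icon's place
}

fun main() {
    val icons = mutableListOf("folder", "browser", "music", "camera",
                              "guide icon 2", "messages", "phone", "settings")
    swapIcons(icons, fromRegion = 1, toRegion = 2)  // e.g., drag icon 13-1 onto region 12-2
    println(icons)  // [browser, folder, music, camera, guide icon 2, messages, phone, settings]
}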

The method and the portable electronic device according to an embodiment of the present invention can perform various functions with various gestures, free from the spatial limitation of the screen, and by providing guide icons for the gestures, enable the user to intuitively perceive the relationship between a gesture and its function.

The method according to an embodiment of the present invention can be implemented in the form of program instructions that can be executed through various computer means and recorded on a computer-readable medium. The computer-readable medium may include program instructions, data files, data structures, and the like, alone or in combination. The program instructions recorded on the medium may be those specially designed and constructed for the present invention, or may be those known and available to those skilled in computer software. Examples of computer-readable media include magnetic media such as hard disks, floppy disks, and magnetic tape; optical media such as CD-ROMs and DVDs; magneto-optical media such as floptical disks; and hardware devices specifically configured to store and execute program instructions, such as ROM, RAM, and flash memory. Examples of program instructions include machine language code such as that produced by a compiler, as well as high-level language code that can be executed by a computer using an interpreter or the like.

While the present invention has been particularly shown and described with reference to exemplary embodiments thereof, the invention is not limited to the disclosed exemplary embodiments, and modifications that those of ordinary skill in the art can readily derive from these embodiments also fall within the scope of the present invention.

100: portable electronic device 110: icon generation unit
120: function setting unit 130: display unit
140: sensing unit 150: vector matching unit
160: function execution unit 170: communication unit
180: memory 190: control unit
200: External device

Claims (10)

A portable electronic device comprising:
An icon generating unit for generating a guide icon that has nine divided areas, with an individual function icon displayed in each of the first to eighth areas except for the center area among the nine divided areas, the guide icon providing a visual guide to the direction of a gesture so that a specific application corresponding to an icon can be executed, or a screen switched, when the user makes a gesture in a specific direction;
A function setting unit for setting first to eighth vectors having directional components extending from the first to eighth regions to the central region and mapping the functions of the individual function icons to the first to eighth vectors;
A display unit for displaying the guide icon on a display screen;
A sensing unit for receiving a gesture of the user on an area different from a display screen area displaying the guide icon and detecting a gesture direction component;
A vector matching unit for determining a vector having a direction component corresponding to the gesture direction component among direction components of the first to eighth vectors;
A function execution unit for executing the function corresponding to the determined vector;
A control unit for controlling the icon generating unit, the function setting unit, the display unit, the sensing unit, the vector matching unit, and the function executing unit; And
A memory which is executed by the control unit and stores a program for operating the icon generating unit, the function setting unit, the display unit, the sensing unit, the vector matching unit, and the function executing unit;
≪ / RTI >
The portable electronic device according to claim 1,
Wherein the function corresponding to each of the first to eighth vectors is at least one of a screen switching function, a specific application execution function, and a switching function to another guide icon.
The portable electronic device according to claim 1,
When the gesture is input while the home screen is displayed on the display screen, the function corresponding to the first to eighth vectors is a function for switching to the first to eighth home screens,
Wherein the function corresponding to the first to eighth vectors is a specific application execution function when the gesture is input in a state where a screen other than the home screen is displayed on the display screen.
The portable electronic device according to claim 1,
When the guide icon is tapped, an enlarged guide icon for editing each individual function icon of the guide icon is displayed,
Wherein the individual function icons displayed in each of the first to eighth areas can be changed in position and function according to a gesture of a user input on the enlarged guide icon.
delete
A method of executing a function using a gesture, the method comprising:
Generating a guide icon that has nine divided areas, with an individual function icon displayed in each of the first to eighth areas except for the center area among the nine divided areas, the guide icon providing a visual guide to the direction of a gesture so that a specific application corresponding to an icon can be executed, or a screen switched, when the user makes a gesture in a specific direction;
Setting first to eighth vectors having directional components that proceed from the first to eighth areas to the center area, and mapping the functions of the individual function icons to the first to eighth vectors;
Displaying the guide icon on a display screen;
Receiving a user's gesture on an area other than a display screen area displaying the guide icon and detecting a gesture direction component;
Determining a vector having a direction component corresponding to the gesture direction component among direction components of the first to eighth vectors; And
Executing the function corresponding to the determined vector.
The method according to claim 6,
Wherein the function corresponding to each of the first to eighth vectors is a screen switching function, a specific application execution function, or a switching function to another guide icon.
The method according to claim 6,
When the gesture is input while the home screen is displayed on the display screen, the function corresponding to the first to eighth vectors is a function for switching to the first to eighth home screens,
Wherein the function corresponding to the first to eighth vectors is a specific application execution function, when the gesture is input while a screen other than the home screen is displayed on the display screen.
The method according to claim 6,
When the guide icon is tapped, an enlarged guide icon for editing each individual function icon of the guide icon is displayed,
Wherein the position and function of the individual function icon displayed in each of the first to eighth areas can be changed according to a gesture of the user input on the enlarged guide icon.
delete
KR20130103445A 2013-08-29 2013-08-29 Method for activating function using gesture and mobile device thereof KR101507595B1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
KR20130103445A KR101507595B1 (en) 2013-08-29 2013-08-29 Method for activating function using gesture and mobile device thereof
PCT/KR2013/011525 WO2015030313A1 (en) 2013-08-29 2013-12-12 Method for executing function using gesture, portable electronic apparatus therefor, and computer-readable recording medium having program recorded thereon therefor

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR20130103445A KR101507595B1 (en) 2013-08-29 2013-08-29 Method for activating function using gesture and mobile device thereof

Publications (2)

Publication Number Publication Date
KR20150025631A KR20150025631A (en) 2015-03-11
KR101507595B1 true KR101507595B1 (en) 2015-04-07

Family

ID=52586834

Family Applications (1)

Application Number Title Priority Date Filing Date
KR20130103445A KR101507595B1 (en) 2013-08-29 2013-08-29 Method for activating function using gesture and mobile device thereof

Country Status (2)

Country Link
KR (1) KR101507595B1 (en)
WO (1) WO2015030313A1 (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110066981A1 (en) 2009-09-16 2011-03-17 International Business Machines Corporation Placement of items in cascading radial menus
WO2013119008A1 (en) 2012-02-07 2013-08-15 부가벤쳐스 Electronic apparatus and computer-implemented method

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110102333A1 (en) * 2009-10-30 2011-05-05 Wayne Carl Westerman Detection of Gesture Orientation on Repositionable Touch Surface
KR20110116712A (en) * 2010-04-20 2011-10-26 주식회사 인프라웨어 Computer-readable media containing the program of ui method using gesture
US20130019175A1 (en) * 2011-07-14 2013-01-17 Microsoft Corporation Submenus for context based menu system

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110066981A1 (en) 2009-09-16 2011-03-17 International Business Machines Corporation Placement of items in cascading radial menus
WO2013119008A1 (en) 2012-02-07 2013-08-15 부가벤쳐스 Electronic apparatus and computer-implemented method

Also Published As

Publication number Publication date
WO2015030313A1 (en) 2015-03-05
KR20150025631A (en) 2015-03-11

Similar Documents

Publication Publication Date Title
US11886252B2 (en) Foldable device and method of controlling the same
US10401964B2 (en) Mobile terminal and method for controlling haptic feedback
CN108139778B (en) Portable device and screen display method of portable device
AU2014200250B2 (en) Method for providing haptic effect in portable terminal, machine-readable storage medium, and portable terminal
US9013368B1 (en) Foldable mobile device and method of controlling the same
EP3629674B1 (en) Mobile terminal and control method therefor
KR102127930B1 (en) Mobile terminal and method for controlling the same
US8994678B2 (en) Techniques for programmable button on bezel of mobile terminal
EP2669786A2 (en) Method for displaying item in terminal and terminal using the same
EP2778884A2 (en) Electronic device and method for controlling screen display using temperature and humidity
EP2741207B1 (en) Method and system for providing information based on context, and computer-readable recording medium thereof
KR102168648B1 (en) User terminal apparatus and control method thereof
US20160260408A1 (en) Method for sharing screen with external display device by electronic device and electronic device
US11003328B2 (en) Touch input method through edge screen, and electronic device
KR20150081012A (en) user terminal apparatus and control method thereof
KR101929316B1 (en) Method and apparatus for displaying keypad in terminal having touchscreen
US20150106706A1 (en) Electronic device and method for controlling object display
KR20160004590A (en) Method for display window in electronic device and the device thereof
US20150067570A1 (en) Method and Apparatus for Enhancing User Interface in a Device with Touch Screen
KR20170053410A (en) Apparatus and method for displaying a muliple screen in electronic device
KR20170007966A (en) Method and apparatus for smart device manipulation utilizing sides of device
KR101507595B1 (en) Method for activating function using gesture and mobile device thereof
KR102306535B1 (en) Method for controlling device and the device
KR101165388B1 (en) Method for controlling screen using different kind of input devices and terminal unit thereof
JP2022179592A (en) Information display device

Legal Events

Date Code Title Description
A201 Request for examination
E902 Notification of reason for refusal
E701 Decision to grant or registration of patent right
LAPS Lapse due to unpaid annual fee