KR20140136393A - Method for providing user interface and smart device thereof - Google Patents

Method for providing user interface and smart device thereof Download PDF

Info

Publication number
KR20140136393A
Authority
KR
South Korea
Prior art keywords
touch
targets
touch event
user interface
finger
Prior art date
Application number
KR1020140059932A
Other languages
Korean (ko)
Inventor
조홍구
조병화
Original Assignee
조홍구
조병화
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 조홍구, 조병화
Publication of KR20140136393A publication Critical patent/KR20140136393A/en

Links

Images

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Disclosed are a method for providing a user interface and a smart device implementing the same. The smart device outputs a desktop screen on which multiple targets are arranged while a user holds the device with one hand, the targets comprising first targets placed at positions the user's finger cannot reach and second targets placed within the finger's reach. The device senses a touch operation in which a finger of the holding hand touches an arbitrary point of the desktop, drags, and is then lifted, and in response provides a user interface that moves one or more of the first targets so that the first targets can be selected.

Description

METHOD FOR PROVIDING USER INTERFACE AND SMART DEVICE THEREOF

FIELD OF THE INVENTION [0001]

The present invention relates to a method for providing a user interface and a smart device for implementing the same.

Smart devices such as smartphones, tablet notebooks, and tablet PCs increasingly ship with screens of five inches or larger. Such a large-screen smart device has the advantage of displaying a great deal of information, but it is difficult to operate while held with one hand. In other words, to select a target placed on a far part of the screen, out of reach of the hand holding the smart device, the user is inconvenienced by having to use the other hand.

SUMMARY OF THE INVENTION The present invention provides a method for providing a user interface that allows a user to select a target with the single hand holding a large-screen smart device, and a smart device implementing the same.

According to one aspect of the present invention, a method for providing a user interface comprises: outputting, by the smart device, a desktop screen on which a plurality of targets are arranged while a user holds the smart device with one hand, the targets including first targets arranged at positions the finger cannot reach and second targets arranged at positions the finger can reach; sensing a touch operation in which a finger of the holding hand touches an arbitrary point of the desktop, drags, and is then removed; and providing a user interface that moves one or more of the first targets so that the first targets can be selected.

Wherein providing the user interface comprises:

Moving the desktop screen on which the first targets are arranged to a position the finger can reach, in correspondence with the moving direction of the finger's touch.

Wherein providing the user interface comprises:

Moving the second targets in the same direction as the first targets, in correspondence with the moving direction of the finger's touch.

Wherein providing the user interface comprises:

One or more of the second targets may not be displayed on the screen, in correspondence with the moving direction of the finger's touch.

Wherein moving the desktop screen to a position the finger can reach comprises:

Switching to a screen-movable state when the touch operation is sensed, displaying a background image in the screen-movable state, and moving the desktop screen on which the first targets are arranged, separately from the background image, to a position the finger can reach in correspondence with the moving direction of the finger's touch.

Here, the background image:

May be converted to another image when a predetermined time elapses, or converted to another image when the device switches to the screen-movable state.

Wherein providing the user interface comprises:

Moving the first targets to positions the finger can reach and aligning them there, and aligning one or more of the second targets in the remaining area of the desktop.

Wherein the sensing comprises:

Generating a push-touch event by an operation of pressing the desktop, and

Determining whether the pressing operation continues for a predetermined time,

Wherein providing the user interface comprises:

Providing the user interface if the pressing operation continues for the predetermined time.

Wherein the sensing comprises:

Generating a push-touch event by an operation of pressing the desktop; generating a moving touch event by an operation of moving across the desktop while pressing; providing the user interface according to the moving touch event; generating a touch end event by an operation of ending the touch on the desktop; and restoring the desktop to its original screen according to the touch end event.

Wherein the restoring comprises:

Determining whether the touch-ended state continues for a predetermined time after the touch end event is delivered, and restoring the original screen when the predetermined time has elapsed.

Wherein providing the user interface in response to the moving touch event comprises:

Calculating movement coordinates by interpreting the moving touch event; performing a desktop switch when the difference between the start point and the end point of the x coordinate of the movement coordinates is larger than a predetermined positive threshold or smaller than a predetermined negative threshold; and providing the user interface when the difference is greater than the negative threshold and less than the positive threshold.

The positive threshold and the negative threshold:

May be set based on the x coordinate value that is the middle value of the entire width of the desktop.

Wherein providing the user interface in response to the moving touch event comprises:

Calculating movement coordinates by interpreting the moving touch event; performing a desktop switch according to the movement coordinates when the slope along the movement coordinates is determined to be close to 0; and providing the user interface when the slope along the movement coordinates is equal to or greater than a predetermined slope threshold.

Wherein providing the user interface in response to the moving touch event comprises: providing a user interface that moves the targets out of the finger's reach arranged at the upper right of the desktop to positions the finger can reach, when a lower-left moving touch event occurs while the smart device is held with the left hand; providing a user interface that moves the targets out of the finger's reach arranged at the lower right of the desktop to positions the finger can reach, when an upper-left moving touch event occurs while the smart device is held with the left hand; providing a user interface that moves the targets out of the finger's reach arranged at the upper left of the desktop to positions the finger can reach, when a lower-right moving touch event occurs while the smart device is held with the right hand; and providing a user interface that moves the targets out of the finger's reach arranged at the lower left of the desktop to positions the finger can reach, when an upper-right moving touch event occurs while the smart device is held with the right hand.

Wherein the second targets include a predefined specific icon,

Wherein providing the user interface comprises:

When the specific icon is touched, the device switches to a screen movement mode that moves the desktop screen on which the first targets are arranged to a position the finger can reach in correspondence with the moving direction of the finger's touch, or the first targets are moved and aligned at positions the finger can reach while one or more of the second targets are aligned in the remaining area of the desktop.

After providing the user interface,

After the screen movement mode is performed, the desktop may be restored to its original state when a predetermined time elapses.

According to another aspect of the present invention, a smart device comprises: a touch event input unit that outputs a desktop screen on which a plurality of targets are arranged while a user holds the smart device with one hand, the plurality of targets including first targets arranged at positions the finger cannot reach and second targets arranged at positions the finger can reach, and that generates a touch event according to the touch operation when it senses a touch operation in which a finger of the holding hand touches an arbitrary point of the desktop, drags, and is then removed; and a touch event processing unit that, according to the touch event, provides a user interface that moves one or more of the first targets so that the first targets can be selected.

The touch event processing unit,

It is possible to move the desktop screen on which the first targets are arranged to a position the finger can reach, in correspondence with the moving direction of the finger's touch.

The touch event processing unit,

The second targets may be moved in the same direction as the first targets, in correspondence with the moving direction of the finger's touch.

The touch event processing unit,

One or more of the second targets may not be displayed on the screen, in correspondence with the moving direction of the finger's touch.

The touch event processing unit,

The first targets may be moved and aligned at positions the finger can reach, and one or more of the second targets may be aligned in the remaining area of the desktop.

The smart device may further include a clock signal generating unit generating a periodic clock signal, a clock number recording unit counting the clock signal output from the clock signal generating unit to record the clock count, and an elapsed time measuring unit measuring the touch-related elapsed time from the recorded clock count and providing the measured elapsed time to the touch event processing unit.

Wherein the elapsed time measuring unit comprises:

An on-touch timer for measuring the elapsed time of a continuing touch, using the clock count recorded in the clock number recording unit, from the moment a touch object touches the touch panel; and an off-touch timer for measuring the elapsed time since touch removal, using the clock count recorded in the clock number recording unit, from the moment the touch object leaves the touch panel.

The touch event processing unit,

A touch event notification module that receives touch events from the touch event input unit; a push-touch event processing module that receives, from the touch event notification module, a push-touch event generated by an operation of the touch object pressing the touch panel; a moving touch event processing module that receives, from the touch event notification module, a moving touch event generated by the touch object moving after pressing the touch panel; a touch end event processing module that receives a touch end event from the touch event notification module; a target execution touch event processing module that receives, from the touch event notification module, a target execution touch event generated by an operation of the touch object selecting or pressing a target; an on-touch timer control module that stops or drives the on-touch timer according to a request from each of the push-touch event processing module, the moving touch event processing module, the touch end event processing module, and the target execution touch event processing module; an off-touch timer control module that stops or drives the off-touch timer according to a request from each of those modules; a screen control module that implements the user interface according to a request of the moving touch event processing module; and a target command execution module that implements the function or operation corresponding to the target.

Wherein the push-touch event processing module comprises:

Stops the operation of the off-touch timer; when the screen-movable state is not movable, sets the operation start time of the on-touch timer to the time at which the push-touch event occurred and starts the on-touch timer; measures the elapsed operation time of the on-timer while the push-touch event continues; and, if that time exceeds a threshold time, determines that the user interface is to be implemented and stops driving the on-touch timer.

The mobile touch event processing module includes:

May analyze the moving touch event to obtain moving touch coordinates and, if the screen-movable state is movable, implement the user interface according to the moving touch coordinates.

The touch end event processing module includes:

Drives the off-touch timer, setting the time at which the touch end event occurred as the off-touch timer driving start time; measures the elapsed time of the off-touch timer while the touch remains ended; and, when the elapsed time exceeds the off-timer operation limit time, stops the off-touch timer, sets the operation start time of the off-touch timer to the off-timer operation limit time, and then stops the operation of the on-touch timer,

Wherein the screen control module comprises:

The desktop can be restored according to a request of the touch end event processing module.

Wherein the target execution touch event processing module comprises:

Transfers the target information to the target command execution module and then, if the screen-movable state is not movable, stops the driving of the on-touch timer and the off-touch timer and restores the desktop to its original state.

According to embodiments of the present invention, the past inconvenience of having to use both hands to select a target on a large-screen smart device can be eliminated.

FIG. 1 illustrates a desktop and targets of a smart device to which an embodiment of the present invention is applied.
FIG. 2 shows the overall configuration of a smart device according to an embodiment of the present invention.
FIG. 3 is a block diagram illustrating a detailed configuration of the touch event input unit of FIG. 2.
FIG. 4 is a block diagram showing a detailed configuration of the touch event processing unit of FIG. 2.
FIGS. 5 to 8 illustrate the operation of the screen movement mode according to an embodiment of the present invention.
FIG. 9 illustrates the target alignment mode operation when the smart device is held in the left hand according to one embodiment of the present invention.
FIG. 10 shows the target alignment mode operation when the smart device is held in the right hand according to another embodiment of the present invention.
FIG. 11 shows the coordinate system of the desktop according to an embodiment of the present invention.
FIG. 12 shows trajectory coordinates according to an embodiment of the present invention.
FIG. 13 is a diagram showing the movement of targets along the trajectory on which a finger moves according to an embodiment of the present invention.
FIG. 14 shows the kinds of trajectories along which the finger moves according to an embodiment of the present invention.
FIG. 15 shows a series of processes of the push-touch event processing procedure according to an embodiment of the present invention.
FIG. 16 shows a series of processes of the moving touch event processing procedure according to an embodiment of the present invention.
FIG. 17 shows a series of processes of the touch end event processing procedure according to an embodiment of the present invention.
FIG. 18 shows a series of processes of the target execution touch event processing procedure according to an embodiment of the present invention.
FIG. 19 illustrates the operation of the screen movement mode according to another embodiment of the present invention.
FIG. 20 illustrates a method of performing the screen movement mode according to another embodiment of the present invention.

Hereinafter, embodiments of the present invention will be described in detail with reference to the accompanying drawings so that those skilled in the art can easily carry out the present invention. The present invention may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein. In order to clearly illustrate the present invention, parts not related to the description are omitted, and similar parts are denoted by like reference characters throughout the specification.

Throughout the specification, when an element is referred to as "comprising" another element, it means that it can further include other elements as well, not that it excludes other elements, unless specifically stated otherwise.

Also, the terms "... unit" and "... module" in the description mean units for processing at least one function or operation, which may be implemented by hardware, software, or a combination of hardware and software.

Hereinafter, a method for providing a user interface according to an embodiment of the present invention and a smart device implementing the same will be described in detail with reference to the drawings.

Here, touching or pressing with a finger in this specification includes not only directly touching the smart device but also placing the finger at a position close to the smart device.

FIG. 1 illustrates a desktop and a target of a smart device to which an embodiment of the present invention is applied.

Referring to FIG. 1, a plurality of targets are arranged on the desktop P1 of the smart device 100. The plurality of targets include first targets P3 and second targets P5.

Here, when the user holds the smart device 100 with one hand, the finger cannot reach some targets P3 among the plurality of targets. That is, these targets P3 are the first targets P3, which the finger 200 of the left hand has difficulty reaching when the smart device 100 is held with the left hand. The targets arranged at positions the finger 200 of the left hand can reach are the second targets P5.

The first target P3 and the second target P5 may be icons, buttons, or the like.

At this time, the case where the smart device 100 is held with the left hand is shown, but the present invention is not limited to this; the same operation applies when the right hand is used.

The desktop P1 refers to what is displayed on the screen of the display device, such as the first (home) screen with app icons and gadgets that appears when the smart device 100 is started, or an executed app screen containing buttons, URL links, a character input window, and the like.

FIG. 2 shows the overall configuration of a smart device according to an embodiment of the present invention, FIG. 3 is a block diagram illustrating a detailed configuration of the touch event input unit of FIG. 2, and FIG. 4 is a block diagram showing a detailed configuration of the touch event processing unit of FIG. 2.

Referring to FIG. 2, the smart device 100 includes a touch event input unit 110, a touch event processing unit 130, an elapsed time measuring unit 150, a clock number recording unit 170, and a clock signal generating unit 190.

The touch event input unit 110 generates a touch event when it senses a touch object, such as a finger 200 or a touch pen. The touch event input unit 110 may be implemented as shown in FIG. 3.

Referring to FIG. 3, the touch event input unit 110 includes a touch panel 112, attached to the touch screen 111 to sense the touch operation of the touch object, and a display device 113. The display device 113 displays the desktop P1, and targets (or gadgets) P3 are displayed on the desktop P1 shown on the display device 113 of the touch screen 111.

Referring again to FIG. 2, the touch event processing unit 130 receives a touch event from the touch event input unit 110. It then controls the movement of the desktop P1 and the targets P3 in conjunction with the touch event and the operation times of the on-touch timer 151 and the off-touch timer 153.

When the touch object generates an event at a certain position on the touch screen 111, that position is either on the screen P1 or on a target P3. In the case of a target P3, the command of the target P3 would ordinarily be executed; in the embodiment of the present invention, however, the event generated on the target P3 is processed separately so that the target P3 can be moved before its command is executed.

At this time, the touch event processing unit 130 implements a user interface that allows the user to easily select targets P3 that the finger 200 has difficulty reaching. That is, when it senses a touch operation in which the finger touches an arbitrary point on the desktop, drags, and is then removed, the touch event processing unit 130 moves one or more targets among the first targets to provide a user interface that allows the first targets to be selected. This user interface is possible in two embodiments.

According to one embodiment, in conjunction with the touch event and the operation times of the on-touch timer 151 and the off-touch timer 153, the entire desktop P1 is pulled or pushed so that targets P3 out of the finger's reach move to places the finger 200 can easily reach. That is, when an arbitrary point on the desktop P1 is touched with the finger 200, the device switches to the screen movement mode. The alignment of the targets on the screen does not change; rather, the part of the screen holding targets the finger 200 cannot reach is moved into the area the finger 200 can reach.

Specifically, the desktop screen on which the first targets P3 are arranged is moved, in correspondence with the moving direction of the finger's touch, to a position the finger can reach. The second targets P5 are moved in the same direction as the first targets P3. At this time, one or more of the second targets P5 may move off the screen and no longer be displayed, depending on the moving direction of the finger's touch.

Here, when a touch operation pulling or pushing the whole desktop P1 is sensed, the smart device 100 switches to the screen-movable state. As described above, when it has switched to the movable state, a background image is displayed. The desktop screen on which the first targets are arranged is then moved, in correspondence with the moving direction of the finger's touch, to a position the finger can reach. That is, while the background image is displayed, the desktop screen carrying the first targets moves separately over it.

At this time, the background image may be converted into another image when a predetermined time elapses, or may be converted to another image when the screen is shifted to a movable state.

Also, the background image may be a corporate logo image of the smart device manufacturer or an image selected by the user.

According to another embodiment of the present invention, when the screen-movable state is entered in conjunction with the touch event and the operation times of the on-touch timer 151 and the off-touch timer 153, targets P3 positioned on parts of the screen the finger can hardly reach are moved to positions near the finger 200 and aligned there for easy selection. That is, the first targets P3 are moved and aligned at positions the finger can reach, and one or more of the second targets P5 are aligned in the remaining area of the desktop.

Here, the touch event processing unit 130 may be implemented as shown in FIG.

Referring to FIG. 4, the touch event processing unit 130 includes a touch event notification module 131, a push-touch event processing module 132, a moving touch event processing module 133, a touch end event processing module 134, a target execution touch event processing module 135, an on-touch timer control module 136, an off-touch timer control module 137, a screen control module 138, and a target command execution module 139.

The touch event notification module 131 forwards the touch event received by the touch event input unit 110 to the corresponding processing module.

At this time, touch events are classified into push-touch events, moving touch events, touch end events, and target execution touch events. Accordingly, the processing modules are subdivided into the push-touch event processing module 132, the moving touch event processing module 133, the touch end event processing module 134, and the target execution touch event processing module 135.
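
This classification maps naturally onto a small dispatcher. The following is a minimal sketch, not taken from the patent, of how the notification module 131 might route the four event types to the four processing modules; the TouchEvent carrier, the enum, and the interface are all assumed names.

```java
// Illustrative sketch only: the patent specifies no API, so every name here
// is an assumption chosen to mirror the module structure of FIG. 4.
enum TouchEventType { PUSH, MOVE, END, TARGET_EXECUTE }

final class TouchEvent {
    final TouchEventType type;
    final int x, y;                     // touch coordinates on the touch panel
    TouchEvent(TouchEventType type, int x, int y) {
        this.type = type; this.x = x; this.y = y;
    }
}

interface TouchEventModule { void process(TouchEvent e); }

// Plays the role of the touch event notification module 131: it only routes
// events to the four specialized processing modules (132 to 135).
final class TouchEventNotifier {
    private final TouchEventModule push, move, end, targetExecute;
    TouchEventNotifier(TouchEventModule push, TouchEventModule move,
                       TouchEventModule end, TouchEventModule targetExecute) {
        this.push = push; this.move = move;
        this.end = end; this.targetExecute = targetExecute;
    }
    void dispatch(TouchEvent e) {
        switch (e.type) {
            case PUSH:           push.process(e); break;
            case MOVE:           move.process(e); break;
            case END:            end.process(e); break;
            case TARGET_EXECUTE: targetExecute.process(e); break;
        }
    }
}
```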

The push-touch event processing module 132 receives the push-touch event from the touch event notification module 131 and processes it. The push-touch event is generated by an operation of the touch object pressing the touch panel 112.

The moving touch event processing module 133 receives the moving touch event from the touch event notification module 131 and processes it. The moving touch event is generated by the touch object moving after pressing the touch panel 112.

The touch end event processing module 134 receives the touch end event from the touch event notification module 131 and processes it. The touch end event is generated by an operation of the touch object being lifted off the touch panel 112.

The target execution touch event processing module 135 receives the target execution touch event from the touch event notification module 131 and processes it. The target execution touch event is generated by an operation of the touch object selecting or pressing a target P3 of the desktop P1.

The on-touch timer control module 136 drives or stops the on-touch timer 151 according to a request from each of the touch event processing modules 132, 133, 134, and 135.

The off-touch timer control module 137 drives or stops the off-touch timer 153 in response to a request from each of the touch event processing modules 132, 133, 134, and 135.

The screen control module 138 controls the screen P1 or the targets P3 according to requests from each of the touch event processing modules 132, 133, 134, and 135, the on-touch timer control module 136, and the off-touch timer control module 137.

When a target command is received from the target execution touch event processing module 135, the target command execution module 139 executes the function or operation corresponding to the target of the target execution touch event.

Referring again to FIG. 2, the elapsed time measuring unit 150 measures an elapsed time after touching, and includes an on-touch timer 151 and an off-touch timer 153.

The on-touch timer 151 measures the elapsed time of a continuing touch, using the clock count recorded in the clock number recording unit 170, from the moment the touch object touches the touch panel (112 in FIG. 3).

The off-touch timer 153 measures the elapsed time since touch removal, using the clock count recorded in the clock number recording unit 170, from the moment the touch object leaves the touch panel (112 in FIG. 3).

The clock number recording unit 170 counts the clock signal generated by the clock signal generating unit 190, records the clock count, and outputs the clock count to the elapsed time measuring unit 150.

The clock signal generating unit 190 regularly generates a clock signal to synchronize the operation of the various hardware circuits in the system and outputs it to the clock number recording unit 170.
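
Because elapsed time is derived here from a counted clock number rather than read from a wall clock, the timers can be sketched as below. This is an illustrative reading of units 151, 153, 170, and 190; the clocks-per-millisecond constant and all identifiers are assumptions.

```java
// Sketch, assuming time is reconstructed from counted clock pulses.
final class ClockCounter {                   // clock number recording unit 170
    private long clockCount;
    void onClockPulse() { clockCount++; }    // driven by clock signal generating unit 190
    long count() { return clockCount; }
}

final class TouchTimer {                     // on-touch timer 151 / off-touch timer 153
    private final ClockCounter counter;
    private final long clocksPerMilli;       // depends on the clock frequency; assumed
    private long startCount;
    private boolean running;

    TouchTimer(ClockCounter counter, long clocksPerMilli) {
        this.counter = counter;
        this.clocksPerMilli = clocksPerMilli;
    }
    void start() { startCount = counter.count(); running = true; }
    void stop()  { running = false; }
    // Elapsed milliseconds, derived from the clocks counted since start().
    long elapsedMillis() {
        return running ? (counter.count() - startCount) / clocksPerMilli : 0;
    }
}
```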

The user interface that facilitates selection of the targets P3 the finger 200 has difficulty reaching, as described above, will now be described specifically through the following examples.

FIGS. 5 to 8 illustrate the operation of the screen movement mode according to an embodiment of the present invention, corresponding to the embodiment in which the screen is moved.

Referring to FIG. 5, the finger 200 of the holding hand (assuming the smart device 100 is gripped with the left hand at its lower-left portion) presses an arbitrary point on the desktop P1 (①). Any point on the desktop P1 will do, regardless of where the targets P3 are located.

Then, while the pressing operation continues, the finger 200 is dragged toward the lower left so that the dotted, colored targets P3 can be selected easily (②). As shown in FIG. 6, after the targets P3 have been pulled to the desired position, the finger 200 is lifted off the desktop P1 (③). Then, as shown in FIG. 7, the dotted, colored targets P3 are displayed on the desktop P1, moved to a position close to the left hand.

At this time, if the sequence ①, ②, ③ is repeated within a predetermined time, the dotted, colored targets P3 can be pulled still closer to the left hand. That is, the pulling can be repeated within the predetermined time rather than performed only once.

Then, as shown in FIG. 8, if the desired target A_1_4 is selected and executed within a predetermined time while the dotted, colored targets P3 remain moved, that is, within a predetermined time after the targets are moved, the function or operation corresponding to that specific target is executed.

If no specific target P3 is selected within the predetermined time after the dotted, colored targets P3 are moved, the screen is restored to the screen shown in FIG. 5, that is, the first screen (Display 1_A_1).

Next, FIGS. 9 and 10 illustrate the operation of the target alignment mode according to an embodiment of the present invention, corresponding to the embodiment in which the targets themselves are moved and aligned.

FIG. 9 shows the operation of the target alignment mode when the smart device is held with the left hand according to an embodiment of the present invention, corresponding to the case where the targets are pulled toward the lower left with the left hand.

As shown in FIG. 9(a), when the screen-movable state arises at the lower left of the screen, the dotted, colored targets P3 are rearranged on the desktop P1 as shown in FIG. 9(b). That is, the targets A_1_1, A_1_2, A_1_3, and A_1_4 located at the top of the desktop P1 and the targets A_2_4, A_3_4, A_4_4, and A_5_4 located in a region relatively far from the finger 200 are moved to positions on the desktop P1 that the finger 200 can reach, and the remaining targets P3 are arranged in the remaining area of the desktop P1.

FIG. 10 illustrates the operation of the target alignment mode when the smart device is held with the right hand according to another embodiment of the present invention, corresponding to the case where the targets are pulled toward the lower right with the right hand.

As shown in FIG. 10(a), when the screen-movable state arises at the lower right of the screen, the dotted, colored targets P3 are rearranged on the desktop P1 as shown in FIG. 10(b). That is, the targets A_1_1, A_1_2, A_1_3, and A_1_4 located at the top of the desktop P1 and the targets A_2_1, A_3_1, A_4_1, and A_5_1 located in a region relatively far from the finger 200 are moved to positions on the desktop P1 that the finger 200 can reach, and the remaining targets P3 are arranged in the remaining area of the desktop P1.

As in FIG. 8, if the target A_1_4 is selected and executed within a predetermined time after the dotted, colored targets P3 are moved, that is, if a specific target is executed within the predetermined time after the targets are moved, the function or operation corresponding to that target is executed. Also, if no target P3 is selected within the predetermined time after the dotted, colored targets P3 are moved and the touch has ended, the targets P3 return to their original positions.
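
One way to picture the target alignment mode of FIGS. 9 and 10 is as a swap between unreachable and reachable grid cells, sketched below. The grid split used as the reachability rule, the Target type, and the method names are assumptions, not details from the patent.

```java
import java.awt.Point;
import java.util.ArrayDeque;
import java.util.Deque;
import java.util.List;

// Illustrative sketch of the target alignment mode: first targets (out of
// reach) swap cells with second targets (in reach); displaced second targets
// end up in the remaining, far area of the desktop.
final class TargetAligner {
    static final class Target {
        Point cell;                          // grid cell occupied by the target
        Target(Point cell) { this.cell = cell; }
    }

    // Assumed rule: with a left-hand grip the finger reaches the lower-left
    // quadrant of the target grid (compare FIG. 9).
    static boolean reachableLeftHand(Point cell, int cols, int rows) {
        return cell.x < cols / 2 && cell.y >= rows / 2;
    }

    static void alignForLeftHand(List<Target> targets, int cols, int rows) {
        Deque<Target> reachable = new ArrayDeque<>();
        for (Target t : targets)
            if (reachableLeftHand(t.cell, cols, rows)) reachable.add(t);
        for (Target far : targets) {
            if (reachableLeftHand(far.cell, cols, rows)) continue;
            Target near = reachable.poll();
            if (near == null) break;         // no reachable cells remain
            Point tmp = far.cell;            // swap: the far target moves near,
            far.cell = near.cell;            // the displaced second target takes
            near.cell = tmp;                 // the remaining far cell
        }
    }
}
```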

FIG. 11 shows the coordinate system of the desktop according to an embodiment of the present invention, FIG. 12 shows trajectory coordinates according to an embodiment of the present invention, FIG. 13 is a diagram showing the movement of targets along the trajectory on which the finger moves according to an embodiment of the present invention, and FIG. 14 shows the kinds of trajectories along which the finger moves according to an embodiment of the present invention.

Referring to FIG. 11, the desktop P1 has a coordinate system. It is assumed that the upper-left end point of the desktop P1 is the start point (0, 0) and the lower-right end point is the end point (600, 800).

Referring to FIG. 12, trajectory coordinates are mapped according to their trajectory coordinate numbers. When the trajectory along which the finger moves is trajectory 1 as shown in FIG. 13, it can be seen that the finger 200 moves toward the lower left. That is, when the finger 200 moves along trajectory 1, the targets P3 of the desktop P1 are moved by the amount of coordinate movement in the order of the trajectory coordinate numbers.

Specifically, the coordinate movement between trajectory coordinate number 1 and trajectory coordinate number 2 shifts the original coordinates of the targets P3 on the desktop P1 by 1 to the left on the x axis and by 2 on the y axis. That is, the targets P3 of the desktop P1 move in conjunction with the trajectory along which the finger 200 moves. When the finger 200 moves off the desktop P1, the last coordinate on the desktop is taken as the coordinate at which the finger 200 was removed.
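
In code, this trajectory-linked movement amounts to applying the delta between consecutive trajectory coordinates to every target, as in the brief sketch below; the types and the trajectory representation are illustrative. For the step just described, a segment from (x, y) to (x - 1, y + 2) shifts every target by 1 to the left on the x axis and by 2 on the y axis.

```java
import java.awt.Point;
import java.util.List;

// Sketch of trajectory-linked movement (FIGS. 12 and 13): each consecutive
// pair of trajectory coordinates yields a delta, and every target on the
// desktop P1 is shifted by that delta, in trajectory-number order.
final class TrajectoryMover {
    static void follow(int[][] trajectory, List<Point> targetPositions) {
        for (int i = 1; i < trajectory.length; i++) {
            int dx = trajectory[i][0] - trajectory[i - 1][0];
            int dy = trajectory[i][1] - trajectory[i - 1][1];
            for (Point p : targetPositions) p.translate(dx, dy);
        }
    }
}
```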

Here, as shown in FIG. 14, the trajectories along which the finger moves may be of various kinds.

FIG. 14(a) shows a trajectory in which the finger moves to the lower left, FIG. 14(b) a trajectory in which it moves to the upper left, FIG. 14(c) a trajectory in which it moves horizontally to the left, FIG. 14(d) a trajectory in which it moves to the lower right, FIG. 14(e) a trajectory in which it moves to the upper right, and FIG. 14(f) a trajectory in which it moves horizontally to the right.

When a touch event occurs, the touch event processing unit 130 needs to distinguish whether the touch event calls for implementing the user interface according to an embodiment of the present invention, that is, moving the targets P3 of the desktop P1, or for the operation of moving to the previous screen P1 or the next screen P1.

The two cases are distinguished as follows. According to an embodiment of the present invention, the touch event processing unit 130 sets the x coordinate = 300, the middle value of the entire desktop width, as the movement boundary value on the desktop P1 having the coordinate system shown in FIG. 11.

When the difference between the start point and the end point of the x coordinate of the trajectory along which the finger 200 moves is larger than the positive movement boundary value, the touch event processing unit 130 regards the touch event as a movement to the next desktop. Therefore, the screen moves from the current desktop to the next desktop and displays it.

The touch event processing unit 130 regards the touch event as a movement to the previous desktop when the difference between the start point and the end point of the x coordinate of the trajectory along which the finger 200 moves is smaller than the negative movement boundary value. Therefore, the screen moves from the current desktop to the previous desktop and displays it.

After the movement, the targets P3 of the corresponding desktop P1 are displayed at their initial positions, as in the screen shown in FIG. 5, that is, the first screen (Display 1_A_1).

On the other hand, when the difference between the start point and the end point of the x coordinate of the finger's trajectory is larger than the negative movement boundary value and smaller than the positive movement boundary value, the touch event processing unit 130 moves the desktop P1 or moves the targets P3 as described above.

If, as a result of analyzing the moving touch event, the touch event processing unit 130 determines that the movement is close to a straight line toward the left or the right, it moves the desktop P1 to the previous or the next desktop. Here, whether the movement is close to a straight line is determined by checking whether the slope between the start point and the end point falls below a predetermined value. Conversely, if the movement is determined not to be close to a straight line, that is, if the slope between the start point and the end point is equal to or greater than the predetermined value, the desktop P1 is moved or the targets P3 are moved as described above.
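
Taken together, the boundary test and the straight-line test yield a three-way classifier, sketched below. The patent presents the x-difference test and the slope test as alternatives (compare claims 11 and 13); combining them in one method, and the slope limit value itself, are illustrative choices.

```java
// Sketch of the gesture discrimination, assuming the 600x800 coordinate
// system of FIG. 11 (movement boundary = half the desktop width = 300).
final class GestureClassifier {
    enum Action { NEXT_DESKTOP, PREVIOUS_DESKTOP, MOVE_SCREEN_OR_TARGETS }

    static final int MOVE_BOUNDARY = 300;    // x = middle of the desktop width
    static final double SLOPE_LIMIT = 0.2;   // "close to a straight line"; assumed value

    static Action classify(int startX, int startY, int endX, int endY) {
        int dx = startX - endX;              // start-minus-end x difference, as in the text
        double slope = (endX == startX)
                ? Double.MAX_VALUE
                : (double) (endY - startY) / (endX - startX);
        boolean nearlyStraight = Math.abs(slope) < SLOPE_LIMIT;
        if (nearlyStraight && dx > MOVE_BOUNDARY)  return Action.NEXT_DESKTOP;
        if (nearlyStraight && dx < -MOVE_BOUNDARY) return Action.PREVIOUS_DESKTOP;
        return Action.MOVE_SCREEN_OR_TARGETS;    // otherwise: implement the UI
    }
}
```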

Now, a method of providing a user interface according to an embodiment of the present invention will be described.

FIG. 15 shows a series of processes of the push-touch event processing procedure according to an embodiment of the present invention.

Referring to FIG. 15, when the touch object performs a predetermined operation on the touch screen 111, the touch event input unit 110 generates a push-touch event (S101). Here, the push-touch event occurs according to an operation of the touch object pressing the screen of the touch screen 111.

The touch event notification module 131 receives the push-touch event from the touch event input unit 110 (S103) and transmits it to the push-touch event processing module 132 (S105).

The push-touch event processing module 132 analyzes the push-touch event to acquire the touch coordinates (S107), that is, the x coordinate (downX1) and the y coordinate (downY1) of the position where the push-touch event occurred.

The push-touch event processing module 132 also requests the off-touch timer control module 137 to stop the driving of the off-touch timer (S109). The off-touch timer control module 137 then requests the off-touch timer 153 to stop its operation (S111), and the operation of the off-touch timer 153 is stopped (S113).

Meanwhile, the push-touch event processing module 132 checks the screen-movable state (isMovable) to determine whether the screen is movable or not movable (S115).

At this time, if the screen is movable, the step ends. If it is not movable, the time at which the push-touch event occurred is set as the operation start time (startOnTouchTime) of the on-touch timer 151, and the on-touch timer control module 136 is requested to drive the on-touch timer (S117). The operation of the on-touch timer 151 thus starts upon receipt of the push-touch event.

The on-touch timer control module 136 requests the on-touch timer 151 to start (S119), and the on-touch timer 151 is driven (S121).

The clock signal generating unit 190 generates a clock signal (S123) and transmits it to the clock number recording unit 170 (S125). The clock number recording unit 170 records the clock count (S127) and then transfers the clock count to the on-touch timer 151 (S129).

The on-touch timer 151 acquires the clock count received when its operation started and the current clock count (S131, S133), and transmits the clock counts to the push-touch event processing module 132 (S135).

The push-touch event processing module 132 then calculates the on-timer operation elapsed time (onContinuationTime) (S137), using the operation start time (startOnTouchTime), which was set when the push-touch event was received and the on-touch timer 151 started, and the current time (nowTime).

The push-touch event processing module 132 determines whether the elapsed time measured in step S137 exceeds the predefined threshold, that is, the on-timer operation limit time (limitOnTouchTime) (S139).

At this time, if the limit is not exceeded, step S137 is performed again; that is, the screen-movable state (isMovable) is kept in the not-movable state.

On the other hand, if the limit is exceeded, the push-touch event processing module 132 determines that the state is movable and decides on the screen or target movement mode (S141). The push-touch event processing module 132 requests the on-touch timer control module 136 to stop the on-touch timer (S143). The on-touch timer control module 136 then stops the driving of the on-touch timer 151 (S145, S147).
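
The flow of FIG. 15 reduces to a press handler plus a timer tick, sketched below with the variable names the text itself uses (isMovable, startOnTouchTime, onContinuationTime, limitOnTouchTime). Plain millisecond timestamps replace the clock-count timers for brevity, and the 500 ms threshold is an assumed value.

```java
// Sketch of the push-touch event processing of FIG. 15 (step numbers in
// comments refer to that figure).
final class PushTouchHandler {
    private boolean isMovable;                          // screen-movable state
    private long startOnTouchTime;
    private static final long limitOnTouchTime = 500;   // ms; assumed value

    // S101..S121: a push-touch event arrives at coordinates (downX1, downY1).
    void onPushTouch(int downX1, int downY1, long nowTime) {
        stopOffTouchTimer();                            // S109..S113
        if (!isMovable) {                               // S115
            startOnTouchTime = nowTime;                 // S117..S121: start on-touch timer
        }
    }

    // S131..S147: called on each on-touch timer tick while the press is held.
    void onOnTouchTimerTick(long nowTime) {
        long onContinuationTime = nowTime - startOnTouchTime;       // S137
        if (!isMovable && onContinuationTime > limitOnTouchTime) {  // S139
            isMovable = true;                           // S141: screen/target move mode
            stopOnTouchTimer();                         // S143..S147
        }
    }

    private void stopOffTouchTimer() { /* stop off-touch timer 153 */ }
    private void stopOnTouchTimer()  { /* stop on-touch timer 151 */ }
}
```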

FIG. 16 shows a series of processes of the moving touch event processing procedure according to an embodiment of the present invention.

Referring to FIG. 16, when the touch object performs a predetermined operation on the touch screen 111, the touch event input unit 110 generates a moving touch event (S201). Here, the moving touch event is a touch event in which the touch object moves across the touch screen while pressing the screen of the touch screen 111.

The touch event notification module 131 receives the moving touch event from the touch event input unit 110 (S203) and transfers it to the moving touch event processing module 133 (S205).

The moving touch event processing module 133 analyzes the moving touch event to acquire the moving touch coordinates (S207), that is, the x coordinate (downX2) and the y coordinate (downY2) at which the moving touch event occurred.

At this time, the moving touch event processing module 133 determines whether the screen-movable state (isMovable) is movable or not movable (S209).

If it is not in the movable state, the step ends.

If it is in the movable state, the movement distance (leftMargin) on the x coordinate and the movement distance (topMargin) on the y coordinate are calculated from the coordinates obtained in step S207 (S211).

Here, the movement distance of the moving touch event is obtained as the difference between the coordinates of the previous moving touch event and the coordinates of the current moving touch event. The movement distance (leftMargin) on the x coordinate is the difference between the x coordinate (downX2) of the current moving touch event and the x coordinate (downX1) of the previous one, and the movement distance (topMargin) on the y coordinate is the difference between the y coordinate (downY2) of the current moving touch event and the y coordinate (downY1) of the previous one.

The moving touch event processing module 133 transmits the screen movement distance calculated in step S211 to the screen control module 138 (S213). The screen control module 138 then moves the desktop P1 or the targets P3 by the screen movement distance, that is, by the movement distance (leftMargin) on the x coordinate and the movement distance (topMargin) on the y coordinate (S215). Here, movement of the targets P3 means target alignment.
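
FIG. 16 thus amounts to a per-event delta computation, sketched below with the document's own names (downX1, downY1, downX2, downY2, leftMargin, topMargin); the ScreenControl interface stands in for the screen control module 138 and is an assumption.

```java
// Sketch of the moving touch event processing of FIG. 16.
final class MoveTouchHandler {
    interface ScreenControl { void moveBy(int dx, int dy); }  // module 138 stand-in

    private final ScreenControl screen;
    private boolean isMovable;           // set by the push-touch handler (FIG. 15)
    private int downX1, downY1;          // coordinates of the previous moving touch event

    MoveTouchHandler(ScreenControl screen) { this.screen = screen; }

    void onMoveTouch(int downX2, int downY2) {    // S201..S207
        if (isMovable) {                          // S209
            int leftMargin = downX2 - downX1;     // S211: movement distance on x
            int topMargin  = downY2 - downY1;     //        movement distance on y
            screen.moveBy(leftMargin, topMargin); // S213..S215: move desktop/targets
        }
        downX1 = downX2;                          // remember for the next event
        downY1 = downY2;
    }
}
```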

FIG. 17 shows a series of processes of the touch end event processing procedure according to an embodiment of the present invention.

Referring to FIG. 17, when the touch object performs a predetermined operation on the touch screen 111, the touch event input unit 110 generates a touch end event (S301). Here, the touch end event is generated by an operation of the touch object leaving the screen of the touch screen.

The touch event notification module 131 receives a touch end event from the touch event input unit 110 (S303) and transmits the touch end event to the touch end event processing module 134 (S305).

Meanwhile, the touch event notification module 131 may also receive a target touch end event, which occurs when the touch object lifts off a target of the touch screen 111. In this case, no particular action is taken.

When the touch end event is received, the touch end event processing module 134 requests the off-touch timer control module 137 to drive the off-touch timer (S307), and the off-touch timer 153 is driven (S309 and S311). The off-touch timer start time (startOffTouchTime) is then set (S313) to the time at which the touch end event occurred; step S307 is performed simultaneously with this setting (S313).

The touch end event processing module 134 calculates the off-touch timer elapsed time (offContinuationTime) (S315). At this time, the elapsed time of the off-touch timer during the touch-ended period is calculated using the operation start time (startOffTouchTime) of the off-timer and the current time (nowTime).

The touch end event processing module 134 determines whether the off-timer operation elapsed time (offContinuationTime) exceeds the off-timer operation limit time (limitOffTouchTime) (S317). This is determined by checking whether the difference between the off-timer operation elapsed time (offContinuationTime) and the off-timer operation limit time (limitOffTouchTime) is greater than zero.

If it does, the touch end event processing module 134 requests the off-touch timer control module 137 to stop the operation (S319), and the operation of the off-touch timer 153 is stopped (S321 and S323).

After step S323, the touch end event processing module 134 sets the screen-movable state (isMovable) to the not-movable state (S325); if the limit is not exceeded, the movable state is kept.

Next, the touch end event processing module 134 determines whether the screen has been moved (S327). That is, it checks the screen-movable state (isMovable) and ends the step if the state is movable.

However, if the screen-movable state (isMovable) is not movable, the touch end event processing module 134 sets the operation start time (startOffTouchTime) of the off-touch timer to the off-timer operation limit time (limitOffTouchTime) and stops the operation of the on-touch timer 151 (S331). The touch end event is then transmitted to the screen control module 138 (S333), after which the screen and the targets are returned to their original positions (S335).
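
FIG. 17 can likewise be sketched as an off-touch timer that, once its limit passes, clears the movable state and restores the screen and targets. The millisecond timing and the 1000 ms limit are simplifying assumptions in place of the clock-count timers.

```java
// Sketch of the touch end event processing of FIG. 17.
final class TouchEndHandler {
    private boolean isMovable;
    private long startOffTouchTime;
    private static final long limitOffTouchTime = 1000;  // ms; assumed value

    void onTouchEnd(long nowTime) {               // S301..S313
        startOffTouchTime = nowTime;              // start the off-touch timer
    }

    // Called on each off-touch timer tick after the finger lifts (S315..S335).
    void onOffTouchTimerTick(long nowTime) {
        long offContinuationTime = nowTime - startOffTouchTime;     // S315
        if (offContinuationTime > limitOffTouchTime) {              // S317
            stopOffTouchTimer();                  // S319..S323
            isMovable = false;                    // S325: screen no longer movable
            restoreScreenAndTargets();            // S333..S335: back to original positions
        }
    }

    private void stopOffTouchTimer()       { /* off-touch timer 153 */ }
    private void restoreScreenAndTargets() { /* via screen control module 138 */ }
}
```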

FIG. 18 shows a series of processes of the target execution touch event processing procedure according to an embodiment of the present invention.

Referring to FIG. 18, when the touch object operates on the touch screen 111, the touch event input unit 110 generates a target execution touch event (S401). Here, the target execution touch event is generated when the touch object presses a target of the touch screen.

The touch event notification module 131 receives the target execution touch event from the touch event input unit 110 (S403) and transfers it to the target execution touch event processing module 135 (S405).

The target execution touch event processing module 135 transfers the information of the target where the target execution touch event occurred to the target command execution module 139 (S407). The target command execution module 139 then implements the function or operation corresponding to the target. Here, the target information transmitted in step S407 carries a predetermined command, such as that of an icon, a button, or a URL link. The target command execution module 139 activates the corresponding application if the target is an icon, moves to the corresponding URL link if it is a URL link, and performs the corresponding function if it is a button. That is, it executes whatever command the target has.

Meanwhile, the target execution touch event processing module 135 determines whether the screen has been moved (S411) and, according to the screen-movable state (isMovable), stops the driving of the on-touch timer 151 and the off-touch timer 153 (S413) and restores the screen to its original position.
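
The dispatch from step S407 onward might look like the sketch below. The three target kinds follow the icon, button, and URL-link examples in the text; the payload field and the method bodies are assumptions.

```java
// Sketch of target command execution (FIG. 18): the executed command depends
// on the kind of target that was touched.
final class TargetCommandExecutor {
    enum Kind { ICON, BUTTON, URL_LINK }

    static final class Target {
        final Kind kind;
        final String payload;                // app id, button action, or URL; assumed
        Target(Kind kind, String payload) { this.kind = kind; this.payload = payload; }
    }

    void execute(Target target) {            // from S407: run the target's command
        switch (target.kind) {
            case ICON:     launchApplication(target.payload); break;
            case URL_LINK: openUrl(target.payload);           break;
            case BUTTON:   performFunction(target.payload);   break;
        }
    }

    private void launchApplication(String appId) { /* activate the application */ }
    private void openUrl(String url)             { /* move to the URL link */ }
    private void performFunction(String action)  { /* perform the button's function */ }
}
```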

According to the description above, when the finger touches the screen, the on-touch and off-touch timers 151 and 153 are driven, and the device automatically switches the screen movement mode when the pressed state and the released state each continue for a predetermined time or more.

Meanwhile, the switch to the screen movement mode may also be implemented as in the following embodiment.

FIG. 19 illustrates an operation of performing a screen movement mode according to another embodiment of the present invention, and FIG. 20 illustrates a method of performing a screen movement mode according to another embodiment of the present invention.

Referring to FIG. 19, an icon P7 for switching to the screen movement mode is always displayed in an area of the desktop screen that the finger can reach. If the user touches the icon P7 with a finger, the desktop switches to a screen movement mode in which targets P3 the finger cannot reach can be pulled into the finger's area. The screen and the targets are then restored to their original state after a predetermined time, following the procedure of FIG. 17.

Referring to FIG. 20, when the touch object touches the icon P7 displayed on the touch screen 111, the touch event input unit 110 generates a push-touch event (S501).

The touch event notification module 131 receives the push-touch event from the touch event input unit 110 (S503) and transmits it to the push-touch event processing module 132 (S505).

The push-touch event processing module 132 analyzes the push-touch event received in step S505 (S507) and determines whether the touch pressed the mode-switch icon P7 (S509).

If the icon P7 has been touched, the device switches to the screen or target movement mode (S511).
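
The icon-triggered variant of FIGS. 19 and 20 needs no press-and-hold timer, only a hit test on the icon P7, as in this sketch; the icon bounds and the rectangular hit test are assumed details.

```java
// Sketch of the mode switch of FIG. 20: a push-touch on icon P7 enters the
// screen/target movement mode directly.
final class ModeSwitchIcon {
    private final int iconX, iconY, iconSize;    // bounds of icon P7; assumed layout
    private boolean isMovable;

    ModeSwitchIcon(int x, int y, int size) { iconX = x; iconY = y; iconSize = size; }

    void onPushTouch(int downX1, int downY1) {   // S501..S509
        boolean hitsIcon = downX1 >= iconX && downX1 < iconX + iconSize
                        && downY1 >= iconY && downY1 < iconY + iconSize;
        if (hitsIcon) isMovable = true;          // S511: screen/target movement mode
    }
}
```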

The embodiments of the present invention described above are implemented not only through the apparatus and the method, but may also be implemented through a program realizing functions corresponding to the configuration of the embodiments, or through a recording medium on which such a program is recorded.

While the present invention has been particularly shown and described with reference to exemplary embodiments thereof, the invention is not limited to the disclosed embodiments; various modifications made by those skilled in the art using the basic concept of the present invention defined in the following claims also belong to the scope of the invention.

Claims (28)

A method for providing a user interface, comprising:
Outputting, by a smart device held by a user with one hand, a desktop screen on which a plurality of targets are arranged, the plurality of targets including first targets arranged at positions a finger cannot reach and second targets arranged at positions the finger can reach;
Sensing, by the smart device, a touch operation in which a finger of the holding hand touches an arbitrary point of the desktop, drags, and is then removed; and
Providing, by the smart device, a user interface that moves one or more targets among the first targets so that the first targets can be selected.
The method according to claim 1,
Wherein providing the user interface comprises:
Moving the desktop screen on which the first targets are arranged in a direction corresponding to the moving direction of the finger's touch.
3. The method of claim 2,
Wherein providing the user interface comprises:
Moving the second targets in the same direction as the first targets, in correspondence with the moving direction of the finger's touch.
The method of claim 3,
Wherein providing the user interface comprises:
Wherein one or more of the second targets are not displayed on the screen, in correspondence with the moving direction of the finger's touch.
3. The method of claim 2,
Wherein moving the desktop screen to a position the finger can reach comprises:
Switching to a screen-movable state when the touch operation is sensed,
Displaying a background image in the screen-movable state, and
Moving the desktop screen on which the first targets are arranged, separately from the background image, to a position the finger can reach in correspondence with the moving direction of the finger's touch.
6. The method of claim 5,
Wherein the background image is converted to another image when a predetermined time elapses, or is converted to another image when the device switches to the screen-movable state.
The method according to claim 1,
Wherein providing the user interface comprises:
Moving the first targets to positions the finger can reach and aligning them there, and aligning one or more of the second targets in the remaining area of the desktop.
The method according to claim 1,
Wherein the sensing comprises:
Generating a push-touch event by an operation of pressing the desktop, and
Determining whether the pressing operation continues for a predetermined time,
Wherein providing the user interface comprises:
Providing the user interface if the pressing operation continues for the predetermined time.
The method according to claim 1,
Wherein the sensing comprises:
Generating a push-touch event by an operation of pressing the desktop,
Generating a moving touch event by an operation of moving across the desktop while pressing,
Providing the user interface according to the moving touch event,
Generating a touch end event by an operation of ending the touch on the desktop, and
Restoring the desktop to its original screen according to the touch end event.
10. The method of claim 9,
Wherein the restoring comprises:
Determining whether the touch-ended state continues for a predetermined time after the touch end event is delivered, and
Restoring the original screen when the predetermined time has elapsed.
10. The method of claim 9,
Wherein providing the user interface in response to the moving touch event comprises:
Interpreting the moving touch event to calculate movement coordinates,
Performing a desktop switch according to the movement coordinates when the difference between the start point and the end point of the x coordinate of the movement coordinates is larger than a predetermined positive threshold or smaller than a predetermined negative threshold, and
Providing the user interface when the difference is greater than the negative threshold and less than the positive threshold.
12. The method of claim 11,
Wherein the positive threshold value and the negative threshold value are x coordinate values corresponding to the intermediate value of the entire width of the desktop.
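Claims 11 and 12 together describe a horizontal-distance test whose thresholds sit at half the desktop width; a minimal sketch under that reading (the Gesture type and function names are invented):

    // Hypothetical sketch: horizontal travel beyond half the desktop width is
    // treated as a desktop (page) transition; anything shorter invokes the
    // one-hand user interface instead.
    enum class Gesture { DESKTOP_TRANSITION, ONE_HAND_UI }

    fun classify(startX: Float, endX: Float, desktopWidth: Float): Gesture {
        val positiveThreshold = desktopWidth / 2   // midpoint of the full width
        val negativeThreshold = -positiveThreshold
        val dx = endX - startX
        return if (dx > positiveThreshold || dx < negativeThreshold)
            Gesture.DESKTOP_TRANSITION
        else
            Gesture.ONE_HAND_UI
    }

    fun main() {
        println(classify(100f, 900f, desktopWidth = 1080f)) // DESKTOP_TRANSITION
        println(classify(600f, 450f, desktopWidth = 1080f)) // ONE_HAND_UI
    }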
13. The method of claim 9,
wherein providing the user interface in accordance with the movement touch event comprises:
interpreting the movement touch event to calculate movement coordinates;
performing a desktop transition according to the movement coordinates when the slope according to the movement coordinates is determined to be close to 0; and
providing the user interface if the slope according to the movement coordinates is greater than or equal to a predetermined slope threshold.
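A minimal sketch of the slope test in claim 13; the patent states only "close to 0" and "a predetermined slope threshold", so the tolerance values below are assumptions:

    // Hypothetical sketch: a nearly horizontal drag (slope near 0) pages the
    // desktop, while a steep drag pulls the out-of-reach targets to the finger.
    import kotlin.math.abs

    fun classifyBySlope(
        dx: Float,
        dy: Float,
        nearZero: Float = 0.2f,      // assumed tolerance for "close to 0"
        slopeThreshold: Float = 1.0f // assumed "predetermined slope threshold"
    ): String {
        if (dx == 0f) return "one-hand UI"          // vertical drag: steepest case
        val slope = abs(dy / dx)
        return when {
            slope <= nearZero -> "desktop transition"
            slope >= slopeThreshold -> "one-hand UI"
            else -> "no action"
        }
    }

    fun main() {
        println(classifyBySlope(dx = 300f, dy = 20f))  // desktop transition
        println(classifyBySlope(dx = 100f, dy = 250f)) // one-hand UI
    }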
14. The method of claim 9,
wherein providing the user interface in accordance with the movement touch event comprises:
providing a user interface that moves targets arranged on the upper right of the desktop, out of the finger's reach, to the position where the finger touches, when a left-downward movement event occurs while the smart device is held in the left hand;
providing a user interface that moves targets arranged on the lower right of the desktop, out of the finger's reach, to the position where the finger touches, when a left-upward movement event occurs while the smart device is held in the left hand;
providing a user interface that moves targets arranged on the upper left of the desktop, out of the finger's reach, to the position where the finger touches, when a right-downward movement event occurs while the smart device is held in the right hand; and
providing a user interface that moves targets arranged on the lower left of the desktop, out of the finger's reach, to the position where the finger touches, when a right-upward movement event occurs while the smart device is held in the right hand.
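Claim 14 enumerates four hand-and-direction cases; a compact sketch of that mapping (enum and function names invented):

    // Hypothetical sketch: which desktop corner's targets are pulled toward the
    // finger for each holding hand and drag direction, per the four cases above.
    enum class Hand { LEFT, RIGHT }
    enum class Drag { LEFT_DOWN, LEFT_UP, RIGHT_DOWN, RIGHT_UP }
    enum class Corner { UPPER_RIGHT, LOWER_RIGHT, UPPER_LEFT, LOWER_LEFT }

    fun cornerToPull(hand: Hand, drag: Drag): Corner? = when {
        hand == Hand.LEFT && drag == Drag.LEFT_DOWN -> Corner.UPPER_RIGHT
        hand == Hand.LEFT && drag == Drag.LEFT_UP -> Corner.LOWER_RIGHT
        hand == Hand.RIGHT && drag == Drag.RIGHT_DOWN -> Corner.UPPER_LEFT
        hand == Hand.RIGHT && drag == Drag.RIGHT_UP -> Corner.LOWER_LEFT
        else -> null // combination not covered by the claim
    }

    fun main() {
        println(cornerToPull(Hand.LEFT, Drag.LEFT_DOWN)) // UPPER_RIGHT
        println(cornerToPull(Hand.RIGHT, Drag.RIGHT_UP)) // LOWER_LEFT
    }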
The method according to claim 1,
Wherein the second targets include a predefined specific icon, and
wherein providing the user interface comprises:
performing, when the specific icon is touched and dragged, a screen movement mode that moves the desktop screen on which the first targets are aligned to the position touched by the finger, corresponding to the moving direction of the finger's touch, and aligning one or more targets of the second targets in the remaining area of the desktop.
16. The method of claim 15,
Further comprising, after providing the user interface, restoring the desktop to its original state when a predetermined time elapses after the screen movement mode is performed.
A smart device comprising:
a touch event input unit which, in a state where a desktop screen is displayed on which a plurality of targets are sorted, the plurality of targets including first targets arranged at positions the finger does not touch and second targets arranged at positions the finger touches while the smart device is held by a user with one hand, generates a touch event when the user touches an arbitrary point on the desktop with a finger of that hand and then drags the finger while maintaining the touch; and
a touch event processing unit which provides a user interface for selecting one of the first targets by moving one or more targets of the first targets according to the touch event.
18. The smart device of claim 17,
wherein the touch event processing unit moves the desktop screen on which the first targets are arranged to the position where the finger touches, in correspondence with the moving direction of the finger's touch.
19. The smart device of claim 18,
wherein the touch event processing unit moves the second targets in the same direction as the first targets, corresponding to the moving direction of the finger's touch.
20. The smart device of claim 19,
wherein the touch event processing unit causes one or more targets of the second targets not to be displayed on the screen, in correspondence with the moving direction of the finger's touch.
21. The smart device of claim 17,
wherein the touch event processing unit aligns the first targets by moving them to the position where the finger touches, and aligns one or more targets of the second targets in the remaining area of the desktop.
22. The smart device of claim 17, further comprising:
a clock signal oscillation unit for generating a periodic clock signal;
a clock number recording unit for counting the clock signal output from the clock signal oscillation unit and recording the number of clocks; and
an elapsed time measuring unit for measuring a touch-related elapsed time using the number of clocks output by the clock number recording unit and providing the elapsed time to the touch event processing unit.
23. The smart device of claim 22,
wherein the elapsed time measuring unit comprises:
an on-touch timer for measuring the elapsed time during which the touch continues, using the number of clocks recorded in the clock number recording unit from the moment the touch event trigger contacts the touch panel; and
an off-touch timer for measuring the elapsed time after the touch is released, using the number of clocks recorded in the clock number recording unit from the moment the touch event trigger leaves the touch panel.
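Claims 22 and 23 describe timing built from counted clock ticks rather than wall-clock reads; a minimal sketch under that reading, with an assumed 10 ms tick period and invented names:

    // Hypothetical sketch: a tick counter backs two timers, one measuring how
    // long the touch has continued (on-touch) and one measuring how long the
    // touch has been released (off-touch).
    class ClockCounter(val tickMs: Long = 10) {
        var ticks = 0L
            private set
        fun tick() { ticks++ } // driven by the periodic clock signal
    }

    class TouchTimers(private val clock: ClockCounter) {
        private var onStart = -1L
        private var offStart = -1L

        fun onTouchDown() { onStart = clock.ticks; offStart = -1 }
        fun onTouchUp() { offStart = clock.ticks; onStart = -1 }

        fun touchHeldMs() =
            if (onStart < 0) 0L else (clock.ticks - onStart) * clock.tickMs
        fun touchReleasedMs() =
            if (offStart < 0) 0L else (clock.ticks - offStart) * clock.tickMs
    }

    fun main() {
        val clock = ClockCounter()
        val timers = TouchTimers(clock)
        timers.onTouchDown()
        repeat(60) { clock.tick() }       // 600 ms of ticks while pressed
        println(timers.touchHeldMs())     // 600
        timers.onTouchUp()
        repeat(30) { clock.tick() }       // 300 ms of ticks after release
        println(timers.touchReleasedMs()) // 300
    }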
24. The smart device of claim 17,
wherein the touch event processing unit comprises:
a touch event notification module for delivering touch events received from the touch event input unit;
a push-touch event processing module for receiving, from the touch event notification module, a push-touch event generated by an operation of the touch event trigger pressing the touch panel;
a movement touch event processing module for receiving, from the touch event notification module, a movement touch event generated by an operation of the touch event trigger moving after pressing the touch panel;
a touch end event processing module for receiving, from the touch event notification module, a touch end event generated when the touch event trigger is removed from the touch panel;
a target execution touch event processing module for receiving, from the touch event notification module, a target execution touch event generated by a tapping or pressing operation of the touch event trigger;
an on-touch timer control module for driving or stopping the on-touch timer at the request of each of the push-touch event processing module, the movement touch event processing module, the touch end event processing module, and the target execution touch event processing module;
an off-touch timer control module for driving or stopping the off-touch timer at the request of each of those same modules;
a screen control module for implementing the user interface at the request of the movement touch event processing module; and
a target instruction execution module for implementing a function or an action corresponding to a target based on target information received from the target execution touch event processing module.
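Claim 24's structure, in which a notification module fans events out to per-kind processing modules, resembles a simple dispatcher; a hedged sketch (the interface and method names are invented, not the patent's):

    // Hypothetical sketch: the notification module forwards each touch event to
    // the processing module registered for its kind.
    interface TouchEventModule {
        fun handle(payload: String)
    }

    class TouchEventNotifier {
        private val modules = mutableMapOf<String, TouchEventModule>()
        fun register(kind: String, module: TouchEventModule) { modules[kind] = module }
        fun notify(kind: String, payload: String) {
            modules[kind]?.handle(payload)
        }
    }

    fun main() {
        val notifier = TouchEventNotifier()
        notifier.register("push", object : TouchEventModule {
            override fun handle(payload: String) = println("push-touch: $payload")
        })
        notifier.register("move", object : TouchEventModule {
            override fun handle(payload: String) = println("movement touch: $payload")
        })
        notifier.notify("push", "x=120,y=800")
        notifier.notify("move", "dx=-40,dy=35")
    }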
25. The smart device of claim 24,
wherein the push-touch event processing module stops the operation of the off-touch timer and, when the screen-movable state is the non-movable state, sets the operation start time of the on-touch timer to the time at which the push-touch event occurred, starts the on-touch timer, measures the elapsed operation time of the on-touch timer while the push-touch event continues, determines that the user interface is to be implemented if that time exceeds a threshold time, and then stops driving the on-touch timer.
26. The smart device of claim 24,
wherein the movement touch event processing module analyzes the movement touch event to obtain movement touch coordinates and, if the screen-movable state is the movable state, implements the user interface according to the movement touch coordinates.
27. The smart device of claim 24,
wherein the touch end event processing module sets the time at which the touch end event occurs as the driving start time of the off-touch timer, drives the off-touch timer, measures the elapsed operation time of the off-touch timer while the touch remains terminated, and, when that time reaches the off-timer operation limit time, stops the operation of the on-touch timer and the off-touch timer, and
wherein the screen control module restores the desktop to its original state at the request of the touch end event processing module.
28. The smart device of claim 24,
wherein the target execution touch event processing module transmits target information to the target instruction execution module and, when the screen-movable state is the non-movable state, stops the operation of the on-touch timer and the off-touch timer and restores the desktop.
KR1020140059932A 2013-05-20 2014-05-19 Method for providing user interface and smart device thereof KR20140136393A (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR20130056746 2013-05-20
KR1020130056746 2013-05-20

Publications (1)

Publication Number Publication Date
KR20140136393A (en) 2014-11-28

Family

ID=52456667

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020140059932A KR20140136393A (en) 2013-05-20 2014-05-19 Method for providing user interface and smart device thereof

Country Status (1)

Country Link
KR (1) KR20140136393A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20170050910A (en) * 2015-11-02 2017-05-11 에스케이텔레콤 주식회사 Apparatus and method for controlling touch in a touch screen

Similar Documents

Publication Publication Date Title
TW212236B (en)
US8907907B2 (en) Display device with touch panel, event switching control method, and computer-readable storage medium
RU2552637C2 (en) Device, system and method of remote control
EP3413163A1 (en) Method for processing data collected by touch panel, and terminal device
JP5479414B2 (en) Information processing apparatus and control method thereof
CN103513865A (en) Touch control equipment and method and device for controlling touch control equipment to configure operation mode
US20140002393A1 (en) Controlling a cursor on a touch screen
KR101156610B1 (en) Method for input controlling by using touch type, and computer-readable recording medium with controlling program using touch type
WO1996024095A1 (en) Method and an apparatus for simulating the states of a mechanical button on a touch-sensitive input device
CN103513882A (en) Touch control equipment control method and device and touch control equipment
CN104536643B (en) A kind of icon drag method and terminal
US9274702B2 (en) Drawing device, drawing control method, and drawing control program for drawing graphics in accordance with input through input device that allows for input at multiple points
CN103513817A (en) Touch control equipment and method and device for controlling touch control equipment to configure operation mode
US9846529B2 (en) Method for processing information and electronic device
CN104076972A (en) A device and a method for selecting a touch screen hot spot
CN105808129B (en) Method and device for quickly starting software function by using gesture
CN104951213A (en) Method for preventing false triggering of edge sliding gesture and gesture triggering method
CN103513886A (en) Touch control device and target object moving method and device of touch control device
JP4653297B2 (en) Control device, electronic device, and medium
WO2018046000A1 (en) Touch operation method and device
JP2012113645A (en) Electronic apparatus
KR20140136393A (en) Method for providing user interface and smart device thereof
CN105808080B (en) Method and device for quickly copying object by utilizing gesture
WO2016206438A1 (en) Touch screen control method and device and mobile terminal
CN107506132B (en) Display device, display method of display device, and storage medium

Legal Events

Date Code Title Description
A201 Request for examination
E902 Notification of reason for refusal
E902 Notification of reason for refusal
E601 Decision to refuse application