KR20140136393A - Method for providing user interface and smart device thereof - Google Patents
Method for providing user interface and smart device thereof
- Publication number
- KR20140136393A (application KR1020140059932A)
- Authority
- KR
- South Korea
- Prior art keywords
- touch
- targets
- touch event
- user interface
- finger
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
Description
The present invention relates to a method for providing a user interface and a smart device for implementing the same.
Smart devices such as smartphones, tablet notebooks, and tablet PCs now commonly have large screens of 5 inches or more. Such a large-screen smart device has the advantage of displaying a lot of information because of its large screen size, but it is difficult to use with one hand. That is, to select a target placed on the screen beyond the reach of the hand holding the smart device, the user must inconveniently use the other hand.
SUMMARY OF THE INVENTION The present invention provides a method for providing a user interface that allows a user to select, with a single hand, a target to be executed on a large-screen smart device, and a smart device implementing the same.
According to one aspect of the present invention, there is provided a method for providing a user interface, the method comprising: outputting, by a smart device, a desktop screen on which a plurality of targets are aligned, the plurality of targets including first targets arranged at positions where a finger of the hand holding the smart device does not touch and second targets arranged at positions where the finger touches; sensing a touch operation in which an arbitrary point on the desktop is touched and pulled with the finger and the touch is then removed; and providing a user interface that allows the smart device to move one or more targets of the first targets so that the first targets can be selected.
Wherein providing the user interface comprises:
And moving the desktop screen on which the first targets are arranged to a position where the finger touches, in correspondence with the movement direction of the finger.
Wherein providing the user interface comprises:
And moving the second targets in the same direction as the first targets corresponding to the moving direction of the touch of the finger.
Wherein providing the user interface comprises:
One or more targets of the second targets may not be displayed on the screen corresponding to the moving direction of the touch of the finger.
Wherein the step of moving to a position where the finger touches comprises:
Switching to a screen movable state when the touch operation is detected, displaying a background screen image in the screen movable state, and moving a desktop screen on which the first targets are arranged, separately from the background screen image, to the position where the finger touches in correspondence with the moving direction of the finger's touch.
The background screen image may include:
When a predetermined time elapses, the background screen image can be converted to another image, or it can be converted to another image when the screen is shifted to the movable state.
Wherein providing the user interface comprises:
Aligning the first targets by moving them to a position where the finger touches them, and aligning one or more targets of the second targets in the remaining area of the desktop.
Wherein the sensing comprises:
A step of generating a push-touch event by an operation of pressing a desktop, and
Determining whether the pressing operation is continued for a predetermined time,
Wherein providing the user interface comprises:
If the pressing operation is continued for a predetermined time, the user interface can be provided.
Wherein the sensing comprises:
A step of generating a push-touch event by an operation of pressing the desktop, a step of generating a moving touch event by an operation of moving on the desktop while the push-touch is maintained, a step of providing the user interface according to the moving touch event, a step of generating a touch end event by terminating the touch on the desktop, and a step of restoring the desktop to the original screen according to the touch end event.
Wherein,
Determining whether the touch termination state continues for a predetermined time after the touch termination event is transmitted, and restoring the original screen when the predetermined period of time has elapsed.
Wherein the providing of the user interface in response to the moving touch event comprises:
Calculating moving coordinates by interpreting the moving touch event, performing desktop switching according to the moving coordinates if the difference value between the starting point and the ending point of the X coordinate of the moving coordinates is larger than a predetermined positive threshold value or smaller than a predetermined negative threshold value, and providing the user interface if the difference value is greater than the negative threshold value and less than the positive threshold value.
The positive threshold value and the negative threshold value,
May be based on a coordinate value that is an intermediate value of the entire width of the desktop.
Wherein the providing of the user interface in response to the moving touch event comprises:
Calculating moving coordinates by interpreting the moving touch event, performing desktop switching according to the moving coordinates when it is determined that the slope according to the moving coordinates is close to 0, and providing the user interface otherwise.
Wherein the step of providing the user interface in response to the moving touch event comprises: when a left-down moving touch event occurs while holding the smart device with the left hand, moving the targets that the finger does not touch, arranged on the upper right of the desktop, to a position where the finger touches; when a left-up moving touch event occurs while holding the smart device with the left hand, moving the targets that the finger does not touch, arranged on the lower right of the desktop, to a position where the finger touches; when a right-down moving touch event occurs while holding the smart device with the right hand, moving the targets that the finger does not touch, arranged on the upper left of the desktop, to a position where the finger touches; and when a right-up moving touch event occurs while holding the smart device with the right hand, moving the targets that the finger does not touch, arranged on the lower left of the desktop, to a position where the finger touches.
Wherein the second targets include a predefined specific icon,
Wherein providing the user interface comprises:
When the specific icon is touched, the screen mode is changed to a screen movement mode in which the desktop screen on which the first targets are aligned is moved to the position touched by the finger in correspondence with the moving direction of the touch of the finger, or the first targets can be moved and aligned to the touch position and one or more targets of the second targets can be aligned in the remaining area of the desktop.
After providing the user interface,
After the screen movement mode is performed, the desktop can be restored as it is when the predetermined time elapses.
According to another aspect of the present invention, a smart device includes: a touch screen that outputs a desktop screen on which a plurality of targets are aligned, the plurality of targets including first targets arranged at positions where the finger does not touch and second targets arranged at positions where the finger touches; a touch event input unit that generates a touch event according to the touch operation when an operation of touching and pulling an arbitrary point on the desktop with the finger of the hand holding the device and then removing the touch is detected; and a touch event processing unit that provides a user interface for moving one or more targets among the first targets so that the first targets can be selected.
The touch event processing unit,
It is possible to move the desktop screen on which the first targets are aligned to the position where the finger touches, corresponding to the moving direction of the finger.
The touch event processing unit,
The second targets may be moved in the same direction as the first targets corresponding to the moving direction of the touch of the finger.
The touch event processing unit,
And one or more targets of the second targets may not be displayed on the screen corresponding to the moving direction of the touch of the finger.
The touch event processing unit,
Align and move the first targets to a position where the finger touches them and align one or more targets of the second targets to the remaining area of the desktop.
The smart device includes a clock signal generating unit that generates a periodic clock signal, a clock number recording unit that counts the clock signal output from the clock signal generating unit to record the number of clocks, and an elapsed time measuring unit that measures the touch-related elapsed time using the recorded number of clocks and provides the measured elapsed time to the touch event processing unit.
Wherein the elapsed time measuring unit comprises:
An on-touch timer for measuring the touch continuation elapsed time using the number of clocks recorded in the clock number recording unit from the moment the touch event triggerer touches the touch panel, and an off-touch timer for measuring the touch removal elapsed time using the number of clocks recorded in the clock number recording unit from the moment the touch is removed.
The touch event processing unit,
A touch event notification module that receives the touch event from the touch event input unit; a push-touch event processing module that receives from the touch event notification module a push-touch event generated by an operation of the touch event triggerer pressing the touch panel; a moving touch event processing module that receives from the touch event notification module a moving touch event generated by a movement of the touch event triggerer after pressing the touch panel; a touch end event processing module that receives a touch end event from the touch event notification module; a target execution touch event processing module that receives from the touch event notification module a target execution touch event generated by an operation of the touch event triggerer selecting a target; an on-touch timer control module that stops or drives the on-touch timer according to a request of each of the push-touch event processing module, the moving touch event processing module, the touch end event processing module, and the target execution touch event processing module; an off-touch timer control module that stops or drives the off-touch timer according to a request of each of these modules; a screen control module that implements the user interface according to a request of the moving touch event processing module; and a target instruction execution module that implements the function or operation to be performed by the selected target.
Wherein the push-touch event processing module comprises:
The push-touch event processing module requests the off-touch timer control module to stop the driving of the off-touch timer; when the screen movable state is the non-movable state, it sets the operation start time of the on-touch timer to the time at which the push-touch event occurred and drives the on-touch timer; it measures the operation elapsed time of the on-touch timer during the occurrence of the push-touch event; and if the elapsed time exceeds a threshold time, it determines to implement the user interface, and the driving of the on-touch timer can be stopped.
The mobile touch event processing module includes:
The moving touch event processing module may analyze the moving touch event to obtain moving touch coordinates, and if the screen movable state is the movable state, the user interface may be implemented according to the moving touch coordinates.
The touch end event processing module includes:
The touch end event processing module sets the time at which the touch end event occurs as the off-touch timer driving start time, drives the off-touch timer, and measures the elapsed time of the off-touch timer during the touch termination; when the elapsed time exceeds the off-timer operation limit time, it stops the driving of the on-touch timer and the off-touch timer,
Wherein the screen control module comprises:
The desktop can be restored according to a request of the touch end event processing module.
Wherein the target execution touch event processing module comprises:
The target execution touch event processing module transfers the target information to the target instruction execution module; then, if the screen movable state is the non-movable state, the driving of the on-touch timer and the off-touch timer is stopped, and the desktop can be restored to its original state.
According to the embodiments of the present invention, the conventional inconvenience of having to use both hands to select a target to be executed on a large-screen smart device can be eliminated.
FIG. 1 illustrates a desktop and a target of a smart device to which an embodiment of the present invention is applied.
FIG. 2 shows the overall configuration of a smart device according to an embodiment of the present invention.
FIG. 3 is a block diagram illustrating a detailed configuration of the touch event input unit of FIG. 2.
FIG. 4 is a block diagram showing a detailed configuration of the touch event processing unit of FIG. 2.
FIGS. 5 to 8 illustrate an operation of performing a screen movement mode according to an embodiment of the present invention.
FIG. 9 illustrates an operation of performing a target alignment mode when the smart device is held in the left hand according to one embodiment of the present invention.
FIG. 10 shows an operation of performing a target alignment mode when the smart device is held by the right hand according to another embodiment of the present invention.
FIG. 11 shows a coordinate system of a desktop according to an embodiment of the present invention.
FIG. 12 shows orbit coordinates according to an embodiment of the present invention.
FIG. 13 is a diagram showing movement of a target according to an orbit on which a finger moves according to an embodiment of the present invention.
FIG. 14 shows the kinds of orbits on which the finger moves according to an embodiment of the present invention.
FIG. 15 shows a series of processes of a push-touch event processing process according to an embodiment of the present invention.
FIG. 16 shows a series of processes of a moving touch event processing process according to an embodiment of the present invention.
FIG. 17 shows a series of processes of a touch end event processing process according to an embodiment of the present invention.
FIG. 18 shows a series of processes of a target instruction execution event processing process according to an embodiment of the present invention.
FIG. 19 illustrates an operation of performing a screen movement mode according to another embodiment of the present invention.
FIG. 20 illustrates a method of performing a screen movement mode according to another embodiment of the present invention.
Hereinafter, embodiments of the present invention will be described in detail with reference to the accompanying drawings so that those skilled in the art can easily carry out the present invention. The present invention may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein. In order to clearly illustrate the present invention, parts not related to the description are omitted, and similar parts are denoted by like reference characters throughout the specification.
Throughout the specification, when an element is referred to as "comprising" another element, it means that it can further include other elements, not exclude them, unless specifically stated otherwise.
Also, the terms "part" and "module" in the description mean units for processing at least one function or operation, which may be implemented by hardware, software, or a combination of hardware and software.
Hereinafter, a method for providing a user interface according to an embodiment of the present invention and a smart device implementing the same will be described in detail with reference to the drawings.
Here, touching or pressing a finger in this specification includes not only directly touching the smart device but also placing the finger in a position close to the smart device.
FIG. 1 illustrates a desktop and a target of a smart device to which an embodiment of the present invention is applied.
Referring to FIG. 1, a plurality of targets are arranged on a desktop P1 of the smart device.
Here, when the user holds the smart device with one hand, first targets P3 are arranged at positions that the finger of that hand does not touch, and second targets P5 are arranged at positions that the finger touches.
The first target P3 and the second target P5 may be icons, buttons, or the like.
At this time, the case where the smart device is held with one hand is described as an example.
The desktop P1 refers to the first (home) screen on which app icons, widgets, and the like appear when the smart device is turned on or unlocked.
FIG. 2 shows the overall configuration of a smart device according to an embodiment of the present invention, FIG. 3 is a block diagram illustrating a detailed configuration of the touch event input unit of FIG. 2, and FIG. 4 is a block diagram showing a detailed configuration of the touch event processing unit of FIG. 2.
Referring to FIG. 2, the smart device includes a touch screen, a touch event input unit, a touch event processing unit, and an elapsed time measuring unit.
The touch screen outputs the desktop screen on which the plurality of targets are aligned.
Referring to FIG. 3, the touch event input unit includes a touch panel and generates a touch event according to the operation of the touch event triggerer on the touch panel.
Referring again to FIG. 2, the touch event input unit transfers the generated touch event to the touch event processing unit.
When the touch event triggerer generates an event at a predetermined position on the touch screen, a corresponding touch event is generated.
At this time, the touch event processing unit provides the user interface according to the received touch event.
According to one embodiment, in conjunction with the touch event and the operation time of the on-touch timer, the touch event processing unit switches the screen movement enable state to the movable state and provides the user interface.
Specifically, the desktop screen on which the first target P3 is aligned is moved to a position where the finger touches the screen in accordance with the moving direction of the finger. The second target P5 is moved in the same direction as the first target P3 in accordance with the moving direction of the touch of the finger. At this time, one or more targets of the second target P5 are not displayed on the screen corresponding to the moving direction of the touch of the finger.
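The movement behavior described above can be illustrated with a small sketch. This is not the patented implementation, only a hedged illustration: all function and variable names, the tuple representation of targets, and the 600x800 desktop size (taken from the coordinate system of FIG. 11) are assumptions.

```python
# Illustrative sketch: out-of-reach first targets are shifted toward the
# finger, and second targets moved off the desktop are no longer displayed.
# The 600x800 bounds follow the coordinate system described later (FIG. 11);
# everything else is an assumption for illustration.

DESKTOP_W, DESKTOP_H = 600, 800

def move_targets(first_targets, second_targets, dx, dy):
    """Shift every target by the finger's movement (dx, dy).

    Returns the moved first targets, and only those second targets that
    remain visible within the desktop bounds.
    """
    moved_first = [(x + dx, y + dy) for x, y in first_targets]
    moved_second = [(x + dx, y + dy) for x, y in second_targets]
    visible_second = [
        (x, y) for x, y in moved_second
        if 0 <= x < DESKTOP_W and 0 <= y < DESKTOP_H
    ]
    return moved_first, visible_second
```

For example, pulling down-left by (-200, +300) brings a top-right first target closer to the thumb while a bottom second target leaves the screen and is hidden.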
Here, when a touch operation for pulling or pushing the whole desktop P1 is detected, the smart device switches to the screen movable state and displays a background screen image.
At this time, the background image may be converted into another image when a predetermined time elapses, or may be converted to another image when the screen is shifted to a movable state.
Also, the background image may be a corporate logo image of a smart device manufacturer or an image selected by a user.
According to another embodiment of the present invention, when the screen movement enable state is set to the movable state in conjunction with the touch event and the operation time of the on-touch timer, the first targets P3 are moved and aligned to the position where the finger touches, and one or more targets of the second targets P5 are aligned in the remaining area of the desktop.
Here, the touch event processing unit is described in more detail below.
Referring to FIG. 4, the touch event processing unit includes a touch event notification module, a push-touch event processing module, a moving touch event processing module, a touch end event processing module, a target execution touch event processing module, an on-touch timer control module, an off-touch timer control module, a screen control module, and a target instruction execution module.
The touch event notification module receives the touch event from the touch event input unit and notifies the corresponding processing module.
At this time, the touch event is classified into a push-touch event, a moving touch event, a touch end event, and a target execution touch event, and the touch event processing module is subdivided accordingly into the push-touch event processing module, the moving touch event processing module, the touch end event processing module, and the target execution touch event processing module.
The push-touch event processing module receives from the touch event notification module a push-touch event generated by an operation of the touch event triggerer pressing the touch panel.
The moving touch event processing module receives from the touch event notification module a moving touch event generated by the touch event triggerer moving on the touch panel after pressing it.
The touch end event processing module receives from the touch event notification module a touch end event generated when the touch is terminated.
The target execution touch event processing module receives from the touch event notification module a target execution touch event generated by an operation of the touch event triggerer selecting a target.
The on-touch timer control module stops or drives the on-touch timer according to a request of each of the push-touch event processing module, the moving touch event processing module, the touch end event processing module, and the target execution touch event processing module.
The off-touch timer control module likewise stops or drives the off-touch timer according to a request of each of these modules.
The screen control module implements the user interface according to a request of the moving touch event processing module.
When target information is received from the target execution touch event processing module, the target instruction execution module implements the function or operation to be performed by the target.
Referring again to FIG. 2, the elapsed time measuring unit measures the touch-related elapsed time and provides it to the touch event processing unit; it includes an on-touch timer and an off-touch timer.
The on-touch timer measures the touch continuation elapsed time using the number of clocks recorded in the clock number recording unit from the moment the touch event triggerer touches the touch panel.
The off-touch timer measures the touch removal elapsed time using the number of clocks recorded in the clock number recording unit from the moment the touch is removed.
The clock number recording unit counts the clock signal output from the clock signal generating unit and records the number of clocks.
The clock signal generating unit generates a periodic clock signal.
On the other hand, a user interface that facilitates one-handed selection of the first targets P3, which the finger does not touch, will now be described with reference to the drawings.
FIGS. 5 to 8 illustrate an operation of performing a screen movement mode according to an embodiment of the present invention, which corresponds to an embodiment of moving the screen.
Referring to FIG. 5, the first targets P3 that the finger does not touch are shown dotted and colored on the desktop P1, that is, the first screen (Display 1_A_1).
Then, the touch event triggerer touches an arbitrary point on the desktop P1 with the finger and pulls it toward the hand holding the smart device.
At this time, if the pulling operation is repeated within a predetermined time, the dotted and colored targets P3 can be pulled closer to the left hand. That is, the targets can be pulled repeatedly within the predetermined time, not just once. Then, as shown in FIG. 8, if the desired target A_1_4 is selected and executed within a predetermined time while the dotted and colored targets P3 are moved, that is, within a predetermined time after the targets are moved, the function or operation corresponding to the selected target is executed.
If a specific target P3 is not selected within a predetermined time after the dotted and colored targets P3 are moved, the screen is restored to the screen shown in FIG. 5, that is, the first screen (Display 1_A_1).
Next, FIGS. 9 and 10 illustrate an operation of performing a target alignment mode according to an embodiment of the present invention, which corresponds to an embodiment in which a screen is moved and a target is moved.
FIG. 9 shows an operation of performing the target alignment mode when the smart device is held by the left hand according to an embodiment of the present invention, and corresponds to a case where the targets are moved to the left while the device is held with the left hand.
Referring to FIG. 9 (a), when the screen movable state occurs at the lower left of the screen, the dotted and colored targets P3 are moved on the desktop P1 as shown in FIG. 9 (b). That is, the targets A_1_1, A_1_2, A_1_3, and A_1_4 located at the top of the desktop P1 and the targets A_2_4, A_3_4, A_4_4, and A_5_4 located in a region relatively far from the finger are moved and aligned to the position where the finger touches.
FIG. 10 illustrates the operation of performing the target alignment mode when the smart device is held by the right hand according to another embodiment of the present invention, and corresponds to a case where the target is moved to the right with the right finger.
Referring to FIG. 10 (a), when the screen movable state occurs at the lower right portion of the screen, the dotted and colored targets P3 are moved on the screen of the smart device and aligned to the position where the finger touches.
As in FIG. 8, if the target A_1_4 is selected and executed within a predetermined time after the movement of the dotted and colored targets P3, that is, if a specific target is executed within a predetermined time after the targets are moved, the function or operation corresponding to the target is executed. Also, if no target P3 is selected within a predetermined time after the dotted and colored targets P3 are moved and the touch ends, the targets P3 are returned to their original positions.
FIG. 11 shows a coordinate system of a desktop according to an embodiment of the present invention, FIG. 12 shows orbit coordinates according to an embodiment of the present invention, FIG. 13 shows the movement of a target according to the orbit on which the finger moves, and FIG. 14 shows the kinds of orbits on which the finger moves according to an embodiment of the present invention.
Referring to FIG. 11, the desktop P1 has a coordinate system. It is assumed that the upper left terminal point of the desktop P1 is (0, 0) as the start point and the lower right terminal point of the desktop P1 is (600, 800) as the end point.
Referring to FIG. 12, orbit coordinates are mapped and recorded in order along the orbit on which the finger moves.
Specifically, the coordinate movement between successive orbit coordinates determines the direction and distance by which the desktop or the targets are moved.
Here, as shown in FIG. 14, the kinds of trajectories along which the finger moves may vary.
FIG. 14 (a) shows a trajectory in which the finger moves to the lower left. FIG. 14 (b) shows a trajectory in which the finger moves to the upper left. FIG. 14 (c) shows a trajectory in which the finger moves horizontally to the left. FIG. 14 (d) shows a trajectory in which the finger moves to the lower right. FIG. 14 (e) shows a trajectory in which the finger moves to the upper right. FIG. 14 (f) shows a trajectory in which the finger moves horizontally to the right.
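The six trajectory kinds of FIG. 14 can be sketched as a simple classification from a drag's start and end points. This is an illustrative assumption, not the patent's implementation: the function name, the string labels, and the tolerance used to call a drag "horizontal" are all invented for the example.

```python
# Hedged sketch: classify a drag into one of the six FIG. 14 trajectory
# kinds from its start and end coordinates. The flat_tolerance threshold
# for "horizontal" drags is an assumption.

def classify_trajectory(start, end, flat_tolerance=20):
    """Return one of: left-down, left-up, horizontal-left,
    right-down, right-up, horizontal-right."""
    dx = end[0] - start[0]
    dy = end[1] - start[1]  # screen y grows downward (FIG. 11 coordinates)
    side = "left" if dx < 0 else "right"
    if abs(dy) <= flat_tolerance:
        return f"horizontal-{side}"
    vertical = "down" if dy > 0 else "up"
    return f"{side}-{vertical}"
```

For instance, a drag from (300, 400) to (100, 600) moves left and down the screen, so it would classify as the FIG. 14 (a) trajectory.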
If the touch event occurs, the touch event processing unit must distinguish whether the user intends to switch the desktop P1 or to move the targets P3.
The method of distinguishing this is as follows. According to the embodiment of the present invention, the touch event processing unit uses the difference between the starting point and the ending point of the x coordinate of the trajectory on which the finger moves.
When this difference is larger than a positive moving boundary value or smaller than a negative moving boundary value, desktop switching is performed.
The touch event processing unit then switches the desktop P1 according to the moving coordinates.
After the desktop switching, the targets P3 of the corresponding desktop P1 are displayed at their initial positions, as in the screen shown in FIG. 5, that is, the first screen (Display 1_A_1).
On the other hand, when the difference between the starting point of the x coordinate and the ending point of the x coordinate of the finger is larger than the negative moving boundary value and smaller than the positive moving boundary value, the touch event processing unit provides the user interface, that is, moves the desktop P1 or moves the targets P3.
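The boundary-value decision above can be sketched in a few lines. As a hedged assumption, the boundaries here are set to half the 600-pixel desktop width from FIG. 11 (the description mentions an intermediate value of the entire width); the constant and function names are illustrative only.

```python
# Minimal sketch of the decision: large x-axis travel switches desktops,
# small travel provides the one-hand user interface. Boundary values are
# assumed to be +/- half the desktop width.

DESKTOP_WIDTH = 600
POS_BOUNDARY = DESKTOP_WIDTH // 2    # +300
NEG_BOUNDARY = -(DESKTOP_WIDTH // 2)  # -300

def interpret_drag(start_x, end_x):
    """Decide between desktop switching and the one-hand UI."""
    diff = end_x - start_x
    if diff > POS_BOUNDARY or diff < NEG_BOUNDARY:
        return "switch-desktop"
    return "provide-user-interface"
```

A long swipe from x=500 to x=100 (difference -400) would switch desktops, while a short pull from x=400 to x=250 would trigger the target-pulling interface.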
If the slope according to the moving coordinates is determined to be close to 0, the touch event processing unit may instead perform desktop switching; otherwise, it provides the user interface.
Now, a method of providing a user interface according to an embodiment of the present invention will be described.
FIG. 15 shows a series of processes of a push-touch event processing process according to an embodiment of the present invention.
Referring to FIG. 15, when the touch event triggerer presses the touch panel, a push-touch event is generated.
The touch event input unit transfers the generated push-touch event to the touch event notification module.
The touch event notification module notifies the push-touch event processing module of the push-touch event.
Also, the push-touch event processing module requests the off-touch timer control module to stop the driving of the off-touch timer.
On the other hand, the push-touch event processing module determines whether the screen movable state (isMovable) is the movable state.
At this time, if it is in the movable state, the step ends. On the other hand, if it is determined that it is not in the movable state, the time at which the push-touch event occurs is set to the operation start time (startOnTouchTime) of the on-touch timer, and the on-touch timer is driven.
The on-touch timer control module drives the on-touch timer according to the request.
The clock number recording unit counts the clock signal output from the clock signal generating unit and records the number of clocks.
The on-touch timer measures the operation elapsed time using the recorded number of clocks.
Then, the push-touch event processing module receives the measured elapsed time from the elapsed time measuring unit.
The push-touch event processing module determines whether the measured elapsed time exceeds a threshold time.
At this time, if it is not exceeded, step S137 is performed. That is, the screen movable state (isMovable) is kept in the unmovable state.
On the other hand, if it is exceeded, the push-touch event processing module switches the screen movable state (isMovable) to the movable state and stops the driving of the on-touch timer.
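The press-and-hold check of FIG. 15 can be sketched as a small state holder: the on-touch timer measures how long the push-touch has lasted, and the screen becomes movable only once a threshold is exceeded. The names `is_movable` and `start_on_touch_time` follow `isMovable`/`startOnTouchTime` from the description; the class shape, the 1-second threshold, and the use of plain timestamps instead of a clock counter are assumptions for illustration.

```python
# Hedged sketch of the push-touch / on-touch-timer flow. Timestamps are
# plain floats (seconds) rather than the patent's clock-count mechanism.

THRESHOLD_SEC = 1.0  # assumed threshold time

class PushTouchHandler:
    def __init__(self):
        self.is_movable = False          # screen movable state (isMovable)
        self.start_on_touch_time = None  # startOnTouchTime

    def on_push_touch(self, now):
        """A push-touch event occurred: start the on-touch timer."""
        if not self.is_movable and self.start_on_touch_time is None:
            self.start_on_touch_time = now

    def on_timer_tick(self, now):
        """Compare the on-touch elapsed time with the threshold."""
        if self.start_on_touch_time is None:
            return
        if now - self.start_on_touch_time > THRESHOLD_SEC:
            self.is_movable = True  # switch to the screen movable state
```

A press held for 0.5 s leaves the screen unmovable; once the press exceeds the assumed 1-second threshold, the movable state is set.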
FIG. 16 shows a series of processes of a moving touch event processing process according to an embodiment of the present invention.
Referring to FIG. 16, when the touch event triggerer performs an operation of moving on the touch panel while pressing it, a moving touch event is generated.
The touch event input unit transfers the moving touch event to the touch event notification module, and the touch event notification module notifies the moving touch event processing module.
The moving touch event processing module interprets the moving touch event and obtains the moving touch coordinates (step S207).
At this time, the moving touch event processing module determines whether the screen movable state (isMovable) is the movable state.
If it is not in the movable state, the step ends.
If it is in a movable state, a movement distance (leftMargin) on the x coordinate and a movement distance (topMargin) on the y coordinate obtained in step S207 are calculated (S211).
Here, the moving distance of the moving touch event can be obtained from the difference between the coordinates of the previous moving touch event and the coordinates of the current moving touch event. The movement distance (leftMargin) on the x coordinate is the difference between the x coordinate (downX2) of the current moving touch event and the x coordinate (downX1) of the previous moving touch event. The movement distance (topMargin) on the y coordinate is the difference between the y coordinate (downY2) of the current moving touch event and the y coordinate (downY1) of the previous moving touch event.
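The movement distances just described translate directly into code. The snake-case names mirror `leftMargin`, `topMargin`, and `downX1`/`downX2` from the description; the function itself and the tuple event representation are assumptions.

```python
# Sketch of the movement-distance calculation between two consecutive
# moving touch events, each represented here as an (x, y) tuple.

def movement_margins(prev_event, cur_event):
    """Return (left_margin, top_margin): x and y travel between events."""
    down_x1, down_y1 = prev_event
    down_x2, down_y2 = cur_event
    left_margin = down_x2 - down_x1  # movement distance on the x coordinate
    top_margin = down_y2 - down_y1   # movement distance on the y coordinate
    return left_margin, top_margin
```

For example, moving from (100, 200) to (90, 260) yields a leftMargin of -10 and a topMargin of 60, i.e. a slight leftward and downward drag.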
The moving touch event processing module requests the screen control module to move the desktop screen or the targets according to the calculated movement distances.
FIG. 17 shows a series of processes of a touch end event processing process according to an embodiment of the present invention.
Referring to FIG. 17, when the touch event triggerer removes the finger from the touch panel, a touch end event is generated.
The touch event input unit transfers the touch end event to the touch event notification module.
Meanwhile, the touch event notification module notifies the touch end event processing module of the touch end event.
When the touch end event is received, the touch end event processing module sets the time at which the touch end event occurred as the off-touch timer driving start time and drives the off-touch timer.
The touch end event processing module measures the elapsed time of the off-touch timer during the touch termination.
The touch end event processing module determines whether the measured elapsed time exceeds the off-timer operation limit time.
If so, the touch end event processing module requests the screen control module to restore the desktop to its original state.
The touch end event processing module also stops the driving of the on-touch timer and the off-touch timer.
Next, the touch end event processing module ends the processing of the touch end event.
However, if the screen movable state (isMovable) is impossible, the touch end event processing module stops the driving of the on-touch timer and the off-touch timer without requesting the restoration.
FIG. 18 shows a series of processes of a target instruction execution event processing process according to an embodiment of the present invention.
Referring to FIG. 18, when the touch event triggerer performs an operation of selecting a target, a target execution touch event is generated.
The touch event input unit transfers the target execution touch event through the touch event notification module to the target execution touch event processing module.
The target execution touch event processing module transfers the target information to the target instruction execution module, and the target instruction execution module implements the function or operation corresponding to the target.
On the other hand, if the screen movable state is impossible, the target execution touch event processing module stops the driving of the on-touch timer and the off-touch timer and restores the desktop to its original state.
According to the above description, when the finger touches the screen, the on-touch timer is driven, and when the finger is removed, the off-touch timer is driven, so that the user interface is provided or the desktop is restored according to the measured elapsed time.
On the other hand, the screen movement mode switching method may be implemented in the following embodiments.
FIG. 19 illustrates an operation of performing a screen movement mode according to another embodiment of the present invention, and FIG. 20 illustrates a method of performing a screen movement mode according to another embodiment of the present invention.
Referring to FIG. 19, an icon P7 for switching to the screen movement mode is always displayed at a position on the desktop screen that the finger can touch. At this time, if the user touches the icon P7 with a finger, the desktop can be switched to a screen movement mode in which the targets P3 that the finger does not touch can be pulled to the area of the finger. Then, the screen and the targets are restored to their original state after a predetermined time, according to the restoration procedure described above.
Referring to FIG. 20, when the touch event triggerer touches the icon P7 displayed on the desktop, a touch event is generated.
The touch event input unit transfers the generated touch event to the touch event processing unit.
The touch event processing unit determines whether the touched point corresponds to the icon P7.
If the icon P7 is touched, the screen or target movement mode is switched (S511).
The embodiments of the present invention described above are not implemented only by the apparatus and method, but may be implemented through a program for realizing the function corresponding to the configuration of the embodiment of the present invention or a recording medium on which the program is recorded.
While the present invention has been particularly shown and described with reference to exemplary embodiments thereof, it is to be understood that the invention is not limited to the disclosed exemplary embodiments; various modifications that can be made by those skilled in the art without departing from the gist of the invention belong to the scope of the invention.
Claims (28)
Sensing a touch operation in which, while the smart device is held with one hand, an arbitrary point on the desktop is touched and pulled with a finger of that hand and the touch is then removed;
Providing a user interface that allows the smart device to move one or more targets of the first targets to select the first targets
A method for providing a user interface.
Wherein providing the user interface comprises:
And moving a desktop screen on which the first targets are aligned in a direction corresponding to a moving direction of the finger.
Wherein providing the user interface comprises:
And moving the second targets in the same direction as the first targets corresponding to the moving direction of the touch of the finger.
Wherein providing the user interface comprises:
Wherein one or more targets of the second targets are not displayed on the screen in correspondence with the moving direction of the touch of the finger
Wherein the step of moving the finger to a position to which the finger touches comprises:
When the touch operation is detected, switching to a screen movable state,
Displaying the background image when the screen is transitionable, and
Moving a desktop screen in which the first targets are arranged separately from the background screen image to a position where the finger touches the screen in correspondence with a moving direction of the finger's touch;
A method for providing a user interface.
The background screen image may include:
And when the predetermined time elapses, the image is converted into another image or is converted into another image when the screen is shifted to the movable state.
Wherein providing the user interface comprises:
Aligning the first targets by moving them to a position where the finger touches them, and aligning one or more targets of the second targets in the remaining area of the desktop.
Wherein the sensing comprises:
Generating a push-touch event by an operation of pressing the desktop, and
determining whether the pressing operation is continued for a predetermined time,
Wherein providing the user interface comprises:
And if the pressing operation is continued for a predetermined time, providing the user interface.
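The long-press check above (a push-touch event that must persist for a predetermined time before the user interface is provided) can be sketched as follows. This is a minimal illustration, not the patent's implementation; the class name and the 500 ms threshold are assumptions, since the claim only says "a predetermined time".

```python
LONG_PRESS_MS = 500  # assumed threshold; the claim only says "predetermined time"

class PushTouchTracker:
    """Tracks a push-touch event and decides whether the press has been
    held long enough to provide the one-hand user interface."""

    def __init__(self, threshold_ms=LONG_PRESS_MS):
        self.threshold_ms = threshold_ms
        self.press_start = None  # timestamp of the push-touch event, in ms

    def on_push_touch(self, t_ms):
        # "generating a push-touch event by an operation of pressing the desktop"
        self.press_start = t_ms

    def should_provide_ui(self, t_ms):
        # "determining whether the pressing operation is continued
        #  for a predetermined time"
        if self.press_start is None:
            return False
        return (t_ms - self.press_start) >= self.threshold_ms
```

A press released before the threshold would simply never trigger the interface; only a sustained press does.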
Wherein the sensing comprises:
Generating a push-touch event by an operation of pressing the desktop,
generating a moving touch event by an operation of moving on the desktop while the push-touch is maintained,
providing the user interface in accordance with the moving touch event,
generating a touch end event when the touch on the desktop is terminated, and
restoring the desktop to the original screen according to the touch end event,
A method of providing a user interface comprising the above steps.
Wherein the restoring comprises:
Determining whether the touch-terminated state continues for a predetermined time after the touch end event is delivered, and
restoring the original screen when the predetermined time has elapsed,
A method of providing a user interface comprising the above steps.
Wherein the providing of the user interface in response to the moving touch event comprises:
interpreting the moving touch event to calculate movement coordinates,
performing a desktop transition according to the movement coordinates when the difference between the start-point and end-point x coordinates of the movement coordinates is larger than a predetermined positive threshold value or smaller than a predetermined negative threshold value, and
providing the user interface if the difference value is greater than the negative threshold and less than the positive threshold,
A method of providing a user interface comprising the above steps.
Wherein the positive threshold value and the negative threshold value are determined from an x-coordinate value corresponding to half of the entire width of the desktop.
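The x-coordinate decision described above, reading the thresholds as plus and minus half the desktop width, might look like the sketch below. The function name and return labels are illustrative assumptions, not from the patent.

```python
def classify_move(start_x, end_x, desktop_width):
    """Decide between a desktop (page) transition and the one-hand UI,
    per the claim: thresholds assumed to be +/- half the desktop width."""
    pos_threshold = desktop_width / 2
    neg_threshold = -desktop_width / 2
    diff = end_x - start_x
    if diff > pos_threshold or diff < neg_threshold:
        return "desktop_transition"   # large horizontal swipe: switch pages
    return "provide_user_interface"   # small movement: pull targets to the finger
```

On a 1000-px-wide desktop, a drag covering more than 500 px horizontally would switch pages; a shorter drag would instead pull the out-of-reach targets toward the finger.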
Wherein the providing of the user interface in response to the moving touch event comprises:
interpreting the moving touch event to calculate movement coordinates,
performing a desktop transition according to the movement coordinates when the slope according to the movement coordinates is determined to be close to 0, and
providing the user interface if the slope according to the movement coordinates is greater than or equal to a predetermined slope threshold,
A method of providing a user interface comprising the above steps.
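The slope-based variant of the same decision can be sketched as below. The numeric values for "close to 0" and the slope threshold are assumptions; the claim does not specify them, and it also leaves the region between the two bounds unspecified.

```python
NEAR_ZERO = 0.2        # assumed tolerance for "slope close to 0"
SLOPE_THRESHOLD = 1.0  # assumed "predetermined slope threshold"

def classify_by_slope(x0, y0, x1, y1,
                      near_zero=NEAR_ZERO, slope_threshold=SLOPE_THRESHOLD):
    dx = x1 - x0
    if dx == 0:
        return "provide_user_interface"  # vertical drag: steepest possible slope
    slope = abs((y1 - y0) / dx)
    if slope < near_zero:
        return "desktop_transition"      # nearly horizontal drag: page swipe
    if slope >= slope_threshold:
        return "provide_user_interface"  # steep drag: pull targets to the finger
    return "ignore"                      # region the claim leaves unspecified
```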
Wherein the providing of the user interface in response to the moving touch event comprises:
providing a user interface that moves targets arranged on the upper right side of the desktop, which the finger is not touching, to the position touched by the finger when a left-down movement event occurs while the smart device is held with the left hand,
providing a user interface that moves targets arranged on the lower right side of the desktop, which the finger is not touching, to the position touched by the finger when a left-up movement event occurs while the smart device is held with the left hand,
providing a user interface that moves targets arranged on the upper left side of the desktop, which the finger is not touching, to the position touched by the finger when a right-down movement event occurs while the smart device is held with the right hand, and
providing a user interface that moves targets arranged on the lower left side of the desktop, which the finger is not touching, to the position touched by the finger when a right-up movement event occurs while the smart device is held with the right hand,
A method of providing a user interface comprising the above steps.
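The four hand-and-direction cases above reduce to a small lookup: the diagonally opposite corner of the drag is the one whose out-of-reach targets get pulled to the finger. A minimal sketch, with illustrative key and value names:

```python
# (holding hand, drag direction) -> corner whose untouched targets are pulled
# toward the finger, per the four cases recited in the claim.
TARGET_CORNER = {
    ("left",  "down"): "upper_right",
    ("left",  "up"):   "lower_right",
    ("right", "down"): "upper_left",
    ("right", "up"):   "lower_left",
}

def corner_to_pull(hand, direction):
    """Return the corner region to move, or None for an unrecognized gesture."""
    return TARGET_CORNER.get((hand, direction))
```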
Wherein the second targets include a predefined specific icon,
Wherein providing the user interface comprises:
Wherein, when the specific icon is touched, the screen mode is changed to a screen movement mode in which the background screen on which the first targets are aligned is moved to the position touched by the finger in correspondence with the moving direction of the touch of the finger, and one or more targets of the second targets are aligned in the remaining area of the desktop.
After providing the user interface,
And restoring the desktop to its original state when a predetermined time elapses after the screen movement mode is performed.
A touch event processing unit for providing a user interface for selecting one of the first targets by moving one or more targets of the first targets according to the touch event;
The touch event processing unit,
And moves the desktop screen on which the first targets are arranged to a position where the finger touches, in correspondence with the movement direction of the finger.
The touch event processing unit,
And moves the second targets in the same direction as the first targets corresponding to the moving direction of the touch of the finger.
The touch event processing unit,
Wherein one or more targets of the second targets are not displayed on the screen in correspondence with a moving direction of the touch of the finger.
The touch event processing unit,
Aligning the first targets by moving them to a position where the finger is touched, and aligning one or more targets of the second targets in the remaining area of the desktop.
A clock signal oscillation unit for generating a periodic clock signal,
A clock number recording section for counting the clock signal output from the clock signal oscillation section and recording the number of clocks,
And an elapsed time measuring unit for measuring a touch related elapsed time using the number of clocks output by the clock number recording unit and providing the elapsed time to the touch event processing unit,
Wherein the elapsed time measuring unit comprises:
An on-touch timer for measuring the touch-continuation elapsed time using the number of clocks recorded in the clock number recording unit from the moment the touch event trigger touches the touch panel, and
an off-touch timer for measuring the touch-release elapsed time using the number of clocks recorded in the clock number recording unit from the moment the touch event trigger leaves the touch panel.
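The timer arrangement above (a clock oscillator, a clock number recording section that counts pulses, and timers that derive elapsed time from the count) can be sketched as follows. Class names and the 1 kHz clock frequency are illustrative assumptions; the claim leaves the frequency unspecified.

```python
CLOCK_HZ = 1000  # assumed oscillator frequency

class ClockCounter:
    """Counts periodic clock pulses, like the clock number recording section."""
    def __init__(self):
        self.count = 0

    def tick(self, n=1):
        self.count += n

class ElapsedTimeMeasurer:
    """Derives elapsed time from the clock count, as the on-/off-touch
    timers in the claim do: (clocks elapsed) / (clock frequency)."""
    def __init__(self, counter, clock_hz=CLOCK_HZ):
        self.counter = counter
        self.clock_hz = clock_hz
        self.start_count = None

    def start(self):
        # Record the count at the moment the touch starts (or ends).
        self.start_count = self.counter.count

    def elapsed_seconds(self):
        if self.start_count is None:
            return 0.0
        return (self.counter.count - self.start_count) / self.clock_hz
```

An on-touch timer and an off-touch timer would simply be two such measurers started at the touch-down and touch-release moments, respectively.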
The touch event processing unit,
A touch event notification module for notifying a touch event received from the touch input unit,
A push-touch event processing module for receiving, from the touch event notification module, a push-touch event generated by an operation of the touch event trigger pressing the touch panel,
a moving touch event processing module for receiving, from the touch event notification module, a moving touch event generated by an operation of the touch event trigger moving on the touch panel after pressing it,
A touch end event processing module for receiving a touch end event generated when the touch event trigger unit is removed from the touch panel, from the touch event notification module;
A target execution touch event processing module for receiving, from the touch event notification module, a target execution touch event generated by a touch or pressing operation of the touch event trigger on a target,
An on-touch timer control module for driving or stopping the on-touch timer according to a request from each of the push-touch event processing module, the moving touch event processing module, the touch end event processing module, and the target execution touch event processing module,
an off-touch timer control module for driving or stopping the off-touch timer according to a request from each of the push-touch event processing module, the moving touch event processing module, the touch end event processing module, and the target execution touch event processing module,
A screen control module for implementing the user interface according to a request of the mobile touch event processing module,
A target instruction execution module that implements a function or an action corresponding to a target based on target information received from the target execution touch event processing module,
Wherein the push-touch event processing module:
stops the operation of the off-touch timer; when the screen-movable state is a non-movable state, sets the operation start time of the on-touch timer to the time when the push-touch event occurred and starts the on-touch timer; measures the operation elapsed time of the on-touch timer while the push-touch event continues; and, if that time exceeds a threshold time, determines that the user interface is to be implemented and stops driving the on-touch timer.
The moving touch event processing module:
analyzes the moving touch event to obtain moving touch coordinates and, if the screen-movable state is a movable state, implements the user interface according to the moving touch coordinates.
The touch end event processing module:
sets the time at which the touch end event occurs as the off-touch timer driving start time and drives the off-touch timer, measures the elapsed time of the off-touch timer while the touch remains terminated, and, when the elapsed time reaches the off-timer operation limit time, stops the operation of the on-touch timer and the off-touch timer,
Wherein the screen control module comprises:
And restoring the desktop to the original state at the request of the touch end event processing module.
Wherein the target execution touch event processing module comprises:
And transmits the target information to the target command execution module, stops the operation of the on-touch timer and the off-touch timer when the screen-movable state is a non-movable state, and restores the desktop.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR20130056746 | 2013-05-20 | ||
KR1020130056746 | 2013-05-20 |
Publications (1)
Publication Number | Publication Date |
---|---|
KR20140136393A true KR20140136393A (en) | 2014-11-28 |
Family
ID=52456667
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
KR1020140059932A KR20140136393A (en) | 2013-05-20 | 2014-05-19 | Method for providing user interface and smart device thereof |
Country Status (1)
Country | Link |
---|---|
KR (1) | KR20140136393A (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20170050910A (en) * | 2015-11-02 | 2017-05-11 | 에스케이텔레콤 주식회사 | Apparatus and method for controlling touch in a touch screen |
- 2014-05-19: KR application KR1020140059932A, patent KR20140136393A/en, not active (Application Discontinuation)
Similar Documents
Publication | Publication Date | Title |
---|---|---|
TW212236B (en) | ||
US8907907B2 (en) | Display device with touch panel, event switching control method, and computer-readable storage medium | |
RU2552637C2 (en) | Device, system and method of remote control | |
EP3413163A1 (en) | Method for processing data collected by touch panel, and terminal device | |
JP5479414B2 (en) | Information processing apparatus and control method thereof | |
CN103513865A (en) | Touch control equipment and method and device for controlling touch control equipment to configure operation mode | |
US20140002393A1 (en) | Controlling a cursor on a touch screen | |
KR101156610B1 (en) | Method for input controlling by using touch type, and computer-readable recording medium with controlling program using touch type | |
WO1996024095A1 (en) | Method and an apparatus for simulating the states of a mechanical button on a touch-sensitive input device | |
CN103513882A (en) | Touch control equipment control method and device and touch control equipment | |
CN104536643B (en) | A kind of icon drag method and terminal | |
US9274702B2 (en) | Drawing device, drawing control method, and drawing control program for drawing graphics in accordance with input through input device that allows for input at multiple points | |
CN103513817A (en) | Touch control equipment and method and device for controlling touch control equipment to configure operation mode | |
US9846529B2 (en) | Method for processing information and electronic device | |
CN104076972A (en) | A device and a method for selecting a touch screen hot spot | |
CN105808129B (en) | Method and device for quickly starting software function by using gesture | |
CN104951213A (en) | Method for preventing false triggering of edge sliding gesture and gesture triggering method | |
CN103513886A (en) | Touch control device and target object moving method and device of touch control device | |
JP4653297B2 (en) | Control device, electronic device, and medium | |
WO2018046000A1 (en) | Touch operation method and device | |
JP2012113645A (en) | Electronic apparatus | |
KR20140136393A (en) | Method for providing user interface and smart device thereof | |
CN105808080B (en) | Method and device for quickly copying object by utilizing gesture | |
WO2016206438A1 (en) | Touch screen control method and device and mobile terminal | |
CN107506132B (en) | Display device, display method of display device, and storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
A201 | Request for examination | ||
E902 | Notification of reason for refusal | ||
E902 | Notification of reason for refusal | ||
E601 | Decision to refuse application |