KR101835952B1 - Apparatus and method for controlling scroll of screen - Google Patents

Apparatus and method for controlling scroll of screen

Info

Publication number
KR101835952B1
Authority
KR
South Korea
Prior art keywords
screen
interface
reference direction
touch input
time
Prior art date
Application number
KR1020160013025A
Other languages
Korean (ko)
Other versions
KR20170092020A (en)
Inventor
남인현
Original Assignee
선문대학교 산학협력단
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 선문대학교 산학협력단 filed Critical 선문대학교 산학협력단
Priority to KR1020160013025A priority Critical patent/KR101835952B1/en
Publication of KR20170092020A publication Critical patent/KR20170092020A/en
Application granted granted Critical
Publication of KR101835952B1 publication Critical patent/KR101835952B1/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04886Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0482Interaction with lists of selectable items, e.g. menus
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range

Abstract

Provided is a screen movement control apparatus including a sensor unit that detects the contact time of a touch input and a processor that outputs, to the screen, an interface including at least one reference direction indicating a direction in which the screen is to be moved. When a reference direction is selected by the touch input, the processor classifies which of three time intervals the contact time falls into and moves the screen along the selected reference direction by a distance corresponding to the classified time interval.

Description

APPARATUS AND METHOD FOR CONTROLLING SCROLL OF SCREEN

The present invention relates to a screen motion control apparatus and method, and more particularly, to an apparatus and method for controlling screen motion based on a user's touch input.

With the recent development of mobile technology, devices using touch sensors such as smart phones and smart pads are widely used. The user inputs a command by pressing or dragging the screen to which the touch sensor is attached.

If the screen-rubbing operation is performed excessively, the following side effects may occur.

First, the user's fingerprint may be damaged.

Second, because the user's fingertip is constrained to the two-dimensional plane of the screen, the finger and wrist (carpal) joints may be strained, which can damage the ligaments associated with those joints. The probability of this problem increases as the screen gets smaller, because the user moves the elbow less on a smaller screen.

In addition, devices and operating systems that support the screen-rubbing operation today cannot perform precise screen movement based on that operation: the screen is moved regardless of the distance the fingertip travels on the screen. It is therefore difficult for the user to move the screen or a map precisely by rubbing the screen in situations that require precise movement, such as design drawings, military or medical screens, or maps.

The present invention proposes a screen movement control apparatus and method that can move the screen when the user touches an interface on the screen, without a screen-rubbing operation.

The present invention proposes a screen movement control apparatus and method in which the distance to move the screen can be adjusted by adjusting the contact time when the interface on the screen is touched.

The present invention proposes a screen movement control apparatus and method that can use the screen more efficiently by outputting the interface to only one of the sides of the screen.

According to an embodiment of the present invention, there is provided a screen movement control apparatus including a sensor unit for detecting the contact time of a touch input and a processor for outputting, to a screen, an interface including at least one reference direction indicating a direction in which the screen is to be moved, wherein, when a reference direction is selected by the touch input, the processor classifies which of three time intervals the contact time falls into and moves the screen along the selected reference direction by a distance corresponding to the classified time interval.

According to an embodiment of the present invention, there is provided a screen movement control method including: outputting, to a screen, an interface including at least one reference direction indicating a direction in which the screen is to be moved; detecting the contact time of a touch input when a reference direction is selected through the touch input; classifying which of three time intervals the contact time falls into; and moving the screen along the selected reference direction by a distance corresponding to the classified time interval.

According to an embodiment of the present invention, a screen movement control apparatus and method can be provided that move the screen when the user touches an interface on the screen, without a screen-rubbing operation.

According to an embodiment of the present invention, a screen movement control apparatus and method can be provided in which the distance to move the screen is adjusted by adjusting the contact time when the interface on the screen is touched.

According to an embodiment of the present invention, a screen movement control apparatus and method can be provided that use the screen more efficiently by outputting the interface to only one of the sides of the screen.

1 is a diagram illustrating a structure of a screen motion control apparatus according to an embodiment of the present invention.
2A and 2B are diagrams illustrating a user interface screen output by the screen motion control apparatus according to an embodiment of the present invention.
3 is a flowchart illustrating operations performed by the screen motion control apparatus according to an embodiment of the present invention.
4A and 4B are views illustrating an example in which the screen motion control apparatus according to an embodiment of the present invention moves a screen based on a touch input of a user.
5 is a flowchart illustrating an operation of outputting an interface by the screen motion control apparatus according to an embodiment of the present invention.
6A and 6B are diagrams illustrating an example in which the scroll control apparatus according to an exemplary embodiment of the present invention outputs an interface to only one side of a screen.
7A to 7C are views showing an example of a detection area according to an embodiment of the present invention.

The specific structural or functional descriptions of the embodiments of the present invention disclosed herein are presented only for the purpose of describing embodiments according to the concepts of the present invention; embodiments according to these concepts may be carried out in various forms and are not limited to the embodiments described herein.

Embodiments in accordance with the concepts of the present invention are capable of various modifications and may take various forms, so that the embodiments are illustrated in the drawings and described in detail herein. However, it is not intended to limit the embodiments according to the concepts of the present invention to the specific disclosure forms, but includes changes, equivalents, or alternatives falling within the spirit and scope of the present invention.

The terms first, second, and the like may be used to describe various elements, but the elements should not be limited by these terms. The terms are used only to distinguish one element from another; for example, without departing from the scope of rights according to the concepts of the present invention, a first element may be referred to as a second element, and similarly, a second element may be referred to as a first element.

When an element is referred to as being "connected" or "coupled" to another element, it may be directly connected or coupled to the other element, or intervening elements may be present. In contrast, when an element is referred to as being "directly connected" or "directly coupled" to another element, there are no intervening elements. Other expressions describing the relationship between elements, such as "between" versus "immediately between" or "adjacent to" versus "directly adjacent to", should be interpreted in the same manner.

The terminology used herein is for the purpose of describing particular embodiments only and is not intended to limit the invention. Singular expressions include plural expressions unless the context clearly indicates otherwise. In this specification, terms such as "comprises" or "having" specify the presence of stated features, numbers, steps, operations, elements, parts, or combinations thereof, but do not preclude the presence or addition of one or more other features, numbers, steps, operations, elements, parts, or combinations thereof.

Unless defined otherwise, all terms used herein, including technical or scientific terms, have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. Terms such as those defined in commonly used dictionaries are to be interpreted as having a meaning consistent with their meaning in the context of the relevant art, and are not to be interpreted in an idealized or overly formal sense unless explicitly defined herein.

Hereinafter, embodiments will be described in detail with reference to the accompanying drawings. However, the scope of the patent application is not limited or restricted by these embodiments. Like reference numerals in the drawings denote like elements.

1 is a diagram illustrating a structure of a screen motion control apparatus according to an embodiment of the present invention.

According to an embodiment of the present invention, the device 101 may be a smartphone or a smart pad, which can change the screen based on a touch input of a user. Although the device 101 of FIG. 1 is shown on the basis of a smartphone or smart pad embodiment, the device 101 may be any display device, such as a personal computer or a laptop. The screen motion control apparatus according to one embodiment can be mounted as a part of the device 101.

Referring to FIG. 1, the screen motion control apparatus according to an exemplary embodiment may operate in conjunction with a screen 102 that displays content. The screen 102 may be any one of a liquid crystal display (LCD), a light emitting polymer display (LPD), a plasma display panel (PDP), or a light emitting diode (LED) display. In addition, the screen 102 may include electronic ink.

Referring to FIG. 1, the scroll control apparatus according to an exemplary embodiment may include a sensor unit 103 for detecting a touch input from a user. In response to at least one touch input, the sensor unit 103 can output the coordinates and contact time of each touch input. The sensor unit 103 can detect a touch input using any one of a resistive (pressure-sensitive), capacitive, infrared, or surface acoustic wave scheme.

Referring to FIG. 1, the screen motion control apparatus according to one embodiment may include a processor 104 that moves a screen displayed on the screen 102 based on a touch input of a user. The processor 104 may determine a direction in which to move the screen and a distance to move based on the touch input detected by the sensor unit 103. The processor 104 may display an interface on the screen 102 to collect information about movement of the screen from the user. The interface may include a reference direction indicating the direction to move the screen.

When the user touches the reference direction in the interface, the sensor unit 103 can detect the coordinates of the touch input and the contact time of the touch input. Processor 104 may identify the selected reference direction based on the coordinates of the detected touch input. In addition, the processor 104 may determine the distance to move the screen based on the detected contact time. The processor can move the screen based on the identified reference direction and the determined travel distance.

According to one embodiment of the present invention, the processor 104 may be a central processing unit (CPU) or a chipset. The processor 104 may fetch instructions implementing the screen movement control method from internal or external memory, and may perform the screen movement control method according to an embodiment based on the fetched instructions.

The scroll control device according to one embodiment may be provided in a stand-alone manner. That is, a screen movement control apparatus including only the screen 102, the sensor unit 103, and the processor 104 may be provided.

2A and 2B are diagrams illustrating a user interface screen output by the screen motion control apparatus according to an embodiment of the present invention. The screen motion control apparatus can output, along each side of the screen, an interface for collecting information on the movement of the screen.

The scroll control device may be implemented in a smart phone or a smart pad. In this case, the direction of the screen may be divided into a portrait mode, in which the long sides of the screen are perpendicular to the ground, and a landscape mode, in which the long sides of the screen are horizontal to the ground. FIG. 2A is a diagram illustrating a user interface screen output by the screen motion controller in the portrait mode, and FIG. 2B is a diagram illustrating a user interface screen output by the screen motion control apparatus in the landscape mode. Referring to FIGS. 2A and 2B, the interface output by the scroll control device can maintain the same shape regardless of the direction of the screen. Thus, the interface has rotational symmetry.

Referring to FIGS. 2A and 2B, the interface may include reference directions indicating directions in which to move the screen. The scroll control device can output three reference directions on each side of the screen. For each side, the three reference directions are the three of the up, down, left, and right directions of the screen that do not point toward the center of the screen with respect to that side. Referring to FIG. 2A, the screen motion control device outputs the upper reference direction 202, the left reference direction 201, and the right reference direction 203 on the upper side of the screen. Since the lower reference direction 204 faces the center of the screen with respect to the upper side, the screen motion control apparatus does not output the lower reference direction 204 on the upper side.
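As an illustration of this rule only (not part of the patent text; the side and direction names below are chosen here for clarity), the following Python sketch lists the three reference directions displayed on each side:

```python
# Minimal sketch: for each side of the screen, the reference directions are the
# three of up/down/left/right that do not point toward the screen center with
# respect to that side. Names are illustrative, not from the patent.
EXCLUDED_DIRECTION = {"top": "down", "bottom": "up", "left": "right", "right": "left"}
ALL_DIRECTIONS = ["up", "down", "left", "right"]

def reference_directions(side: str) -> list[str]:
    """Return the three reference directions shown along the given side."""
    return [d for d in ALL_DIRECTIONS if d != EXCLUDED_DIRECTION[side]]

print(reference_directions("top"))  # ['up', 'left', 'right'] (cf. 202, 201, 203 in FIG. 2A)
```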

Referring to FIGS. 2A and 2B, the screen motion control device forms a rectangular area along each side of the screen, and displays a reference direction within each area. However, the form of the interface displayed by the scroll control device is not limited to the rectangular shape. As another example, the scroll control device may form an elliptical or circular area, and may display a reference direction within each area. Further, the screen motion control device can display only the reference direction without displaying the formed area to the user.

The screen motion control device can receive, from the user, the direction in which the screen is to be moved by outputting the interface. For example, when the user selects the left reference direction 201, the screen motion control device can move the screen to the left. Further, the screen motion control device can determine how far to move the screen to the left based on the contact time during which the user's fingertip touches the left reference direction 201.

According to one embodiment, the scroll control device can output the interface to only one side of the screen. That is, the screen motion control apparatus can output the interface only on the upper side of the screen, in which case only the upper reference direction 202, the left reference direction 201, and the right reference direction 203 are output to the screen. The scroll control device can select the side of the screen on which to output the interface based on the touch input of the user or the software being executed. This will be described later in detail.

FIG. 3 is a flowchart illustrating operations performed by the scroll control apparatus according to an exemplary embodiment of the present invention. The screen motion control apparatus can move the screen by executing software for the screen movement control method. The software may be provided in the form of instruction code for the screen movement control method.

In step 310, the screen motion control apparatus according to an exemplary embodiment may output, to the screen, an interface including a reference direction indicating a direction in which the screen is to be moved. The scroll control device can output the interface based on a request from the running software. For example, map navigation software may request the scroll control device to output the interface. In response to the request, the screen motion control device may overlay the interface on the screen generated by the map navigation software.

According to another embodiment, the scroll control device can output the interface based on the touch input of the user. That is, the scroll control device may not output the interface until the user performs the touch input.

In step 320, the scroll control device according to an exemplary embodiment may detect a touch input of a user. The scroll control device can identify which reference direction the user has selected based on the coordinates of the touch input. If the coordinates of the touch input lie on the boundary between reference directions, the scroll control device can identify which reference direction has been selected according to the measured pressure. That is, the screen motion control apparatus can identify that the user has selected the reference direction at which the greater pressure was applied.

Further, the screen motion control device can measure the time (contact time) when the user performs the touch input. The scroll control device can identify a contact time of 0.2 seconds or less.

In step 330, the scroll control device according to one embodiment can classify which time interval the contact time falls into. When the contact time is between 0 seconds and 0.2 seconds, the scroll control device can classify the contact time into the first time interval. When the contact time is between 0.2 seconds and 0.5 seconds, the scroll control device can classify the contact time into the second time interval. When the contact time is 0.5 seconds or more, the scroll control device can classify the contact time into the third time interval.
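A minimal sketch of this classification step, using the 0.2-second and 0.5-second boundaries given above (the integer labels returned are chosen here for illustration only):

```python
def classify_contact_time(contact_time: float) -> int:
    """Classify a contact time in seconds into one of the three time intervals,
    using the boundaries of 0.2 s and 0.5 s described above."""
    if contact_time < 0.2:
        return 1  # first time interval: 0 s to less than 0.2 s
    if contact_time < 0.5:
        return 2  # second time interval: 0.2 s to less than 0.5 s
    return 3      # third time interval: 0.5 s or more
```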

In step 340, the screen motion control apparatus according to one embodiment can move the screen based on the touch input of the user. The scroll control device can move the screen in real time while the user touches the reference direction. According to another embodiment, the scroll control device can move the screen after the user stops the touch input.

The scroll control device can determine the direction in which to move the screen based on the selected reference direction. The screen motion control apparatus can determine the distance to move the screen based on the time interval including the contact time and the width of the screen corresponding to the selected reference direction.

More specifically, when the contact time falls in the first time interval, the screen motion control apparatus can move the screen along the selected reference direction by 1/3 to 1/2 times the width of the screen corresponding to that direction. When the contact time falls in the second time interval, the screen motion control apparatus can move the screen along the selected reference direction by 1/2 to 1 times that width. When the contact time falls in the third time interval, the screen motion control device can move the screen along the selected reference direction by 1 to 2 times that width.
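The step of converting the classified interval into a movement distance could look like the sketch below. The description gives a range for each interval (1/3–1/2, 1/2–1, and 1–2 times the screen width), so the single factor chosen from each range here is an assumption for illustration only:

```python
# Assumed representative factors, one value picked from each range above.
MOVE_FACTOR = {1: 1 / 3, 2: 1 / 2, 3: 1.0}

def move_distance(interval: int, screen_width: float) -> float:
    """Distance to move the screen along the selected reference direction,
    expressed as a multiple of the screen width corresponding to that direction."""
    return MOVE_FACTOR[interval] * screen_width
```

With these assumed factors, the worked example of FIGS. 4A and 4B below (a 0.15-second touch on the right reference direction 401) falls in the first interval, so the screen moves by roughly one third of the horizontal width 403.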

Therefore, the scroll control device can move the screen by three different distances corresponding to the three time intervals. Because the screen motion control apparatus determines the movement distance discretely according to the time interval, the user can control the distance to move the screen more precisely. This will be described in more detail with reference to FIGS. 4A and 4B.

4A and 4B are views illustrating an example in which the screen motion control apparatus according to an embodiment of the present invention moves a screen based on a touch input of a user. 4A is a diagram showing a moment when a user selects a reference direction.

Referring to FIG. 4A, a map may be displayed on screen 402. The user can move the map by rubbing the screen 402 directly or by selecting a reference direction included in the interface. Assume that the user has touched the right reference direction 401 included in the interface for 0.15 seconds.

The scroll control device can determine the direction to move the screen 402 or the distance to move the screen 402 based on the touch input of the user. Since the direction in which the screen 402 is moved is determined based on the selected reference direction, the screen 402 can move in the right direction.

The distance to move the screen 402 may be determined based on the contact time of the touch input and the width of the screen corresponding to the selected reference direction. Since the right reference direction 401 has been selected, the distance to move the screen 402 can be determined based on the horizontal width 403 of the screen.

The scroll control device can determine the distance to move the screen based on the time interval that includes the contact time. Under the above assumption, the user has touched the right reference direction 401 for 0.15 seconds, and 0.15 seconds falls in the first time interval. Accordingly, the screen motion control device can move the screen 402 along the right reference direction 401 by 1/3 to 1/2 times the width 403 of the screen.

4B is a diagram showing the result of the scroll control device moving the screen based on the touch input of the user. Referring to FIG. 4B, it can be seen that the screen 402 has moved by about 1/3 of the width 403 of the screen along the right reference direction 401. Since the screen 402 is moved discretely according to the time interval, the distance by which the user moves the screen can be determined more accurately. In addition, since the distance to move the screen 402 can be set differently for each of the three time intervals, the user can control the distance to move the screen 402 more freely.

According to one embodiment, the scroll control device may move the screen based on the measured contact time when the user releases the touch on the screen 402. In this case, the screen motion control device moves the screen only after the user lifts the finger.

According to another embodiment, the screen motion control device can move the screen in real time while the user is making the touch input. For example, assume that the user touches the right reference direction 401 for 0.4 seconds. While the contact time is within the first time interval, the screen motion control device moves the screen 402 along the right reference direction 401 by 1/3 to 1/2 times the width 403. When the contact time exceeds 0.2 seconds and thus falls into the second time interval, the screen movement control device moves the screen 402 further along the right reference direction 401 so that the total movement is 1/2 to 1 times the width 403.
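A minimal sketch of this real-time variant, assuming hypothetical callbacks `is_touch_held()` and `scroll_by(distance)` supplied by the host application (they are not named in the patent), and reusing `classify_contact_time` and `move_distance` from the sketches above:

```python
import time

def move_while_touching(is_touch_held, scroll_by, screen_width):
    """While the touch is held, extend the movement each time the contact time
    crosses an interval boundary (0.2 s, 0.5 s), so the total distance moved
    always matches the interval the contact time currently falls in."""
    start = time.monotonic()
    moved = 0.0
    while is_touch_held():
        interval = classify_contact_time(time.monotonic() - start)
        target = move_distance(interval, screen_width)
        if target > moved:
            scroll_by(target - moved)  # move only by the additional amount
            moved = target
        time.sleep(0.01)  # polling period; an arbitrary choice for this sketch
```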

The boundary value of 0.2 seconds between the first and second time intervals in the above embodiments can be determined heuristically, taking into account the sensitivity and responsiveness of sensor units capable of sensing touch input on recent touch panels, and the touch durations whose differences a user can reliably produce and distinguish. As another example, the boundary value of 0.5 seconds between the second and third time intervals can likewise be determined based on the electrical specifications of the sensor unit and user experience.

When the screen motion controller divides the contact time into the respective time intervals, if the intervals are too short, the screen 402 may be moved farther than the user intends. On the other hand, if the intervals are too long, the user must keep touching the screen 402 for a long time, which can make the user uncomfortable.

The boundary values of 0.2 seconds and 0.5 seconds set between the first and third time intervals in the above embodiments allow the user to move the screen 402 a long distance without moving it farther than intended, while keeping the required contact times short enough that holding the touch is not a great inconvenience. That is, when the boundary values are set to 0.2 seconds and 0.5 seconds, the user can reach the third time interval without great inconvenience and can easily distinguish the first and second time intervals, so the distance to move the screen 402 can be adjusted as intended.

In addition, the boundary value of 0.2 seconds between the first and second time intervals is set in consideration of the minimum time the sensor unit needs to detect whether the user is touching and to measure the contact time. If this boundary value were set shorter than 0.2 seconds, the sensor unit might not accurately identify whether the user is touching.

5 is a flowchart illustrating an operation of outputting an interface by the screen motion control apparatus according to an embodiment of the present invention. The screen motion control apparatus according to one embodiment may perform the operation of FIG. 5 in step 310 of FIG.

In step 510, the scroll control device according to an exemplary embodiment may detect a touch input of a user. The scroll control device can determine whether to output the interface based on the coordinates of the touch input. When the user performs a gesture to move the screen, the scroll control device can decide to output the interface. For example, if the user performs a gesture of rubbing the screen, or if the user touches the edge of the screen, the scroll control device may decide to output the interface.

In step 520, the screen motion control apparatus according to one embodiment can select the side of the screen on which to output the interface, based on the touch input. The screen motion control device can output the interface to the side of the screen located closest to the coordinate of the touch input.
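A sketch of selecting the nearest side from the touch coordinate, assuming a coordinate system with the origin at the top-left corner (an assumption for illustration; the patent does not specify one):

```python
def nearest_side(x: float, y: float, width: float, height: float) -> str:
    """Return the side of the screen ('left', 'right', 'top', 'bottom')
    closest to the touch coordinate (x, y)."""
    distances = {"left": x, "right": width - x, "top": y, "bottom": height - y}
    return min(distances, key=distances.get)
```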

According to another embodiment, the scroll control device can divide the screen into predetermined detection areas. The scroll control device can select the side of the screen based on the detection area that includes the coordinates of the touch input. The detection areas set on the screen by the scroll control device will be described in detail later.

In step 530, the screen motion control apparatus according to an exemplary embodiment may output the interface to the selected side of the screen. In this way, the screen motion control device outputs the interface to only one side of the screen, thereby increasing the utilization of the screen.

6A and 6B are diagrams illustrating an example in which the screen motion control apparatus according to an embodiment of the present invention outputs an interface to only one side of the screen 600.

6A is a view showing a state in which a user touches a specific coordinate of the screen 600. Referring to FIG. 6A, when the map navigation software is running, a map may be displayed on the screen 600. In order to increase the utilization of the screen, the screen motion control device may not output the interface for moving the screen while there is no touch input. When the user touches the specific coordinate 610 of the screen, the scroll control device can determine whether to output the interface based on the specific coordinate 610.

Referring to FIG. 6A, since the specific coordinate 610 is located at the edge of the screen, the scroll control device can decide to output the interface. Furthermore, the screen motion control device can determine to output the interface to the right side since the right side is closest to the specific coordinate 610 among the sides of the screen. According to another embodiment, when the specific coordinate 610 is located at the edge of the screen, the screen motion control device can output an interface to the variable portion of the screen.

6B is a diagram showing an example in which the scroll control device outputs the interface 620 to the right side of the screen. Referring to FIG. 6B, the scroll control device may not output the interface to the other side except the right side. Since the screen motion control apparatus does not output the interface to the side of the screen that is distant from the specific coordinate 610, the utilization of the screen can be increased.

According to one embodiment of the present invention, the screen motion control device can output the interface to the side of the screen corresponding to the detection area that includes the specific coordinate 610, based on which detection area of the screen the specific coordinate 610 falls in. The screen motion control device can set the detection areas by dividing the screen before the interface is displayed.

7A to 7C are views showing an example of a detection area according to an embodiment of the present invention. When the user touches the coordinates in the specific detection area, the screen motion control device can output the interface to the side of the screen corresponding to the specific detection area.

Referring to FIG. 7A, the screen motion controller can set five detection areas 701, 702, 703, 704, and 705 on the screen. The detection areas 701, 702, 703, and 704 are set to the number of sides of the screen and can each be set to include one side of the screen. For example, the detection area 701 includes the right side of the screen, the detection area 704 includes the upper side, the detection area 703 includes the left side, and the detection area 702 includes the bottom side. The detection area 705 is spaced a set distance from every side of the screen, so that no side of the screen is included in the detection area 705. The distance by which the detection area 705 is spaced from each side may be set differently for each side. When the user selects the detection area 701, the scroll control device can output the interface 706 to the right side associated with the detection area 701.
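One way to realize the five detection areas of FIG. 7A is sketched below. The band width (`margin`) and the tie-breaking order at the corners are assumptions; the description only states that area 705 is spaced a set distance from every side. A top-left origin is again assumed:

```python
def detection_area_fig7a(x, y, width, height, margin=80):
    """Map a touch coordinate to one of the five detection areas of FIG. 7A.
    Returns the side whose interface should be output, or None for area 705."""
    if x >= width - margin:
        return "right"   # detection area 701 (right side)
    if y >= height - margin:
        return "bottom"  # detection area 702 (bottom side)
    if x <= margin:
        return "left"    # detection area 703 (left side)
    if y <= margin:
        return "top"     # detection area 704 (top side)
    return None          # detection area 705: no interface is output
```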

The scroll control device may not output the interface if the selected detection area is not related to the side of the screen. Referring to FIG. 7A, it can be seen that the detection area 705 does not include any sides of the screen. Accordingly, when the user selects the detection area 705, the scroll control device may not display the interface.

Referring to FIG. 7B, the screen motion control device can set four detection areas 711, 712, 713, and 714 on the screen based on two diagonal lines connecting opposing vertexes of the screen. The screen motion control device can detect which one of the four detection areas 711, 712, 713, and 714 the user selects and then output the interface to the side of the screen corresponding to the selected detection area.
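The partition of FIG. 7B can be computed by testing the touch coordinate against the two diagonals, as in the sketch below (again assuming a top-left origin; naming each triangle after its adjacent side mirrors how the description associates area 712 with the lower side):

```python
def detection_area_fig7b(x: float, y: float, width: float, height: float) -> str:
    """Return which of the four triangular detection areas of FIG. 7B contains
    (x, y), named by the adjacent side of the screen."""
    above_main = y * width < x * height            # above diagonal (0,0)-(width,height)
    above_anti = y * width < (width - x) * height  # above diagonal (width,0)-(0,height)
    if above_main and above_anti:
        return "top"
    if not above_main and not above_anti:
        return "bottom"
    return "right" if above_main else "left"
```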

Referring to FIG. 7B, when the user selects the detection area 712, the scroll control device can output the interface 715 to the lower side of the screen corresponding to the detection area 712.

Referring to FIG. 7C, the scroll control device can set four detection areas 721, 722, 723, and 724 on the screen based on the vertical direction and the horizontal direction of the screen. Likewise, the scroll control device can output the interface to the side of the screen corresponding to the detection area selected by the user.

The scroll control device may not display the set detection area on the screen. However, according to another embodiment, the screen motion control device can display the set detection area on the screen for the convenience of the user.

The apparatus described above may be implemented as hardware components, software components, and/or a combination of hardware and software components. For example, the apparatus and components described in the embodiments may be implemented in a computer system using, for example, a processor, a controller, an arithmetic logic unit (ALU), a digital signal processor, a microcomputer, a field programmable gate array (FPGA), a programmable logic unit (PLU), a microprocessor, or any other device capable of executing and responding to instructions. The processing device may execute an operating system (OS) and one or more software applications running on the operating system, and may access, store, manipulate, process, and generate data in response to the execution of the software. For ease of understanding, the processing device may be described as being used singly, but those skilled in the art will recognize that the processing device may include a plurality of processing elements and/or multiple types of processing elements. For example, the processing device may include a plurality of processors, or one processor and one controller. Other processing configurations, such as parallel processors, are also possible.

The software may include a computer program, code, instructions, or a combination of one or more of these, and may configure the processing device to operate as desired or may instruct the processing device independently or collectively. The software and/or data may be embodied permanently or temporarily in any type of machine, component, physical device, virtual equipment, computer storage medium or device, or transmitted signal wave, so as to be interpreted by the processing device or to provide instructions or data to the processing device. The software may be distributed over networked computer systems and stored or executed in a distributed manner. The software and data may be stored on one or more computer-readable recording media.

The method according to an embodiment may be implemented in the form of program instructions that can be executed through various computer means and recorded on a computer-readable medium. The computer-readable medium may include program instructions, data files, data structures, and the like, alone or in combination. The program instructions recorded on the medium may be specially designed and configured for the embodiments, or may be known and available to those skilled in the art of computer software. Examples of computer-readable media include magnetic media such as hard disks, floppy disks, and magnetic tape; optical media such as CD-ROMs and DVDs; magneto-optical media such as floptical disks; and hardware devices specially configured to store and execute program instructions, such as ROM, RAM, and flash memory. Examples of program instructions include machine language code, such as that produced by a compiler, as well as high-level language code that can be executed by a computer using an interpreter or the like. The hardware devices described above may be configured to operate as one or more software modules to perform the operations of the embodiments, and vice versa.

While the present invention has been particularly shown and described with reference to exemplary embodiments thereof, the invention is not limited to the disclosed embodiments. For example, appropriate results may be achieved even if the described techniques are performed in a different order from the described method, and/or if components of the described systems, structures, devices, or circuits are combined or coupled in a form different from the described method, or are replaced or substituted by other components or equivalents.

Therefore, other implementations, other embodiments, and equivalents to the claims are also within the scope of the following claims.

101: Device
102: Screen
103: Sensor unit
104: Processor

Claims (11)

A screen movement control apparatus comprising:
a processor configured to output, to a screen and based on a first touch input, an interface including at least one reference direction indicating a direction in which to move the screen; and
a sensor unit configured to detect a contact time of a second touch input, which is input after the first touch input and corresponds to the interface,
wherein the processor:
when a reference direction is selected by the second touch input, classifies which of three time intervals the contact time falls into and moves the screen along the selected reference direction by a distance corresponding to the classified time interval; and
determines whether to output the interface, and to which of the sides of the screen to output it, based on which of a plurality of detection areas dividing the screen includes the first touch input,
wherein the plurality of detection areas include detection areas, set to the number of sides of the screen and each configured to include one side of the screen, for outputting the interface to any one side of the screen, and a detection area, spaced a set distance from every side of the screen, for which the interface is not output, and
wherein the three time intervals are:
a first time interval of 0 seconds or more and less than 0.2 seconds, in which the distance to move the screen along the selected reference direction is determined to be 1/3 times or more and less than 1/2 times the width of the screen;
a second time interval of 0.2 seconds or more and less than 0.5 seconds, in which the distance to move the screen along the selected reference direction is determined to be 1/2 times or more and less than 1 times the width of the screen; and
a third time interval of 0.5 seconds or more, in which the distance to move the screen along the selected reference direction is determined to be 1 time or more and less than 2 times the width of the screen.
delete
delete
delete
delete
delete
The screen movement control apparatus of claim 1, wherein the processor displays, on a side of the screen, as the reference directions, the remaining three directions among the up, down, left, and right directions of the screen, excluding the direction toward the center of the screen with respect to that side.
A screen movement control method comprising:
determining, based on a first touch input, whether to output, to a screen, an interface including at least one reference direction indicating a direction in which to move the screen;
detecting a contact time of a second touch input when the interface is output to the screen and a reference direction is selected through the second touch input, which is input corresponding to the interface;
classifying which of three time intervals the contact time falls into; and
moving the screen along the selected reference direction by a distance corresponding to the classified time interval,
wherein the determining comprises determining whether to output the interface based on which of a plurality of detection areas dividing the screen includes the first touch input,
wherein the plurality of detection areas include detection areas, set to the number of sides of the screen and each configured to include one side of the screen, for outputting the interface to any one side of the screen, and a detection area, spaced a set distance from every side of the screen, for which the interface is not output, and
wherein the three time intervals are:
a first time interval of 0 seconds or more and less than 0.2 seconds, in which the distance to move the screen along the selected reference direction is determined to be 1/3 times or more and less than 1/2 times the width of the screen;
a second time interval of 0.2 seconds or more and less than 0.5 seconds, in which the distance to move the screen along the selected reference direction is determined to be 1/2 times or more and less than 1 times the width of the screen; and
a third time interval of 0.5 seconds or more, in which the distance to move the screen along the selected reference direction is determined to be 1 time or more and less than 2 times the width of the screen.
delete
delete
A computer-readable recording medium on which a program for executing the method of claim 8 is recorded.
KR1020160013025A 2016-02-02 2016-02-02 Apparatus and method for controlling scroll of screen KR101835952B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020160013025A KR101835952B1 (en) 2016-02-02 2016-02-02 Apparatus and method for controlling scroll of screen

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020160013025A KR101835952B1 (en) 2016-02-02 2016-02-02 Apparatus and method for controlling scroll of screen

Publications (2)

Publication Number Publication Date
KR20170092020A KR20170092020A (en) 2017-08-10
KR101835952B1 true KR101835952B1 (en) 2018-03-08

Family

ID=59652305

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020160013025A KR101835952B1 (en) 2016-02-02 2016-02-02 Apparatus and method for controlling scroll of screen

Country Status (1)

Country Link
KR (1) KR101835952B1 (en)

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101182076B1 (en) * 2011-03-31 2012-09-11 선문대학교 산학협력단 Apparatus and method of screen scrolling for portable terminals with touch screen

Also Published As

Publication number Publication date
KR20170092020A (en) 2017-08-10

Similar Documents

Publication Publication Date Title
US9910527B2 (en) Interpretation of pressure based gesture
US10042546B2 (en) Systems and methods to present multiple frames on a touch screen
US9678606B2 (en) Method and device for determining a touch gesture
US9746975B2 (en) Capacitive measurement processing for mode changes
US20140189579A1 (en) System and method for controlling zooming and/or scrolling
US9798417B2 (en) Thermal baseline relaxation
US8743065B2 (en) Method of identifying a multi-touch rotation gesture and device using the same
KR20100108116A (en) Apparatus and method for recognizing touch gesture
US20120249599A1 (en) Method of identifying a multi-touch scaling gesture and device using the same
US20160196034A1 (en) Touchscreen Control Method and Terminal Device
US9280284B2 (en) Method, apparatus and computer readable medium for polygon gesture detection and interaction
US20120249487A1 (en) Method of identifying a multi-touch shifting gesture and device using the same
US20160349871A1 (en) Capacitive stereoscopic image sensing
KR102224932B1 (en) Apparatus for processing user input using vision sensor and method thereof
CN105468214B (en) Location-based object classification
US20140176448A1 (en) Detecting a gesture
KR20160019449A (en) Disambiguation of indirect input
JP2020170297A (en) Input device
CN107272971B (en) Grip management
KR102210045B1 (en) Apparatus and method for contrlling an input of electronic device having a touch device
KR101393733B1 (en) Touch screen control method using bezel area
US10078406B2 (en) Capacitive side position extrapolation
KR101835952B1 (en) Apparatus and method for controlling scroll of screen
KR101422447B1 (en) Method and apparatus for changing page of e-book using pressure modeling
US9244608B2 (en) Method and system for gesture identification

Legal Events

Date Code Title Description
A201 Request for examination
E902 Notification of reason for refusal
E701 Decision to grant or registration of patent right