KR20170092020A - Apparatus and method for controlling scroll of screen - Google Patents
- Publication number
- KR20170092020A
- Authority
- KR
- South Korea
- Prior art keywords
- screen
- reference direction
- touch input
- interface
- time
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04886—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/0482—Interaction with lists of selectable items, e.g. menus
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- User Interface Of Digital Computer (AREA)
- Position Input By Displaying (AREA)
Abstract
Description
The present invention relates to a screen motion control apparatus and method, and more particularly, to an apparatus and method for controlling screen motion based on a user's touch input.
With the recent development of mobile technology, devices using touch sensors such as smart phones and smart pads are widely used. The user inputs a command by pressing or dragging the screen to which the touch sensor is attached.
If the operation of rubbing (dragging across) the screen is performed excessively, the following side effects may occur.
First, the user's fingerprint may be damaged.
Second, since the tip of the user's finger is constrained to the two-dimensional plane of the screen, the finger and wrist (carpal) joints may be strained. This can damage the ligaments associated with those joints. The probability of such problems increases as the size of the screen decreases, because the user moves the elbow less on a smaller screen.
In addition, devices and operating systems that support the screen-rubbing operation today cannot perform precise screen movement based on that operation. The device or operating system moves the screen without regard to the exact distance the fingertip travels across it. It is therefore difficult for the user to move a screen or map precisely by rubbing in situations requiring precise movement, such as design drawings or military and medical screens and maps.
The present invention proposes a screen motion control apparatus and method capable of performing screen motion by touching an interface on a screen without performing a screen rubbing operation.
The present invention proposes a screen movement control device and method capable of adjusting the distance to move the screen by adjusting the contact time when touching the interface on the screen.
The present invention proposes a screen movement control apparatus and method that can utilize a screen more efficiently by outputting an interface to only one of the sides of the screen.
According to an embodiment of the present invention, there is provided a screen movement control apparatus including a sensor unit for detecting a contact time of a touch input, and a processor for outputting to the screen an interface including at least one reference direction indicating a direction in which the screen is to be moved, wherein, when a reference direction is selected by the touch input, the processor classifies which of three time periods the contact time falls into and moves the screen along the selected reference direction by a size corresponding to the classified time period.
According to an embodiment of the present invention, there is provided a screen movement control method comprising: outputting to a screen an interface including at least one reference direction indicating a direction in which the screen is to be moved; detecting a contact time of a touch input when a reference direction is selected through the touch input; classifying which of three time periods the contact time falls into; and moving the screen along the selected reference direction by a size corresponding to the classified time period.
According to an embodiment of the present invention, a screen movement control apparatus and method capable of performing screen movement by touching an interface on a screen without performing a screen rubbing operation can be provided.
According to an embodiment of the present invention, a screen movement control apparatus and method capable of adjusting a distance to move a screen by adjusting a contact time when an interface on a screen is touched can be provided.
According to an embodiment of the present invention, it is possible to provide a screen movement control apparatus and method that can utilize a screen more efficiently by outputting an interface to only one of the sides of the screen.
1 is a diagram illustrating a structure of a screen motion control apparatus according to an embodiment of the present invention.
2 is a diagram illustrating a user interface screen output by the screen motion control apparatus according to an exemplary embodiment of the present invention. 3 is a flowchart illustrating an operation performed by the screen motion control apparatus according to an exemplary embodiment of the present invention.
4A and 4B are views illustrating an example in which the screen motion control apparatus according to an embodiment of the present invention moves a screen based on a touch input of a user.
5 is a flowchart illustrating an operation of outputting an interface by the screen motion control apparatus according to an embodiment of the present invention.
6A and 6B are diagrams illustrating an example in which the scroll control apparatus according to an exemplary embodiment of the present invention outputs an interface to only one side of a screen.
7A to 7C are views showing an example of a detection area according to an embodiment of the present invention.
It is to be understood that the specific structural or functional descriptions of embodiments disclosed herein are presented only for the purpose of describing embodiments in accordance with the concepts of the present invention; the embodiments may be carried out in various forms and are not limited to those described herein.
Embodiments in accordance with the concepts of the present invention are capable of various modifications and may take various forms, so that the embodiments are illustrated in the drawings and described in detail herein. However, it is not intended to limit the embodiments according to the concepts of the present invention to the specific disclosure forms, but includes changes, equivalents, or alternatives falling within the spirit and scope of the present invention.
The terms first, second, and the like may be used to describe various elements, but the elements should not be limited by these terms. The terms are used only to distinguish one element from another; for example, without departing from the scope of rights according to the concepts of the present invention, a first element may be termed a second element, and similarly, a second element may also be termed a first element.
When an element is referred to as being "connected" or "coupled" to another element, it may be directly connected or coupled to the other element, or intervening elements may be present. In contrast, when an element is referred to as being "directly connected" or "directly coupled" to another element, it should be understood that there are no intervening elements. Other expressions describing the relationship between elements, such as "between" versus "immediately between" or "adjacent to" versus "directly adjacent to", should be interpreted in the same manner.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to limit the invention. Singular expressions include plural expressions unless the context clearly indicates otherwise. In this specification, terms such as "comprises" or "having" are intended to specify the presence of stated features, numbers, steps, operations, elements, parts, or combinations thereof, but do not preclude the presence or addition of one or more other features, numbers, steps, operations, elements, parts, or combinations thereof.
Unless defined otherwise, all terms used herein, including technical or scientific terms, have the same meanings as commonly understood by one of ordinary skill in the art to which this invention belongs. Terms such as those defined in commonly used dictionaries are to be interpreted as having meanings consistent with their meanings in the context of the relevant art and, unless explicitly defined herein, are not to be interpreted in an idealized or overly formal sense.
Hereinafter, embodiments will be described in detail with reference to the accompanying drawings. However, the scope of the patent application is not limited or restricted by these embodiments. Like reference symbols in the drawings denote like elements.
1 is a diagram illustrating a structure of a screen motion control apparatus according to an embodiment of the present invention.
According to an embodiment of the present invention, the
Referring to FIG. 1, the screen motion control apparatus according to an exemplary embodiment may operate in conjunction with a
Referring to FIG. 1, the scroll control apparatus according to an exemplary embodiment may include a
Referring to FIG. 1, the screen motion control apparatus according to one embodiment may include a
When the user touches the reference direction in the interface, the
According to one embodiment of the present invention, the
The scroll control device according to one embodiment may be provided in a stand-alone manner. That is, a screen movement control apparatus including only the
2A and 2B are diagrams illustrating a user interface screen output by the screen motion control apparatus according to an embodiment of the present invention. The screen motion control apparatus can output an interface for collecting information on the movement of the screen on the screen along each side of the screen.
The scroll control device may be implemented in a smart phone or a smart pad. In this case, the orientation of the screen may be divided into a portrait mode, in which the long sides of the screen are perpendicular to the ground, and a landscape mode, in which the long sides of the screen are horizontal to the ground. FIG. 2A is a diagram illustrating a user interface screen output by the screen motion control apparatus in the portrait mode. FIG. 2B is a diagram illustrating a user interface screen output by the screen motion control apparatus in the landscape mode. Referring to FIGS. 2A and 2B, the interface output by the scroll control device can maintain the same shape regardless of the orientation of the screen; the interface thus has rotational symmetry.
Referring to FIGS. 2A and 2B, the interface may include reference directions indicating a direction in which to move the screen. The scroll control device can output three reference directions along each side of the screen. The three reference directions output on each side are the three directions, among the top, bottom, left, and right directions of the screen, other than the direction toward the center of the screen with respect to that side. Referring to FIG. 2A, the screen motion control device outputs the
Referring to FIGS. 2A and 2B, the screen motion control device forms a rectangular area along each side of the screen, and displays a reference direction within each area. However, the form of the interface displayed by the scroll control device is not limited to the rectangular shape. As another example, the scroll control device may form an elliptical or circular area, and may display a reference direction within each area. Further, the screen motion control device can display only the reference direction without displaying the formed area to the user.
By outputting the interface, the screen motion control device can receive from the user the direction in which the screen is to be moved. For example, when the user selects the
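The rule above for choosing the three reference directions per side can be sketched as follows. This is an illustrative assumption about how such a selection could be coded; the direction names and data representation are ours, not the patent's.

```python
# Illustrative sketch (not from the patent): pick, for a given side of
# the screen, the three reference directions to display -- all four
# directions except the one pointing toward the screen's center.

DIRECTIONS = {"up", "down", "left", "right"}

# From each side, the direction that points toward the center of the screen.
TOWARD_CENTER = {
    "top": "down",
    "bottom": "up",
    "left": "right",
    "right": "left",
}

def reference_directions(side: str) -> set:
    """Return the three reference directions shown along the given side."""
    return DIRECTIONS - {TOWARD_CENTER[side]}
```

For the top side this yields up, left, and right, consistent with the description of excluding only the center-pointing direction.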
According to one embodiment, the scroll control device can output the interface along only one side of the screen. That is, the screen motion control apparatus can output the interface along only the upper side of the screen, and only the
FIG. 3 is a flowchart illustrating an operation performed by the scroll control apparatus according to an exemplary embodiment of the present invention. The screen motion control apparatus can move the screen by executing software for the screen motion control method. This software may take the form of instruction code implementing the scroll control method.
In
According to another embodiment, the scroll control device can output the interface based on the touch input of the user. That is, the scroll control device may not output the interface until the user performs the touch input.
In
Further, the screen motion control device can measure the time (contact time) when the user performs the touch input. The scroll control device can identify a contact time of 0.2 seconds or less.
In
In
The scroll control device can determine the direction in which to move the screen based on the selected reference direction. The screen motion control apparatus can determine the distance to move the screen based on the time interval including the contact time and the width of the screen corresponding to the selected reference direction.
More specifically, when the contact time falls within the first time period, the screen motion control apparatus can move the screen along the selected reference direction by a size of 1/3 to 1/2 times the width of the screen corresponding to the selected reference direction. When the contact time falls within the second time period, the apparatus can move the screen along the selected reference direction by a size of 1/2 to 1 times that width. When the contact time falls within the third time period, the apparatus can move the screen along the selected reference direction by a size of 1 to 2 times that width.
Therefore, the scroll control device can move the screen by three different lengths corresponding to the three time periods. Because the screen motion control apparatus determines the distance to move the screen discretely according to the time period, the user can control the movement distance more precisely. This will be described in more detail with reference to FIGS. 4A and 4B.
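The discrete mapping from contact time to scroll distance described above can be sketched as follows. The interval boundaries (0.2 s and 0.5 s) and multiplier ranges come from the text; picking the lower bound of each range as the concrete multiplier is our illustrative assumption, since the text only gives ranges.

```python
# Sketch of the three-interval classification and the discrete
# scroll-distance rule. Boundaries (0.2 s, 0.5 s) and multiplier
# ranges follow the description; using the lower bound of each range
# is an illustrative choice, not specified by the patent.

def classify_interval(contact_time: float) -> int:
    """Map a contact time in seconds to time interval 1, 2, or 3."""
    if contact_time <= 0.2:
        return 1
    if contact_time <= 0.5:
        return 2
    return 3

# Multiplier range per interval, as fractions of the screen width
# measured along the selected reference direction.
MULTIPLIER_RANGES = {1: (1 / 3, 1 / 2), 2: (1 / 2, 1.0), 3: (1.0, 2.0)}

def scroll_distance(contact_time: float, screen_width: float) -> float:
    """Distance to move the screen, using each range's lower bound."""
    low, _high = MULTIPLIER_RANGES[classify_interval(contact_time)]
    return low * screen_width
```

On a 300-pixel-wide screen, a 0.3-second touch would thus move the screen by half its width (150 pixels), and a 0.7-second touch by a full width.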
4A and 4B are views illustrating an example in which the screen motion control apparatus according to an embodiment of the present invention moves a screen based on a touch input of a user. 4A is a diagram showing a moment when a user selects a reference direction.
Referring to FIG. 4A, a map may be displayed on
The scroll control device can determine the direction to move the
The distance to move the
The scroll control device can determine the distance to move the screen based on the time period including the contact time. According to the above assumption, the user has touched the
4B is a diagram showing a result of moving the screen based on the touch input of the user by the scroll control device. Referring to FIG. 4B, it can be seen that the
According to one embodiment, the scroll control device may move the screen based on the measured contact time when the user releases his / her hand on the
According to another embodiment, the screen motion control device can move the screen in real time while a user is making a touch input. For example, assume that the user touches the
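The two embodiments, moving once on release versus updating in real time while the touch is held, could be contrasted as in the following sketch. Because the passage above is abridged, the update-on-boundary-crossing behavior shown for the real-time case is our assumption; only the interval boundaries come from the text.

```python
# Illustrative contrast of the two embodiments described above.
# Interval boundaries follow the text; using each interval's lower
# multiplier bound, and updating only when the held touch crosses an
# interval boundary, are assumptions made for this sketch.

# (upper bound of interval in seconds, width multiplier for that interval)
INTERVALS = [(0.2, 1 / 3), (0.5, 1 / 2), (float("inf"), 1.0)]

def offset_for(elapsed: float, width: float) -> float:
    """Scroll offset for a given hold time."""
    for upper, mult in INTERVALS:
        if elapsed <= upper:
            return mult * width
    raise ValueError("unreachable")

def on_release(hold_time: float, width: float) -> float:
    """First embodiment: move the screen once, when the finger lifts."""
    return offset_for(hold_time, width)

def realtime(samples, width):
    """Second embodiment: emit a new offset whenever the elapsed hold
    time crosses into a new interval."""
    offsets, last = [], None
    for elapsed in samples:
        off = offset_for(elapsed, width)
        if off != last:
            offsets.append(off)
            last = off
    return offsets
```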
The 0.2-second boundary value between the first and second time intervals exemplified in the above embodiments takes into account the sensitivity and responsiveness of sensor units that can sense touch input on recent touch panels, as well as the touch-input durations whose differences a user can distinguish, determined heuristically. As another example, the 0.5-second boundary value between the second and third time intervals can likewise be determined based on the electrical specifications of the sensor unit and the user's experience.
When the screen motion controller divides the contact time into the respective time periods, if the interval between the time periods is too short, the
The 0.2-second and 0.5-second boundary values set between the first through third time periods in the above embodiments allow the user to move the screen 402 a long distance without moving the
In addition, the 0.2-second boundary value between the first and second time intervals is set in consideration of the minimum time for which the sensor unit can detect the presence or absence of a user's touch and measure the contact time. If this boundary value were set shorter than 0.2 seconds, the sensor unit might not accurately identify whether the user has touched the screen.
5 is a flowchart illustrating an operation of outputting an interface by the screen motion control apparatus according to an embodiment of the present invention. The screen motion control apparatus according to one embodiment may perform the operation of FIG. 5 in
In
In
According to another embodiment, the scroll control device can divide the screen into predetermined detection areas. The scroll control device can select the side of the screen based on the detection area that includes the coordinates of the touch input. The detection areas set on the screen by the scroll control device will be described in detail later.
In
6A and 6B are diagrams illustrating an example in which the screen motion control apparatus according to an embodiment of the present invention outputs an interface to only one side of the
6A is a view showing a state in which a user touches a specific coordinate of the
Referring to FIG. 6A, since the specific coordinate 610 is located at the edge of the screen, the scroll control device can decide to output the interface. Furthermore, the screen motion control device can determine to output the interface to the right side since the right side is closest to the specific coordinate 610 among the sides of the screen. According to another embodiment, when the specific coordinate 610 is located at the edge of the screen, the screen motion control device can output an interface to the variable portion of the screen.
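The decision sketched in this passage, showing the interface only when the touch lands near an edge and on the side nearest to it, might look like the following. The pixel threshold below is a placeholder of our choosing, since the text only speaks of "a preset distance".

```python
# Hedged sketch of the edge-proximity rule: if the nearest side of the
# screen is closer to the touch than a preset distance, output the
# interface along that side; otherwise show nothing. The threshold
# value below is a placeholder, not taken from the patent.

def nearest_side(x: float, y: float, w: float, h: float):
    """Return (side, distance) for the side of a w-by-h screen closest
    to the touch coordinate (x, y), with the origin at the top-left."""
    distances = {"left": x, "right": w - x, "top": y, "bottom": h - y}
    side = min(distances, key=distances.get)
    return side, distances[side]

def side_to_show(x, y, w, h, preset_distance=40):
    """Side on which to output the interface, or None if the touch is
    farther than the preset distance from every side."""
    side, dist = nearest_side(x, y, w, h)
    return side if dist < preset_distance else None
```

A touch at (395, 300) on a 400 by 600 screen would select the right side, while a touch near the middle returns None and the interface stays hidden.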
6B is a diagram showing an example in which the scroll control device outputs the
According to one embodiment of the present invention, the screen motion control device can output the interface to the side of the screen corresponding to a detection area, based on whether the specific coordinate 610 is included in that detection area of the screen. The screen motion control device can set the detection areas by dividing the screen before the interface is displayed.
7A to 7C are views showing an example of a detection area according to an embodiment of the present invention. When the user touches the coordinates in the specific detection area, the screen motion control device can output the interface to the side of the screen corresponding to the specific detection area.
Referring to FIG. 7A, the screen motion controller can set five
The scroll control device may not output the interface if the selected detection area is not related to the side of the screen. Referring to FIG. 7A, it can be seen that the
Referring to FIG. 7B, the screen motion control device can set four
Referring to FIG. 7B, when the user selects the
Referring to FIG. 7C, the scroll control device can set four
The scroll control device may not display the set detection area on the screen. However, according to another embodiment, the screen motion control device can display the set detection area on the screen for the convenience of the user.
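For the four-area variant, in which the two diagonals of the screen partition it into four triangles each tied to one side (also recited in the claims), a geometric classification could be sketched as follows. The coordinate conventions (origin at top-left, y increasing downward) are our assumption.

```python
# Sketch of diagonal-based detection areas: the two diagonals of a
# w-by-h screen split it into four triangles, one per side. Screen
# coordinates are assumed to have the origin at the top-left with y
# increasing downward.

def detection_side(x: float, y: float, w: float, h: float) -> str:
    """Classify (x, y) as belonging to the top, bottom, left, or right
    detection area formed by the two diagonals."""
    u, v = x / w, y / h  # normalize so the diagonals become v=u and v=1-u
    above_main = v < u       # upper-right side of the main diagonal
    above_anti = v < 1 - u   # upper-left side of the anti-diagonal
    if above_main and above_anti:
        return "top"
    if not above_main and not above_anti:
        return "bottom"
    return "left" if above_anti else "right"
```

A touch in the triangle nearest the top edge would then cause the interface to be output along the top side, matching the four-area rule described for FIG. 7B.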
The apparatus described above may be implemented as hardware components, software components, and/or a combination of hardware and software components. For example, the apparatus and components described in the embodiments may be implemented using one or more general-purpose or special-purpose computers, such as a processor, a controller, an arithmetic logic unit (ALU), a digital signal processor, a microcomputer, a field programmable gate array (FPGA), a programmable logic unit (PLU), a microprocessor, or any other device capable of executing and responding to instructions. The processing device may run an operating system (OS) and one or more software applications on that operating system. The processing device may also access, store, manipulate, process, and generate data in response to the execution of the software. Although the processing device may be described as being used singly for ease of understanding, those skilled in the art will recognize that it may include a plurality of processing elements and/or a plurality of types of processing elements. For example, the processing device may include a plurality of processors, or one processor and one controller. Other processing configurations, such as parallel processors, are also possible.
The software may include a computer program, code, instructions, or a combination of one or more of these, and may configure the processing device to operate as desired or may instruct the processing device independently or collectively. The software and/or data may be embodied permanently or temporarily in any type of machine, component, physical device, virtual equipment, computer storage medium or device, or in a transmitted signal wave, so as to be interpreted by the processing device or to provide instructions or data to it. The software may be distributed over networked computer systems and stored or executed in a distributed manner. The software and data may be stored on one or more computer-readable recording media.
The method according to an embodiment may be implemented in the form of program instructions that can be executed by various computer means and recorded in a computer-readable medium. The computer-readable medium may include program instructions, data files, data structures, and the like, alone or in combination. The program instructions recorded on the medium may be those specially designed and constructed for the embodiments, or they may be known and available to those skilled in the art of computer software. Examples of computer-readable media include magnetic media such as hard disks, floppy disks, and magnetic tape; optical media such as CD-ROMs and DVDs; magneto-optical media such as floptical disks; and hardware devices specially configured to store and execute program instructions, such as ROM, RAM, and flash memory. Examples of program instructions include not only machine language code, such as that produced by a compiler, but also high-level language code that can be executed by a computer using an interpreter or the like. The hardware devices described above may be configured to operate as one or more software modules to perform the operations of the embodiments, and vice versa.
While the present invention has been particularly shown and described with reference to exemplary embodiments thereof, the invention is not limited to the disclosed embodiments. For example, appropriate results may be achieved even if the described techniques are performed in a different order than described, and/or if components of the described systems, structures, devices, circuits, and the like are combined or arranged in a different manner, or are replaced or supplemented by other components or their equivalents.
Therefore, other implementations, other embodiments, and equivalents to the claims are also within the scope of the following claims.
101: Device
102: Screen
103:
104: Processor
Claims (11)
A screen movement control apparatus comprising:
a sensor unit for detecting a contact time of a touch input; and
a processor for outputting to the screen an interface including at least one reference direction indicating a direction in which to move the screen,
wherein, when a reference direction is selected by the touch input, the processor classifies which of three time periods the contact time falls into and moves the screen along the selected reference direction by a size corresponding to the classified time period.
The three time periods comprise:
a first time interval between 0 and 0.2 seconds;
a second time interval between 0.2 seconds and 0.5 seconds; and
a third time interval of more than 0.5 seconds.
The processor:
when the contact time falls within the first time interval, moves the screen along the selected reference direction by a size of 1/3 to 1/2 times the width of the screen, based on the width of the screen corresponding to the selected reference direction; and
when the contact time falls within the second time interval, moves the screen along the selected reference direction by a size of 1/2 to 1 times the width of the screen, based on the width of the screen corresponding to the selected reference direction.
The processor:
when the contact time falls within the third time interval, moves the screen along the selected reference direction by a size of 1 to 2 times the width of the screen, based on the width of the screen corresponding to the selected reference direction.
The processor:
when the distance between the coordinates of the touch input and the side of the screen closest to those coordinates is smaller than a preset distance,
outputs the interface along the side of the screen closest to the coordinates of the touch input.
The processor:
divides the screen into four detection areas based on two diagonal lines connecting opposing vertices of the screen, and outputs the interface along the side of the screen corresponding to the detection area that includes the coordinates of the touch input.
The processor:
displays on each side of the screen, as the reference directions, the three directions among the top, bottom, left, and right directions of the screen excluding the direction toward the center of the screen with respect to that side.
Detecting a contact time of the touch input when a reference direction is selected through a touch input;
Classifying which of three time periods the contact time falls into; and
Moving the screen along the selected reference direction by a size corresponding to the classified time period.
The size corresponding to the classified time period
is determined based on the width of the screen corresponding to the selected reference direction.
Wherein the outputting comprises:
outputting the interface along at least one side of the screen, based on the coordinates of the touch input, when the touch input occurs.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020160013025A KR101835952B1 (en) | 2016-02-02 | 2016-02-02 | Apparatus and method for controlling scroll of screen |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020160013025A KR101835952B1 (en) | 2016-02-02 | 2016-02-02 | Apparatus and method for controlling scroll of screen |
Publications (2)
Publication Number | Publication Date |
---|---|
KR20170092020A true KR20170092020A (en) | 2017-08-10 |
KR101835952B1 KR101835952B1 (en) | 2018-03-08 |
Family
ID=59652305
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
KR1020160013025A KR101835952B1 (en) | 2016-02-02 | 2016-02-02 | Apparatus and method for controlling scroll of screen |
Country Status (1)
Country | Link |
---|---|
KR (1) | KR101835952B1 (en) |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR101182076B1 (en) * | 2011-03-31 | 2012-09-11 | 선문대학교 산학협력단 | Apparatus and method of screen scrolling for portable terminals with touch screen |
-
2016
- 2016-02-02 KR KR1020160013025A patent/KR101835952B1/en active IP Right Grant
Also Published As
Publication number | Publication date |
---|---|
KR101835952B1 (en) | 2018-03-08 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9910527B2 (en) | Interpretation of pressure based gesture | |
US10042546B2 (en) | Systems and methods to present multiple frames on a touch screen | |
US9880655B2 (en) | Method of disambiguating water from a finger touch on a touch sensor panel | |
JP6333568B2 (en) | Proximity motion recognition device using sensor and method using the device | |
US9678606B2 (en) | Method and device for determining a touch gesture | |
US20140189579A1 (en) | System and method for controlling zooming and/or scrolling | |
US10459614B2 (en) | System and method for controlling object motion based on touch | |
JP6109218B2 (en) | Method of adjusting moving direction of display object and terminal | |
US20140237408A1 (en) | Interpretation of pressure based gesture | |
KR20100108116A (en) | Apparatus and method for recognizing touch gesture | |
JP2016157491A (en) | Method for determining kind of touch, and touch input device for executing the same | |
US20160231860A1 (en) | Thermal baseline relaxation | |
US20120249599A1 (en) | Method of identifying a multi-touch scaling gesture and device using the same | |
US20120249440A1 (en) | method of identifying a multi-touch rotation gesture and device using the same | |
US9280284B2 (en) | Method, apparatus and computer readable medium for polygon gesture detection and interaction | |
JP2017506399A (en) | System and method for improved touch screen accuracy | |
US20150234472A1 (en) | User input processing method and apparatus using vision sensor | |
US11199963B2 (en) | Non-contact operation input device | |
JP2020170311A (en) | Input device | |
US9811218B2 (en) | Location based object classification | |
CN110413183B (en) | Method and equipment for presenting page | |
US20140176448A1 (en) | Detecting a gesture | |
KR20160019449A (en) | Disambiguation of indirect input | |
CN107272971B (en) | Grip management | |
KR101393733B1 (en) | Touch screen control method using bezel area |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
A201 | Request for examination | ||
E902 | Notification of reason for refusal | ||
E701 | Decision to grant or registration of patent right |