WO2013141464A1 - Touch-based input control method - Google Patents
Touch-based input control method
- Publication number
- WO2013141464A1 (PCT/KR2012/010738)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- touch
- point
- user
- input mode
- movement
- Prior art date
Links
Images
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0354—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser for inputting data by handwriting, e.g. gesture or text
- G06F3/04886—Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
- G06F3/14—Digital output to display device; Cooperation and interconnection of the display device with other functional units
- G06F40/00—Handling natural language data
- G06F40/10—Text processing
- G06F40/166—Editing, e.g. inserting or deleting
Definitions
- the present invention relates to a technology for controlling user input on a touch basis in a user terminal such as a smart phone or a smart pad. More particularly, the present invention relates to a touch-based input control technology configured to appropriately distinguish and control an edit cursor operation and a control pointer movement operation by interpreting a user gesture provided on a touch basis in a user terminal.
- A small mobile device provides functions such as writing memos and schedules through a text input function, composing text messages, and searching information on the web through the Internet.
- When such a device provides physical buttons for text entry, the buttons can only be made very small, which causes usability problems.
- There is therefore a trend of displaying a virtual keyboard on the wide touch screen of a smartphone (for example, an iPhone) or a smart pad (for example, an iPad) and performing text input on it.
- touch-based information input technology is expected to expand further.
- An information input means such as a touch screen or a trackpad is called a touch device.
- A touch-based mobile device is generally not provided with separate mechanical buttons.
- Instead, soft buttons for controlling various functions and user input are displayed on a touch screen, and user commands are realized through touch input, or various information input is performed through trackpad operation.
- Multi-touch touch screens are being adopted in mobile devices.
- Multi-touch has the advantage that the user can control the mobile device more conveniently by using multiple fingers at the same time.
- touch-based information input technology is continuously developed.
- What was needed was a way for the user to easily and quickly control the position of the editing cursor and the position of the control pointer according to the situation, without the hassle of changing and setting the input mode each time to match the intention.
- Portable information input device Korean Patent Application No. 10-2010-0025169
- An object of the present invention is to provide a technology for controlling user input on a touch basis in a user terminal such as a smart phone or a smart pad.
- an object of the present invention is to provide a touch-based input control technology configured to properly distinguish and control the edit cursor operation and the control pointer movement operation by interpreting a user gesture provided on a touch basis in the user terminal.
- A touch-based input control method for achieving the above object comprises: a first step of implementing a virtual keyboard on the touch device; a second step of identifying a user touch on the touch device on the screen on which the virtual keyboard is displayed; a third step of identifying movement of the user touch; a fourth step of processing a keyboard stroke for the character corresponding to the touch point of the virtual keyboard when the touch is released without a threshold-exceeding event having occurred for the user touch; a fifth step of recognizing the current input mode when a threshold-exceeding event occurs for the user touch; and a sixth step of moving the editing cursor to correspond to the movement direction of the user touch when the input mode is the keyboard input mode.
- When the input mode is the focusing control mode, a seventh step of moving the control pointer to correspond to the moving direction and distance of the user touch may further be included.
- A touch-based input control method according to another aspect comprises: a first step of implementing a virtual keyboard on the touch device; a second step of identifying a multi-touch on the virtual keyboard of the touch device; a third step of identifying movement of the multi-touch; and a fourth step of determining whether the multi-touch is released while no threshold-exceeding event has occurred for the multi-touch.
- In a fifth step, when the touch of the second point is released and the touch of the first point moves, the input mode is set to the keyboard input mode and the editing cursor is moved to correspond to the touch movement direction of the first point.
- In a sixth step, when the touch of the first point is released and the touch of the second point moves, the input mode is set to the focusing control mode and the control pointer is moved to correspond to the touch movement direction and distance of the second point.
- the threshold exceeding event includes at least one of a first event in which the movement distance of the user touch exceeds a preset threshold distance and a second event in which a holding time of the user touch exceeds a preset threshold time.
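The two trigger conditions above can be sketched as a small predicate. This is an illustrative reading rather than the patent's implementation; the concrete pixel and time thresholds below are assumed values, since the text only says they are preset:

```python
import math

# Hypothetical thresholds -- the description only says they are "preset".
THRESHOLD_DISTANCE_PX = 20.0   # first event: movement beyond this distance
THRESHOLD_TIME_S = 0.5         # second event: touch held longer than this

def threshold_exceeded(start_xy, current_xy, touch_down_t, now_t,
                       dist_th=THRESHOLD_DISTANCE_PX, time_th=THRESHOLD_TIME_S):
    """Return True if either threshold-exceeding event has occurred."""
    dx = current_xy[0] - start_xy[0]
    dy = current_xy[1] - start_xy[1]
    moved_too_far = math.hypot(dx, dy) > dist_th      # first event (distance)
    held_too_long = (now_t - touch_down_t) > time_th  # second event (time)
    return moved_too_far or held_too_long
```

Either event alone suffices, matching the "at least one of" wording above.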
- According to the present invention, the character input operation, the edit cursor movement operation, and the control pointer movement operation can be performed easily and quickly according to the situation, without the hassle of the prior art in which the input mode must be changed and set manually to match the input mode intended by the user.
- FIG. 1 illustrates a configuration of a user terminal suitable for implementing the present invention.
- FIG. 2 illustrates a virtual keyboard implemented on a touch screen.
- FIG. 3 illustrates inputting characters through a virtual keyboard.
- FIG. 5 shows editing-cursor movement in the keyboard input mode.
- FIG. 6 illustrates mouse-pointer movement in a focusing control mode.
- FIG. 7 illustrates implementation of left click and right click via multi-touch in a focusing control mode.
- FIG. 8 illustrates block setting through multi-touch.
- FIG. 9 illustrates an editing function performed through multi-touch.
- FIG. 10 is a flowchart illustrating a single touch-based input control method according to the present invention.
- FIG. 11 is a flowchart illustrating a multi-touch based input control method according to the present invention.
- FIG. 1 is a view showing the configuration of a user terminal 10 suitable for implementing the touch-based input control method according to the present invention.
- FIGS. 2 to 9 are diagrams illustrating UI screens implemented on the touch screen 11 of FIG. 1 in a user terminal to which the touch-based input control method according to the present invention is applied.
- the user terminal 10 includes a touch screen 11, a controller 13, and a storage 14.
- On the touch screen 11, a virtual keyboard 12 is implemented.
- The touch screen 11 is provided as an example of the touch device in the present invention. A touch screen generally means a device in which the touch input means and the display means are integrally combined, but the present invention is not limited thereto and also covers devices having only the touch input means.
- The virtual keyboard 12 generally refers to a keyboard graphically displayed on the touch screen 11 as a display screen, with text input performed on the touch screen. In the present invention, however, it is a broad concept that also includes a physical-interface (PI) keyboard, for example a keyboard layout printed on a sticker and pasted onto a touch device having only touch input means.
- the virtual keyboard 12 is a keyboard formed on the touch screen 11 and may be formed in, for example, a qwerty form as shown in FIG. 2.
- a text sentence is created in the text input area 11a.
- the text input area 11a and the virtual keyboard 12 are generally implemented integrally on the touch screen 11 of the user terminal 10.
- Implementations in which the text input area 11a and the virtual keyboard 12 are realized as separate hardware and operate in conjunction through network means (e.g., Bluetooth) are not excluded from the invention.
- The virtual keyboard 12 performs the text input function by touch, and at the same time the control unit 13 interprets the touch gesture operation to determine the input mode.
- the controller 13 includes a touch sensing module 13a, a focus module 13b, and a keyboard input module 13c.
- The operation of the control unit 13 will be described in detail below, divided into the case in which the user operates through a single touch on the virtual keyboard 12 (first embodiment) and the case in which the user operates through multi-touch on the virtual keyboard 12 (second embodiment).
- The storage unit 14 is a space for storing control program code and various data of the user terminal 10 and may be configured with RAM, ROM, flash memory, a hard disk, a memory card, a web disk, cloud storage, and the like.
- the touch sensing module 13a implements the virtual keyboard 12 on the touch screen 11 in response to a user's manipulation. Then, the touch sensing module 13a waits for a touch input for a point on the screen on which the virtual keyboard 12 is displayed to identify the occurrence.
- When a touch occurs, the touch sensing module 13a identifies the touch coordinates on the touch screen 11 corresponding to the touch point and, preferably, the character of the virtual keyboard 12 corresponding to the user's touch point, and temporarily stores them in the storage unit 14.
- the touch sensing module 13a determines whether the movement degree of the touch point exceeds a preset threshold distance (acceptable range).
- If the touch is released without the threshold being exceeded, the keyboard input module 13c controls the touch screen 11 to process a keyboard stroke for the character corresponding to the touch point on the virtual keyboard 12.
- the touch sensing module 13a determines whether the current input mode is the keyboard input mode or the focusing control mode.
- the control unit 13 generally interprets and determines the operating mode of the user terminal 10.
- When the input mode is the keyboard input mode, the keyboard input module 13c controls the touch screen 11 to move the edit cursor over the text, as shown in FIG. 5.
- the input mode is preferably automatically set to the keyboard input mode in response to the movement of the editing cursor. Then, after determining whether or not to release the touch, when the touch is released, the virtual keyboard 12 is controlled to stop the movement of the editing cursor and to perform text editing at the current position.
- When the input mode is the focusing control mode, the focus module 13b moves the control pointer to correspond to the moving direction and distance of the user touch, as shown in FIG. 6.
- the input mode may be automatically set to the focusing control mode in response to the movement of the control pointer.
- The control pointer may be implemented in the form of a mouse pointer or in a form that does not appear on the display screen.
- The actual position of the control pointer may coincide with the touch point, or it may be at a different position that follows only the moving direction and distance of the touch.
- the movement distance of the control pointer may be set to correspond to the movement distance exceeding the threshold distance from the touch point of the user. Then, after determining whether to release the touch, the touch screen 11 is controlled to control focusing at the corresponding position when the touch is released.
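One plausible way to implement this mapping, counting only the movement beyond the threshold distance toward pointer displacement, is sketched below. This is a reading of the description, not the patent's code; the threshold value and function name are assumptions:

```python
import math

def pointer_delta(start_xy, current_xy, threshold_px=20.0):
    """Map a touch drag to a control-pointer displacement, counting only
    the portion of the movement that exceeds the threshold distance."""
    dx = current_xy[0] - start_xy[0]
    dy = current_xy[1] - start_xy[1]
    dist = math.hypot(dx, dy)
    if dist <= threshold_px:
        return (0.0, 0.0)  # still within the allowable range: no pointer motion
    # Scale the drag vector so only the excess beyond the threshold counts.
    scale = (dist - threshold_px) / dist
    return (dx * scale, dy * scale)
```

With this scheme the pointer does not jump when the threshold is first crossed, since the displacement starts from zero at the threshold boundary.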
- Instead of (or in addition to) the threshold distance, a method of determining whether the holding time of the user touch exceeds a preset threshold time may be implemented.
- A situation in which the movement of the touch point exceeds the threshold distance (allowable range), or in which the user's touch holding time exceeds the threshold time (allowable time), is called a threshold-exceeding event.
- the touch sensing module 13a implements the virtual keyboard 12 on the touch screen 11 in response to a user's manipulation. Then, the touch sensing module 13a waits for multi-touch on the virtual keyboard 12 to identify the occurrence.
- The touch sensing module 13a temporarily stores the coordinates of the two points on the touch screen 11 corresponding to the multi-touch in the storage unit 14.
- The touch sensing module 13a identifies whether the user's touch operation moves from the first touch point for each of the multi-touches, and then determines whether the movement of the touch point exceeds the threshold distance (allowable range).
- If the movement of the multi-touch exceeds the threshold distance, the touch screen 11 is controlled to perform scrolling or page up/down.
- the touch sensing module 13a next determines whether a touch release event occurs for all of the multi-touch points.
- the touch sensing module 13a waits for a re-touch of the point where the multi-touches have been made and identifies the touch if the re-touch is input.
- If a re-touch of the left point is identified, the keyboard input module 13c controls the touch screen 11 to move the edit cursor to correspond to the moving direction of the re-touch. It is preferable that the input mode is automatically set to the keyboard input mode in response to the movement of the editing cursor.
- the focus module 13b moves the control pointer so as to correspond to the retouch movement direction and the movement distance of the right point.
- the input mode may be automatically set to the focusing control mode in response to the movement of the control pointer.
- the touch screen 11 is controlled to move the edit cursor or the control pointer according to each case.
- If the right-point touch is released while the left-point touch moves, the keyboard input module 13c controls the touch screen 11 to move the editing cursor to correspond to the touch movement direction of the left point, as shown in FIG. 5. It is preferable that the input mode is automatically set to the keyboard input mode in response to the movement of the editing cursor.
- Conversely, if the left-point touch is released while the right-point touch moves, the focus module 13b controls the touch screen 11 to move the control pointer to correspond to the touch movement direction and distance of the right point, as shown in FIG. 6.
- In this case the input mode is preferably set to the focusing control mode automatically.
- a text block may be selected or an editing function may be performed using multi-touch.
- After the keyboard input mode has been set on the text input area 11a implemented in the touch screen 11, the keyboard input module 13c may receive a text block setting from the user through a multi-touch operation.
- When the user performs a second touch while keeping one point in contact (hereinafter referred to as a 'sequential multi-touch') and then continuously moves (drags) these multi-touch points to the left or right, the keyboard input module 13c may set a block in the text sentence.
- In FIG. 8, the text block "morning" is selected by the user's multi-touch and rightward drag operation.
- The focus module 13b may likewise receive a text block setting from the user after the focusing control mode has been set on the text input area 11a implemented in the touch screen 11.
- When a sequential multi-touch is then performed, an edit-function popup window (copy/paste/cut) is generated, and one of the edit functions can be selected by continuously moving these multi-touch points.
- In FIG. 9, the focus module 13b selects and performs the editing function "cut" on the text block through such a manipulation.
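The block-selection and edit-function gestures described above might be modeled roughly as follows. All names, the character-based drag unit, and the copy/paste/cut ordering are illustrative assumptions, not details given by the patent:

```python
def select_block(text, cursor, drag):
    """Extend a text block from the edit-cursor position by `drag`
    characters to the right (positive drag) or left (negative drag)."""
    start, end = (cursor + drag, cursor) if drag < 0 else (cursor, cursor + drag)
    start, end = max(0, start), min(len(text), end)
    return text[start:end]

def pick_edit_function(drag_steps):
    """Choose among the popup's edit functions by continued horizontal
    movement of the multi-touch points (hypothetical ordering)."""
    functions = ["copy", "paste", "cut"]
    return functions[max(0, min(drag_steps, len(functions) - 1))]
```

For instance, with the cursor before "morning" in "good morning", a rightward drag of seven characters would select the block "morning", matching the FIG. 8 example.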
- the controller 13 implements the virtual keyboard 12 on the touch screen 11 at the user's request (S1).
- a touch device including a trackpad may be used to implement the technology of the present invention.
- the controller 13 identifies the single touch input provided on the screen on which the virtual keyboard 12 is displayed (S2).
- Following step S2, the controller 13 temporarily stores the touch coordinates on the touch screen 11 corresponding to the single-touch point and the corresponding character of the virtual keyboard 12 in the storage unit 14 (S3).
- The control unit 13 determines whether the movement distance of the user touch from the first touch point of the single touch exceeds a preset threshold distance (allowable range) (S4).
- If the touch is released within the threshold distance, the controller 13 controls the touch screen 11 to perform keyboard-stroke processing on the character corresponding to the touch point on the virtual keyboard 12 (S10).
- If the threshold is exceeded in step S4, the controller 13 recognizes the current input mode (S5).
- If the input mode is the focusing control mode, the control unit 13 moves the control pointer to correspond to the moving direction and distance of the single touch provided by the user (S7).
- The control pointer may be implemented in the form of a mouse pointer or in a form that does not appear on the display screen.
- The actual position of the control pointer may coincide with the touch point, or it may be at a different position that follows only the moving direction and distance of the touch.
- The movement distance of the control pointer may be set to correspond to the movement distance exceeding the threshold distance from the user's touch point. After determining whether the touch is released, the touch screen 11 is controlled to perform focusing control at the corresponding position when the touch is released.
- If the input mode is the keyboard input mode, the control unit 13 controls the touch screen 11 to move the edit cursor over the text as shown in FIG. 5. After determining whether the touch is released, when the touch is released, the virtual keyboard 12 is controlled to stop the movement of the editing cursor and to perform text editing at the current position.
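The single-touch flow of FIG. 10 (steps S1 to S10) can be condensed into a dispatch sketch. This is a simplification over already-classified inputs, not the controller's event-driven implementation; the mode names and return conventions are assumptions:

```python
def handle_single_touch(released, threshold_event, input_mode, key_char, move_dir):
    """Dispatch a single-touch gesture following the S1-S10 flow:
    release without a threshold-exceeding event -> keystroke (S10);
    threshold-exceeding event -> branch on the current input mode (S5)."""
    if not threshold_event:
        if released:
            return ("keystroke", key_char)          # S10: plain key press
        return ("wait", None)                       # keep tracking the touch
    if input_mode == "keyboard":
        return ("move_edit_cursor", move_dir)       # keyboard input mode branch
    if input_mode == "focusing":
        return ("move_control_pointer", move_dir)   # focusing control mode (S7)
    raise ValueError("unknown input mode: %r" % input_mode)
```

The key property is that an ordinary tap never consults the input mode at all; the mode only matters once a threshold-exceeding event has occurred.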
- FIG. 11 is a diagram illustrating a multi-touch based input control method according to a second embodiment of the present invention.
- the controller 13 implements the virtual keyboard 12 on the touch screen 11 at the user's request (S21).
- the controller 13 identifies the multi-touch input provided on the screen on which the virtual keyboard 12 is displayed (S22).
- The controller 13 determines whether the movement distance of the user touch from the first touch point of the multi-touch exceeds a preset threshold distance (allowable range) (S24).
- If it does, the control unit 13 controls the touch screen 11 to perform scrolling or page up/down corresponding to the up/down/left/right direction of the user's touch movement (S25).
- If the threshold is not exceeded in step S24, the controller 13 determines whether a release event for all of the multi-touch points occurs (S27).
- Otherwise, the controller 13 determines whether there is a right-touch release event (S32). If the right-touch release is identified and a touch movement is detected for the left point, the control unit 13 controls the touch screen 11 to move the editing cursor to correspond to the touch movement direction of the left point, as shown in FIG. 5 (S33). It is preferable that the input mode is automatically set to the keyboard input mode in response to the movement of the editing cursor.
- If instead the left-touch release is identified and a touch movement is detected for the right point, the control unit 13 moves the control pointer to correspond to the touch movement direction and distance of the right point. The input mode is preferably set to the focusing control mode automatically.
- When a touch release event for all the multi-touch points occurs as a result of the determination in step S27, the control unit 13 waits for a re-touch of the points where the multi-touch occurred and identifies the re-touch when it is input (S28).
- If a re-touch of the left point is identified, the control unit 13 controls the touch screen 11 to move the edit cursor to correspond to the moving direction of the re-touch (S30). It is preferable that the input mode is automatically set to the keyboard input mode in response to the movement of the editing cursor.
- If a re-touch of the right point is identified, the control unit 13 controls the touch screen 11 to move the control pointer to correspond to the moving direction and distance of the re-touch (S31). The input mode may be automatically set to the focusing control mode in response to the movement of the control pointer.
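The multi-touch branches of FIG. 11 can likewise be condensed into a small decision function. Treating the first touch point as "left" and the second as "right" follows the description above; the function and value names are assumptions for illustration:

```python
def handle_multi_touch(left_released, right_released, left_moves, right_moves):
    """Decide the input mode and action from a two-point touch, following
    the FIG. 11 flow: release of one point plus movement of the other
    selects the mode; release of both waits for a re-touch (S27 -> S28)."""
    if left_released and right_released:
        return ("await_retouch", None)              # both released: S28 branch
    if right_released and left_moves:
        return ("keyboard", "move_edit_cursor")     # left point drives the cursor
    if left_released and right_moves:
        return ("focusing", "move_control_pointer") # right point drives the pointer
    return ("undecided", None)                      # keep observing the touches
```

This makes the symmetry of the two embodiments explicit: which point survives (or is re-touched) selects the mode, and its subsequent movement drives either the edit cursor or the control pointer.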
- FIG. 12 is a diagram illustrating a concept of moving icon focusing in a main menu of a user terminal through a touch input according to the present invention.
- The control pointer in the focusing control mode may be implemented in various ways: it may take the form of a mouse pointer, or it may not be displayed on the display screen at all.
- FIG. 12 is an example in which the control pointer is not displayed on the screen.
- As needed, a character may be input, or an icon on another application execution screen may be selected.
- Although the input control technique using touch has been described in relation to switching between the keyboard input mode and the focusing control mode, the input of characters, the movement of an editing cursor, and the movement of the control pointer over icons, the same technique applies equally to other screens and applications.
- the invention can also be embodied in the form of computer readable codes on a computer readable recording medium.
- the computer-readable recording medium includes all kinds of recording devices in which data that can be read by a computer system is stored.
- Examples of computer-readable recording media include ROM, RAM, CD-ROM, magnetic tape, floppy disks, and optical data storage; the medium may also be implemented in the form of a carrier wave (e.g., transmission over the Internet).
- The computer-readable recording medium can also be distributed over networked computer systems so that the computer-readable code is stored and executed in a distributed manner. Functional programs, code, and code segments for implementing the present invention can be easily inferred by programmers in the technical field to which the present invention belongs.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- Health & Medical Sciences (AREA)
- Artificial Intelligence (AREA)
- Audiology, Speech & Language Pathology (AREA)
- Computational Linguistics (AREA)
- General Health & Medical Sciences (AREA)
- User Interface Of Digital Computer (AREA)
- Position Input By Displaying (AREA)
Abstract
Description
Claims (14)
- 터치디바이스에 가상키보드를 구현하는 제 1 단계;상기 가상키보드가 표시된 화면 상에서 상기 터치디바이스에 대한 사용자 터치를 식별하는 제 2 단계;상기 사용자 터치의 이동을 식별하는 제 3 단계;상기 사용자 터치에 대해 미리 설정된 임계초과 이벤트가 발생하지 않은 상태에서 상기 터치가 릴리스된 경우 상기 가상키보드의 터치 지점에 해당하는 문자에 대해 키보드 스트로크 처리하는 제 4 단계;상기 사용자 터치에 대해 임계초과 이벤트가 발생한 경우 현재의 입력모드를 인식하는 제 5 단계;상기 입력모드가 키보드 입력모드인 경우, 상기 사용자 터치의 이동 방향에 대응하도록 편집 커서를 이동시켜 구현하는 제 6 단계;를 포함하여 구성되는 터치 기반의 입력제어 방법.
- 청구항 1에 있어서,상기 임계초과 이벤트는 사용자 터치의 이동 거리가 미리 설정된 임계거리를 초과하는 제 1 이벤트와 사용자 터치의 유지 시간이 미리 설정된 임계시간을 초과하는 제 2 이벤트의 적어도 하나 이상을 포함하여 구성되는 것을 특징으로 하는 터치 기반의 입력제어 방법.
- 청구항 2에 있어서,상기 입력모드가 포커싱 제어모드인 경우, 상기 사용자 터치의 이동 방향 및 거리에 대응하도록 제어 포인터를 이동시켜 구현하는 제 7 단계;를 더 포함하여 구성되는 터치 기반의 입력제어 방법.
- 청구항 3에 있어서,상기 제 7 단계에서 상기 제어 포인터의 이동거리는 사용자 터치의 최초 터치 지점으로부터 상기 임계거리를 초과한 이동 거리에 대응하도록 설정되는 것을 특징으로 하는 터치 기반의 입력제어 방법.
- The method of claim 4, further comprising an eighth step of, when the input mode is the focusing control mode, implementing a left mouse click or a right mouse click operation in response to a multi-touch input to the left or to the right, respectively, of the user touch that corresponds to the movement of the control pointer.
- The method of claim 2, further comprising a ninth step of, while the input mode is the keyboard input mode, when a sequential multi-touch is formed on the touch device in a preset first order and the multi-touch points are then moved continuously to the left or to the right, setting a text block from the position corresponding to the edit cursor in accordance with the leftward or rightward direction of movement.
- The method of claim 6, further comprising a tenth step of, when a sequential multi-touch is formed on the touch device in a preset second order and the multi-touch points are then moved continuously to the left or to the right, displaying an edit-function window for the set text block and selecting an edit function in accordance with the leftward or rightward direction of movement.
- A touch-based input control method comprising: a first step of implementing a virtual keyboard on a touch device; a second step of identifying a multi-touch on the virtual keyboard of the touch device; a third step of identifying movement of the multi-touch; a fourth step of determining whether the multi-touch is released while no preset threshold-exceeding event has occurred for the multi-touch; a fifth step of, when the determination shows that the touch at the second point is released and the touch at the first point moves, setting the input mode to a keyboard input mode and moving an edit cursor in accordance with the direction of movement of the touch at the first point; and a sixth step of, when the determination shows that the touch at the first point is released and the touch at the second point moves, setting the input mode to a focusing control mode and moving a control pointer in accordance with the direction and distance of movement of the touch at the second point.
- The method of claim 8, further comprising: a seventh step of, when the determination in the fourth step shows that the multi-touch has been released, waiting for a re-touch at the points corresponding to the multi-touch; an eighth step of, when a re-touch at a preset first point of the multi-touch is identified, setting the input mode to the keyboard input mode and moving the edit cursor in accordance with the direction of movement of the re-touch at the first point; and a ninth step of, when a re-touch at a preset second point of the multi-touch is identified, setting the input mode to the focusing control mode and moving the control pointer in accordance with the direction and distance of movement of the re-touch at the second point.
- The method of claim 9, wherein the threshold-exceeding event comprises at least one of a first event in which the movement distance of a user touch exceeds a preset threshold distance and a second event in which the hold time of a user touch exceeds a preset threshold time.
- The method of claim 10, further comprising a tenth step of, when the movement distance of the multi-touch exceeds the threshold distance, processing the movement as a scroll up/down or page up/down command.
- The method of claim 10, further comprising an eleventh step of, while the input mode is the keyboard input mode, when a sequential multi-touch is formed on the touch device in a preset first order and the multi-touch points are then moved continuously to the left or to the right, setting a text block from the position corresponding to the edit cursor in accordance with the leftward or rightward direction of movement.
- The method of claim 11, further comprising a twelfth step of, when a sequential multi-touch is formed on the touch device in a preset second order and the multi-touch points are then moved continuously to the left or to the right, displaying an edit-function window for the set text block and selecting an edit function in accordance with the leftward or rightward direction of movement.
- A computer-readable recording medium on which a program for performing the touch-based input control method according to any one of claims 1 to 13 is recorded.
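Read as an algorithm, the independent claim above describes a small gesture rule: after a two-point multi-touch on the virtual keyboard, and provided no threshold-exceeding event has occurred, the input mode is selected by which touch point is released while the other keeps moving. The sketch below is only an illustrative reconstruction of that rule, not the patented implementation; the function names, mode labels, and threshold values are assumptions.

```python
from typing import Optional

# Hypothetical threshold values; the claims only require that some preset
# threshold distance and threshold time exist.
THRESHOLD_DISTANCE = 50.0   # assumed threshold distance (px)
THRESHOLD_TIME = 1.0        # assumed threshold time (s)

def threshold_event(move_distance: float, hold_time: float) -> bool:
    """Threshold-exceeding event per the claims: first event = movement
    distance exceeds the threshold distance; second event = hold time
    exceeds the threshold time."""
    return move_distance > THRESHOLD_DISTANCE or hold_time > THRESHOLD_TIME

def select_input_mode(first_released: bool, second_released: bool) -> Optional[str]:
    """Claimed fifth/sixth steps: releasing the second point while the first
    point keeps moving selects the keyboard input mode (edit cursor follows
    the first point's direction); releasing the first point while the second
    point keeps moving selects the focusing control mode (control pointer
    follows the second point's direction and distance)."""
    if second_released and not first_released:
        return "keyboard_input"    # edit cursor driven by first-point movement
    if first_released and not second_released:
        return "focusing_control"  # control pointer driven by second-point movement
    return None                    # both points held or both released: no mode yet
```

A caller would check `threshold_event` first and, only when it returns `False`, feed the per-point release states into `select_input_mode`, matching the order of the claimed fourth through sixth steps.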
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201280071411.9A CN104205033A (zh) | 2012-03-20 | 2012-12-11 | Touch-based input control method |
US14/004,539 US20140145945A1 (en) | 2012-03-20 | 2012-12-11 | Touch-based input control method |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR10-2012-0028219 | 2012-03-20 | ||
KR1020120028219A KR101156610B1 (ko) | 2012-03-20 | 2012-03-20 | Input control method using a touch scheme, and computer-readable recording medium storing an input control program therefor |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2013141464A1 true WO2013141464A1 (ko) | 2013-09-26 |
Family
ID=46607514
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/KR2012/010738 WO2013141464A1 (ko) | 2012-03-20 | 2012-12-11 | Touch-based input control method |
Country Status (4)
Country | Link |
---|---|
US (1) | US20140145945A1 (ko) |
KR (1) | KR101156610B1 (ko) |
CN (1) | CN104205033A (ko) |
WO (1) | WO2013141464A1 (ko) |
Families Citing this family (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR101329584B1 (ko) * | 2012-10-22 | 2013-11-14 | 신근호 | Method for providing editing in accordance with multi-touch-based text block setting, and computer-readable recording medium therefor |
KR102091235B1 (ko) * | 2013-04-10 | 2020-03-18 | 삼성전자주식회사 | Apparatus and method for editing a message in a portable terminal |
KR101516874B1 (ko) * | 2013-08-02 | 2015-05-04 | 주식회사 큐키 | Device including an improved virtual keyboard |
KR101544527B1 (ko) * | 2013-11-29 | 2015-08-13 | 주식회사 데이사이드 | Input control method and system using a touch scheme |
WO2016048279A1 (en) * | 2014-09-23 | 2016-03-31 | Hewlett-Packard Development Company, Lp | Determining location using time difference of arrival |
KR102057279B1 (ko) | 2014-10-02 | 2019-12-18 | 네이버 주식회사 | Device including an improved virtual keyboard |
US9880733B2 (en) * | 2015-02-17 | 2018-01-30 | Yu Albert Wang | Multi-touch remote control method |
JP6162299B1 (ja) * | 2016-07-28 | 2017-07-12 | レノボ・シンガポール・プライベート・リミテッド | Information processing device, input switching method, and program |
JP6822232B2 (ja) * | 2017-03-14 | 2021-01-27 | オムロン株式会社 | Character input device, character input method, and character input program |
WO2018218392A1 (zh) * | 2017-05-27 | 2018-12-06 | 深圳市柔宇科技有限公司 | Touch operation processing method and touch keyboard |
CN108399012A (zh) * | 2018-02-23 | 2018-08-14 | 上海康斐信息技术有限公司 | Keyboard with integrated mouse function |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20080069292A (ko) * | 2007-01-23 | 2008-07-28 | 삼성전자주식회사 | Method for implementing a mouse function in a mobile terminal |
KR20090093250A (ko) * | 2008-02-29 | 2009-09-02 | 황재엽 | Transparent virtual mouse method on a touch-type virtual keyboard |
KR20100033214A (ko) * | 2008-09-19 | 2010-03-29 | 주식회사 텔로드 | Method for automatically switching input modes through input-pattern recognition on a touchpad |
KR101013219B1 (ko) * | 2010-02-11 | 2011-02-14 | 라오넥스(주) | Input control method and system using a touch scheme |
Family Cites Families (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8479122B2 (en) * | 2004-07-30 | 2013-07-02 | Apple Inc. | Gestures for touch sensitive input devices |
US8059101B2 (en) * | 2007-06-22 | 2011-11-15 | Apple Inc. | Swipe gestures for touch screen keyboards |
JP2010102662A (ja) * | 2008-10-27 | 2010-05-06 | Sharp Corp | Display device and portable terminal |
KR101844366B1 (ko) * | 2009-03-27 | 2018-04-02 | 삼성전자 주식회사 | Apparatus and method for recognizing touch gestures |
EP2320312A1 (en) * | 2009-11-10 | 2011-05-11 | Research In Motion Limited | Portable electronic device and method of controlling same |
US20110285651A1 (en) * | 2010-05-24 | 2011-11-24 | Will John Temple | Multidirectional button, key, and keyboard |
-
2012
- 2012-03-20 KR KR1020120028219A patent/KR101156610B1/ko not_active IP Right Cessation
- 2012-12-11 CN CN201280071411.9A patent/CN104205033A/zh active Pending
- 2012-12-11 US US14/004,539 patent/US20140145945A1/en not_active Abandoned
- 2012-12-11 WO PCT/KR2012/010738 patent/WO2013141464A1/ko active Application Filing
Cited By (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2015065146A1 (ko) * | 2013-11-04 | 2015-05-07 | 삼성전자 주식회사 | Electronic apparatus and method for executing application thereof |
GB2534100A (en) * | 2013-11-04 | 2016-07-13 | Samsung Electronics Co Ltd | Electronic apparatus and method for executing application thereof |
GB2534100B (en) * | 2013-11-04 | 2021-07-28 | Samsung Electronics Co Ltd | Electronic apparatus and method for executing application thereof |
US11379116B2 (en) | 2013-11-04 | 2022-07-05 | Samsung Electronics Co., Ltd. | Electronic apparatus and method for executing application thereof |
EP2924553A1 (en) * | 2014-03-18 | 2015-09-30 | BlackBerry Limited | Method and system for controlling movement of cursor in an electronic device |
US9436348B2 (en) | 2014-03-18 | 2016-09-06 | Blackberry Limited | Method and system for controlling movement of cursor in an electronic device |
CN104980572A (zh) * | 2014-04-11 | 2015-10-14 | Lg电子株式会社 | Mobile terminal and method for controlling the same |
US9977539B2 (en) | 2014-04-11 | 2018-05-22 | Lg Electronics Inc. | Mobile terminal and method for controlling the same |
US9977541B2 (en) | 2014-04-11 | 2018-05-22 | Lg Electronics Inc. | Mobile terminal and method for controlling the same |
EP3327560A1 (en) * | 2014-04-11 | 2018-05-30 | LG Electronics Inc. | Mobile terminal and method for controlling the same |
CN106796912A (zh) * | 2014-08-28 | 2017-05-31 | 三星电子株式会社 | Electronic device and method for setting block |
US10725608B2 (en) | 2014-08-28 | 2020-07-28 | Samsung Electronics Co., Ltd | Electronic device and method for setting block |
Also Published As
Publication number | Publication date |
---|---|
CN104205033A (zh) | 2014-12-10 |
US20140145945A1 (en) | 2014-05-29 |
KR101156610B1 (ko) | 2012-06-14 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2013141464A1 (ko) | Touch-based input control method | |
WO2014065499A1 (ko) | Method for providing editing in accordance with multi-touch-based text block setting | |
WO2015105271A1 (en) | Apparatus and method of copying and pasting content in a computing device | |
WO2019128732A1 (zh) | Icon management method and apparatus | |
WO2014084633A1 (en) | Method for displaying applications and electronic device thereof | |
WO2012108723A2 (en) | Information display apparatus having at least two touch screens and information display method thereof | |
WO2015016585A1 (en) | Electronic device and method of recognizing input in electronic device | |
WO2015030390A1 (en) | Electronic device and method for providing content according to field attribute | |
WO2014142471A1 (en) | Multi-input control method and system, and electronic device supporting the same | |
WO2013032234A1 (en) | Method of providing of user interface in portable terminal and apparatus thereof | |
WO2012053801A2 (en) | Method and apparatus for controlling touch screen in mobile terminal responsive to multi-touch inputs | |
WO2014129828A1 (en) | Method for providing a feedback in response to a user input and a terminal implementing the same | |
WO2014107005A1 (en) | Mouse function provision method and terminal implementing the same | |
AU2012214924A1 (en) | Information display apparatus having at least two touch screens and information display method thereof | |
CN103229141A (zh) | Managing workspaces in a user interface | |
WO2017209568A1 (ko) | Electronic device and operating method therefor | |
US20140351725A1 (en) | Method and electronic device for operating object | |
WO2014129787A1 (en) | Electronic device having touch-sensitive user interface and related operating method | |
WO2011090302A2 (ko) | Method of operating a personal portable terminal having a touch panel | |
WO2016085186A1 (en) | Electronic apparatus and method for displaying graphical object thereof | |
WO2012093779A2 (ko) | User terminal supporting a multi-modal interface using user touch and breath, and control method therefor | |
WO2017200323A1 (ko) | Electronic device for storing user data, and method therefor | |
WO2013100727A1 (en) | Display apparatus and image representation method using the same | |
WO2014003448A1 (en) | Terminal device and method of controlling the same | |
WO2012118271A1 (ko) | Content control method and apparatus using touch, recording medium therefor, and user terminal including the same | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 14004539 Country of ref document: US |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 12872120 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
32PN | Ep: public notification in the ep bulletin as address of the addressee cannot be established |
Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC, FORM 1205N DATED 12-02-2015 |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 12872120 Country of ref document: EP Kind code of ref document: A1 |