WO2019151669A1 - Method for providing an on-screen keyboard and computing device for performing the same - Google Patents

Method for providing an on-screen keyboard and computing device for performing the same

Info

Publication number
WO2019151669A1
WO2019151669A1 (PCT/KR2019/000364)
Authority
WO
WIPO (PCT)
Prior art keywords
keyboard
key
mode
assigned
keys
Prior art date
Application number
PCT/KR2019/000364
Other languages
English (en)
Korean (ko)
Inventor
최원호
Original Assignee
최원호
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 최원호
Publication of WO2019151669A1

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/02: Input arrangements using manually operated switches, e.g. using keyboards or dials
    • G06F 3/023: Arrangements for converting discrete items of information into a coded form, e.g. arrangements for interpreting keyboard generated codes as alphanumeric codes, operand codes or instruction codes
    • G06F 3/0233: Character input methods
    • G06F 3/0236: Character input methods using selection techniques to select from displayed items
    • G06F 3/0238: Programmable keyboards
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481: Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/04812: Interaction techniques based on cursor appearance or behaviour, e.g. being affected by the presence of displayed objects
    • G06F 3/0487: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488: Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04883: Interaction techniques using a touch-screen or digitiser for inputting data by handwriting, e.g. gesture or text
    • G06F 3/04886: Interaction techniques using a touch-screen or digitiser, by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus

Definitions

  • the present invention relates to a method for providing an on-screen keyboard and a computing device performing the same.
  • the phablet generally refers to a smartphone having a large screen of 5 inches or more, and is a compound word of a phone and a tablet.
  • Such digital mobile devices have a touch-sensitive display and provide an on-screen keyboard as an interface for user input. The user can enter data using the on-screen keyboard.
  • LCDs (liquid crystal displays)
  • OLEDs (organic light emitting diodes)
  • notebooks with touch-sensitive displays have been released.
  • a physical keyboard for user input is not provided in combination with the main body and is sold separately.
  • laptop keyboards are sold as peripherals.
  • These notebooks provide an on-screen keyboard as an interface for user input.
  • the notebook user may input data by touching an object such as his finger or a stylus pen on the touch-sensitive display without using a keyboard.
  • laptops are manufactured with large screens that are 14 inches or 15 inches in size.
  • these devices mostly provide an on-screen keyboard in the standard QWERTY layout. Because QWERTY fingering requires the use of both hands, it is very difficult for the user to type with only one hand.
  • these devices merely convert the physical keyboard of earlier feature phones (before smartphones were launched) into an on-screen keyboard. Since such keyboards are not optimized for touch-sensitive displays, key input is slow and inconvenient.
  • An object of the present invention is to provide a method for providing an on-screen keyboard that can input keys quickly and conveniently using only one hand and a computing device performing the same.
  • a method for providing an on-screen keyboard comprises: displaying a keyboard having a predetermined layout; detecting a touch by a user on the touch sensing surface in a predetermined area of the keyboard; determining a predetermined key input intended by the user based on the detection result; and performing a process corresponding to the determined predetermined key input. The keyboard having the predetermined layout includes (a) a first region corresponding to a first finger, (b) a second region adjacent to the first region and corresponding to second and third fingers, and (c) a third region spaced apart from the first region, adjacent to the second region, and corresponding to a fourth finger, wherein the first region, the second region, and the third region each include (i) two or more first keys disposed above, having a first size, and used for inputting characters, and (ii) one second key disposed below, having a second size, and used for executing a predetermined function of the keyboard.
  • the touch by the user includes one or more types of tap, swipe up, swipe down, swipe right, swipe left
  • the touch on the first key to which the two or more characters are assigned and the second key to which the two or more functions are assigned are determined by different key inputs according to the type of the touch.
  • four first keys may be provided.
  • the operation mode of the keyboard may include a first mode, which is the default mode, in which characters of a first group are assigned to the plurality of first keys; a second mode in which characters of a second group are assigned; and a third mode in which characters of a third group are assigned. The one or more second keys to which two or more functions are assigned include a second key to which a second-mode activation function and a third-mode activation function are assigned.
  • an operation mode of the keyboard includes a first mode and a second mode
  • the first mode is a mode in which a plurality of characters are assigned to a plurality of first keys
  • the second mode is a mode in which the plurality of first keys are assigned one or more of (i) a function of moving a cursor, (ii) a function of selecting characters, and (iii) a function of editing characters.
  • the second key includes a second key to which a conversion function from the first mode to the second mode is assigned.
  • the plurality of first keys include a plurality of arrow keys for moving the cursor up, down, left, and right.
  • the keyboard having the predetermined layout further includes a fourth area disposed adjacent to the first area or the third area, and the fourth area includes a key for executing a moving function of the keyboard, which moves the keyboard by a predetermined distance in one or more of the up, down, left, and right directions in response to the key input.
  • the keyboard moves in one or more directions of up, down, left, and right in response to the touch gesture.
  • a computing device includes a processor, a display device coupled to the processor, a touch sensing surface coupled to the processor, and a memory configured to store one or more computer programs configured to be executed by the processor, the one or more computer programs including instructions for performing the aforementioned method for providing an on-screen keyboard.
  • since the layout of the keyboard includes (a) a first area corresponding to a first finger, (b) a second area adjacent to the first area and corresponding to second and third fingers, and (c) a third area spaced apart from the first area, adjacent to the second area, and corresponding to a fourth finger, the user can conveniently input keys using only one hand.
  • the first to third areas of the keyboard each include (i) two or more first keys for inputting characters and (ii) one second key for executing a predetermined function of the keyboard. One or more first keys are assigned two or more characters, one or more second keys are assigned two or more functions, and a touch on these first and second keys is determined as different key inputs according to the type of the touch, so the user can input keys quickly by exploiting the characteristics of the touch sensing surface.
  • FIG. 1 is a flowchart schematically illustrating a method for providing a keyboard according to an embodiment of the present invention.
  • FIG. 2 is a schematic diagram illustrating a keyboard provided on a touch-sensitive display.
  • FIG. 3 is a schematic diagram illustrating a plurality of areas of a keyboard divided according to a user's finger.
  • FIG. 4 is a schematic diagram for describing a configuration of a first region, a second region, and a third region of FIG. 3.
  • FIG. 5 is a schematic diagram for describing a configuration of the fourth region of FIG. 3.
  • FIGS. 6A to 6D are layouts of operation modes of an exemplary English keyboard
  • FIGS. 6E to 6H are views schematically showing layouts of operation modes of an exemplary Korean keyboard.
  • FIG. 7 is a schematic diagram illustrating a user input gesture applied to an exemplary keyboard.
  • FIG. 8 is a schematic diagram for explaining movement of a keyboard using key input.
  • FIG. 9 is a schematic diagram for explaining movement of a keyboard by a user input gesture.
  • FIG. 10 is a block diagram schematically illustrating an exemplary computing device for performing a method of providing a keyboard according to an embodiment of the present invention.
  • Spatially relative terms such as "below", "beneath", "lower", "above", and "upper" can be used to easily describe the relationship of one component to other components. Spatially relative terms should be understood as encompassing different orientations of the components in use or operation in addition to the orientations shown in the figures. For example, when a component shown in the drawing is turned over, a component described as "below" or "beneath" another component may be placed "above" the other component. Thus, the exemplary term "below" can encompass both an orientation of above and below. Components may be oriented in other directions as well, so spatially relative terms should be interpreted according to orientation.
  • On-screen keyboard means a keyboard that is provided virtually (non-physically) on the screen.
  • an "on-screen keyboard” may be simply referred to as a “keyboard”, and both may be used interchangeably.
  • the on-screen keyboard is provided by a computing device that includes a touch-sensitive surface and a display device.
  • the touch sensitive surface may be disposed on the surface of the computing device along the housing of the computing device.
  • the touch sensitive surface comprises one or more touch sensors.
  • the touch sensor detects contact or proximity of an object such as a user's finger or a stylus pen.
  • the touch sensitive surface may include a cover or film to protect the touch sensor.
  • the touch sensor may be provided as a capacitive sensor.
  • the touch sensing surface defines a predetermined coordinate system, and specifies a sensing position (a predetermined coordinate or a range of coordinates on the coordinate system) of the object according to the sensing result of the touch sensor.
  • a single touch-sensitive display may be provided in which the touch sensitive surface and the display device are combined.
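  • As a purely illustrative aid (not part of the original disclosure), the following TypeScript sketch shows one way the sensed position on the touch sensing surface could be mapped to a key of the on-screen keyboard; the KeyBounds type and the hitTest function are assumed names.

    // Minimal hit-testing sketch (assumed names and layout conventions).
    // The touch sensor reports a position in the coordinate system of the
    // touch sensing surface; the keyboard looks up which key contains it.
    interface KeyBounds {
      id: string;      // e.g. "firstKey-y-b-n"
      x: number;       // left edge in surface coordinates
      y: number;       // top edge in surface coordinates
      width: number;
      height: number;
    }

    function hitTest(keys: KeyBounds[], touchX: number, touchY: number): KeyBounds | null {
      for (const key of keys) {
        const inside =
          touchX >= key.x && touchX < key.x + key.width &&
          touchY >= key.y && touchY < key.y + key.height;
        if (inside) {
          return key;
        }
      }
      return null; // the touch landed outside the keyboard area
    }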
  • a "character” includes all characters that can be input by typing a keyboard, such as a character language such as Korean, alphabet, and Chinese characters, numbers, symbols, and figures.
  • FIG. 1 is a flowchart schematically illustrating a method for providing a keyboard according to an embodiment of the present invention.
  • displaying a keyboard having a predetermined layout (S110)
  • sensing a touch by a user on a touch sensing surface in a predetermined area of the keyboard (S120)
  • determining a predetermined key input intended by the user based on the detection result (S130)
  • performing a process corresponding to the determined predetermined key input (S140)
  • the touch by the user may be input by a predetermined type of gesture as described below.
  • the computing device may specify a location at which a touch by the user is input.
  • the key input intended by the user is not determined only by the position where the touch by the user is input, but is determined by further considering the type of the touch as described below.
  • the computing device may perform, for example, a character input, a predetermined function of a keyboard, a change in a setting or a state, etc. in response to the determined predetermined key input, but is not limited thereto.
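  • To illustrate steps S120 to S140 above, the sketch below (TypeScript; GestureType, KeyAssignment, and dispatch are assumed names, not terms from the disclosure) shows how the touched key and the type of touch could jointly determine the key input and trigger the corresponding process.

    // Illustrative flow for the detection/determination/processing steps.
    type GestureType = "tap" | "swipeUp" | "swipeDown" | "swipeLeft" | "swipeRight";

    interface ResolvedInput {
      kind: "character" | "function";
      value: string; // e.g. "y" or "ENTER"
    }

    // Each key maps gesture types to the character or function assigned to it.
    type KeyAssignment = Partial<Record<GestureType, ResolvedInput>>;

    function determineKeyInput(
      assignments: Map<string, KeyAssignment>,
      keyId: string,
      gesture: GestureType
    ): ResolvedInput | null {
      // The key input depends on both the touched key and the touch type.
      return assignments.get(keyId)?.[gesture] ?? null;
    }

    function handleTouch(
      assignments: Map<string, KeyAssignment>,
      keyId: string | null,
      gesture: GestureType,
      dispatch: (input: ResolvedInput) => void
    ): void {
      if (keyId === null) return; // touch outside the keyboard
      const input = determineKeyInput(assignments, keyId, gesture);
      if (input !== null) dispatch(input); // perform the corresponding process
    }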
  • FIG. 2 is a schematic diagram illustrating a keyboard provided on a touch-sensitive display.
  • a keyboard is provided on a touch sensitive display of computing device 300.
  • the keyboard may be located at the lower part of the touch-sensitive display.
  • the present invention is not limited thereto, and the user may change the position of the keyboard up, down, left, or right as described below.
  • FIG. 3 is a schematic diagram illustrating a plurality of areas of a keyboard divided according to a user's finger.
  • the keyboard 200 is divided into a plurality of areas corresponding to each finger of one hand of a user.
  • the keyboard 200 includes a first area, a second area, a third area, and a fourth area.
  • the first area is an area that is generally disposed on the left side of the keyboard 200
  • the second area is an area that is disposed adjacent to the right side of the first area
  • the third area is adjacent to the right side of the second area and is generally disposed on the right side of the keyboard 200 as a whole.
  • the first area corresponds to the first finger of the user
  • the second area corresponds to the second finger and the third finger of the user
  • the third area corresponds to the fourth finger of the user.
  • the first area corresponds to the user's index finger
  • the second area corresponds to the user's middle finger and ring finger
  • the third area corresponds to the user's little finger.
  • the first area corresponds to the user's little finger
  • the second area corresponds to the user's ring finger and middle finger
  • the third area corresponds to the user's index finger.
  • saying that a predetermined area corresponds to a predetermined finger of the user means that a fingering in which the corresponding finger is used for touch input to that area is recommended.
  • the fourth area is an area which is disposed outside of the keyboard 200 as a whole.
  • the fourth region may be disposed adjacent to the right side of the third region, or may be disposed adjacent to the left side of the first region.
  • the fourth region may be disposed above or below the first region, the second region, or the third region.
  • the fourth area may receive touch input from any finger the user wishes to use.
  • this is because the fourth area contains keys for setting or changing the state of the keyboard 200, not keys for inputting characters or executing a predetermined function of the keyboard 200; that is, it is not used for typing itself.
  • the guide may be displayed at a position corresponding to the thumb when the user uses the right hand or at a position corresponding to the thumb when the user uses the left hand.
  • a guide may be displayed below the first area or below the third area, but is not limited thereto.
  • FIG. 4 is a schematic diagram for describing a configuration of a first region, a second region, and a third region of FIG. 3.
  • each of the first area, the second area, and the third area of FIG. 3 includes two or more first keys 210 disposed above and one second key 220 disposed below.
  • four first keys 210 may be provided, as shown in FIG. 4, but the number is not limited thereto.
  • the first key 210 has a first size and is a key for inputting a character.
  • the second key 220 has a second size larger than the first size and is a key for executing a predetermined function of the keyboard 200.
  • the second key 220 may be used to execute functions such as tab (TAB), space (SPACE), enter (ENTER; new line or completion of input), delete, shift (SHIFT; one-time case or double-vowel conversion), English upper/lowercase keyboard conversion (A/a), Korean double-vowel keyboard conversion, numeric keyboard conversion (123), arrow keyboard conversion (ARROW), and the like, but is not limited thereto.
  • a plurality of first keys 210 may be provided vertically side by side. Further, in some embodiments, a plurality of first keys 210 may be provided side by side horizontally. Further, in some embodiments, as shown in FIG. 4, a plurality of first keys 210 may be provided side by side vertically and horizontally. Also, in some embodiments, the first key 210 may be greater than or equal in size to the second key 220. In some embodiments, a plurality of second keys 220 may also be provided.
  • two or more characters are assigned to one or more first keys 210 of the plurality of first keys 210 of the first area, the second area, and the third area of FIG. 3.
  • three letters “y”, “b”, and “n” may be assigned to one first key 210.
  • two or more functions are assigned to one or more second keys 220 of the plurality of second keys 220 of the first area, the second area, and the third area of FIG. 3.
  • three functions of “TAB”, “SPACE”, and “ENTER” may be assigned to one second key 220.
  • More than one character or function is assigned to a key because a touch-sensitive display is used: the touch-sensitive display can accept various touch gestures in addition to taps, as described below, so a plurality of different key inputs can be processed for one key according to the type of touch. In addition, the screen area occupied by the keyboard can be reduced.
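  • As a concrete illustration of such multiple assignments, the sketch below encodes the example key carrying "y", "b", and "n" and a second key carrying TAB, SPACE, and ENTER as gesture-indexed tables (TypeScript; the encoding is an assumption, and the gesture-to-function mapping of the second key is not specified in the text).

    type Gesture = "tap" | "swipeUp" | "swipeDown" | "swipeLeft" | "swipeRight";
    type Assigned = { kind: "character" | "function"; value: string };

    // First key carrying "y", "b", "n": "b" (centre) on a tap, "y" (upper) on a
    // swipe up; mapping "n" to a swipe down is an assumption consistent with
    // its lower position on the key.
    const firstKeyYBN: Partial<Record<Gesture, Assigned>> = {
      tap: { kind: "character", value: "b" },
      swipeUp: { kind: "character", value: "y" },
      swipeDown: { kind: "character", value: "n" },
    };

    // Second key carrying TAB, SPACE, and ENTER; which gesture triggers which
    // function is not stated, so this particular mapping is purely assumed.
    const secondKeyTabSpaceEnter: Partial<Record<Gesture, Assigned>> = {
      tap: { kind: "function", value: "SPACE" },
      swipeUp: { kind: "function", value: "TAB" },
      swipeDown: { kind: "function", value: "ENTER" },
    };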
  • FIG. 5 is a schematic diagram for describing a configuration of the fourth region of FIG. 3.
  • the fourth region of FIG. 3 includes one or more third keys 230.
  • three third keys 230 may be provided as shown in FIG. 5, but are not limited thereto.
  • the third key 230 is a key for setting or changing the state of the keyboard 200. Since the third key 230 is not for inputting a character or executing a predetermined function of the keyboard 200, the third key 230 has a size smaller than that of the first key 210 or the second key 220.
  • the third key 230 may include a plurality of keys 231, 232, and 233 for executing a language selection function, a keyboard hiding function, and a keyboard moving function, respectively, but is not limited thereto.
  • when the language selection function is executed, the keyboard switches from a key layout of one language to a key layout of another language.
  • when the keyboard hiding function is executed, at least part of the keyboard is hidden from display. For example, all areas of the keyboard except the third key 232 may not be displayed, or the entire area of the keyboard may not be displayed. While hidden, the character input function of the keyboard is not executed.
  • third key 230 may further include keys for executing other functions not illustrated.
  • FIGS. 6A to 6D are layouts of operation modes of an exemplary English keyboard
  • FIGS. 6E to 6H are views schematically showing layouts of operation modes of an exemplary Korean keyboard.
  • FIG. 6A illustrates a first mode in which a plurality of characters of a first group are assigned to a plurality of first keys 210
  • FIG. 6B illustrates a plurality of characters of a second group assigned to a plurality of first keys 210
  • FIG. 6C illustrates a third mode in which a plurality of characters of a third group are assigned to the plurality of first keys 210.
  • the first mode may be a lowercase mode
  • the second mode may be an uppercase mode
  • the third mode may be a numeric input mode, but is not limited thereto.
  • the first mode is provided as the default mode. In each operation mode, a key layout for a different group of characters is provided.
  • the fourth mode may be an arrow mode.
  • the plurality of first keys 210 are assigned one or more functions of (i) a cursor movement function, (ii) a function for selecting a character, and (iii) a function for editing a character.
  • a cursor movement function is provided using a plurality of arrow keys for up, down, left and right movements, as shown in FIG. 6D.
  • the selection of characters can be made for a character string in the entire range of the document being edited or viewed or in the range in which the cursor is moved.
  • the entire range of documents can be selected by pressing the "Select All" key.
  • tab (TAB), space (SPACE), and enter (ENTER) functions are assigned to the second key 220 of the first area
  • a delete function is assigned to the second key 220 of the second area.
  • the second key 220 of the third area is assigned English upper/lowercase keyboard conversion (A/a), numeric keyboard conversion (123), and arrow keyboard conversion (ARROW) functions.
  • the user may execute the English letter case switch through the input of the second key 220 in the third area so that the keyboard 200 operates in the first mode or the second mode.
  • the user may execute the numeric keypad conversion function through the input of the second key 220 in the third area to allow the keyboard 200 to operate in the third mode.
  • the user may execute the arrow keyboard conversion function of the third area to allow the keyboard 200 to operate in the fourth mode.
  • the keyboard 200 can also operate in the second mode only once (for example, via the shift (SHIFT) function); that is, it returns to the first mode as soon as a single input is completed. If the user wants to input two or more uppercase English letters, the user must input the "A/a" key to execute the English upper/lowercase keyboard conversion function.
  • the function assigned to the second key 220 or the third key 230 does not change.
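  • The mode behaviour described above can be pictured with the following small controller sketch (TypeScript; KeyboardMode and ModeController are assumed names, and the one-shot handling only illustrates the "operates in the second mode once" behaviour).

    // Illustrative mode controller. Mode 1 (lowercase) is the default; mode 2
    // is uppercase, mode 3 numeric, mode 4 the arrow/editing mode.
    enum KeyboardMode {
      Lowercase = 1,
      Uppercase = 2,
      Numeric = 3,
      Arrow = 4,
    }

    class ModeController {
      private mode: KeyboardMode = KeyboardMode.Lowercase; // default (first) mode
      private oneShot = false; // true when uppercase applies to a single input only

      // Shift-style conversion: second mode for exactly one character input.
      shiftOnce(): void {
        this.mode = KeyboardMode.Uppercase;
        this.oneShot = true;
      }

      // "A/a", "123" or "ARROW" functions: switch modes persistently.
      switchTo(mode: KeyboardMode): void {
        this.mode = mode;
        this.oneShot = false;
      }

      current(): KeyboardMode {
        return this.mode;
      }

      // Called after each committed character; reverts a one-shot second mode.
      onCharacterCommitted(): void {
        if (this.oneShot) {
          this.mode = KeyboardMode.Lowercase;
          this.oneShot = false;
        }
      }
    }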
  • FIG. 6E illustrates a first mode in which a plurality of characters of a first group are assigned to a plurality of first keys 210
  • FIG. 6F illustrates a plurality of characters of a second group assigned to a plurality of first keys 210
  • FIG. 6G illustrates a third mode in which a plurality of characters of a third group are assigned to the plurality of first keys 210.
  • the first mode may be a basic vowel (or single vowel) mode
  • the second mode may be an extended vowel (or double vowel) mode
  • the third mode may be a numeric input mode, but is not limited thereto.
  • the first mode is provided as the default mode. In each operation mode, a key layout for a different group of characters is provided.
  • it is also possible to input the double vowel by inputting the single vowel two times in succession without executing the second mode.
  • the fourth mode may be an arrow mode.
  • the plurality of first keys 210 are assigned one or more functions of (i) a cursor movement function, (ii) a function for selecting a character, and (iii) a function for editing a character.
  • the cursor movement function is provided using a plurality of arrow keys for up, down, left and right movement, as shown in FIG. 6H.
  • the "Select All”, “Select”, “Copy”, “Cut”, and “Paste” keys are the “Select All”, “Select”, “Copy”, “Cut", and “Paste” of FIG. 6D, respectively. Performs the same function.
  • tab (TAB), space (SPACE), and enter (ENTER) functions are assigned to the second key 220 of the first area
  • a delete function is assigned to the second key 220 of the second area.
  • the second key 220 in the third area is assigned vowel keyboard conversion, numeric keyboard conversion (123), and arrow keyboard conversion (ARROW) functions.
  • the user may execute the vowel keyboard conversion function through the input of the second key 220 in the third area so that the keyboard 200 may operate in the first mode or the second mode.
  • the user may execute the numeric keypad conversion function through the input of the second key 220 in the third area to allow the keyboard 200 to operate in the third mode.
  • the user may execute the arrow keyboard conversion function of the third area to allow the keyboard 200 to operate in the fourth mode.
  • the keyboard 200 can also operate in the second mode only once (for example, via the shift (SHIFT) function); that is, it returns to the first mode as soon as a single input is completed. If the user wants to input double vowels two or more times, the user must input the "Convert vowel" key to execute the double-vowel keyboard conversion function.
  • the function assigned to the second key 220 or the third key 230 does not change.
  • For some keys, at least some of the assigned characters remain the same despite changes in the keyboard mode.
  • Some keys are assigned a combination of language characters and symbols.
  • the symbols can be input by swiping (considering that symbols are input less frequently than language characters), as will be described later.
  • FIG. 7 is a schematic diagram illustrating a user input gesture applied to an exemplary keyboard.
  • a user input gesture may include, but is not limited to, one or more types of touch gestures such as a tap, a swipe up, a swipe down, a swipe right, and a swipe left.
  • the touch on the first key 210 to which two or more letters are assigned and the second key 220 to which two or more functions are assigned are determined by different key inputs according to the type of the touch.
  • Keys to which a single character or function is assigned can be input with a tap.
  • a key to which two or more characters or functions are assigned may be input by swiping.
  • "b" located at the center is a tap
  • "y" located at the upper side is a swipe up.
  • the arrow keyboard conversion of "ARROW” located on the left side is inputted by a swipe left. can do.
  • a swipe right can also be used in a similar manner to input certain characters or execute certain functions.
  • the user can register a specific character, character string, file, or the like in advance and input it using a swipe right or a swipe left.
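  • One plausible way to classify a touch as a tap or a directional swipe from its start and end coordinates is sketched below (TypeScript; the threshold value and the names are illustrative assumptions).

    // Illustrative tap/swipe classifier (assumed threshold and names).
    type TouchGesture = "tap" | "swipeUp" | "swipeDown" | "swipeLeft" | "swipeRight";

    const SWIPE_THRESHOLD_PX = 20; // assumed minimum travel to count as a swipe

    function classifyGesture(
      startX: number, startY: number,
      endX: number, endY: number
    ): TouchGesture {
      const dx = endX - startX;
      const dy = endY - startY;
      if (Math.abs(dx) < SWIPE_THRESHOLD_PX && Math.abs(dy) < SWIPE_THRESHOLD_PX) {
        return "tap";
      }
      // The dominant axis decides the swipe direction.
      if (Math.abs(dx) >= Math.abs(dy)) {
        return dx > 0 ? "swipeRight" : "swipeLeft";
      }
      return dy > 0 ? "swipeDown" : "swipeUp"; // screen y grows downwards
    }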
  • FIG. 8 is a schematic diagram for explaining movement of a keyboard using key input.
  • a user may input the third key 233 to move the position of the keyboard 200 on the touch-sensitive display.
  • the keyboard 200 may move by a predetermined distance in one or more directions of up, down, left, and right in response to the input of the third key 233.
  • when the third key 233 is input, a moving mode in which the keyboard 200 is movable is executed, and the keyboard 200 can be moved freely on the touch-sensitive display in response to a touch gesture such as touch-and-drag by the user.
  • when the third key 233 is input a second time, a fixed mode in which movement of the keyboard 200 is restricted is executed, so that the position of the keyboard 200 is fixed on the touch-sensitive display.
  • FIG. 9 is a schematic diagram for explaining movement of a keyboard by a user input gesture.
  • in response to a predetermined touch gesture, the keyboard 200 may move freely in one or more directions of up, down, left, and right on the touch-sensitive display.
  • the predetermined touch gesture may be a two-finger swipe, but is not limited thereto.
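  • The movement behaviour of FIGS. 8 and 9 could be realised along the lines of the sketch below (TypeScript; the class, the step size, the clamping, and the decision to route both the drag and the two-finger swipe through one controller are all assumptions).

    // Illustrative keyboard-position controller (assumed design).
    interface Point { x: number; y: number }

    class KeyboardPosition {
      private origin: Point = { x: 0, y: 0 };
      private movable = false; // toggled by the keyboard-moving key

      // One press enters the moving mode, the next press fixes the keyboard.
      toggleMoveMode(): void {
        this.movable = !this.movable;
      }

      // Key-input movement: shift the keyboard by a predetermined step.
      nudge(direction: "up" | "down" | "left" | "right", step = 16): void {
        const dx = direction === "left" ? -step : direction === "right" ? step : 0;
        const dy = direction === "up" ? -step : direction === "down" ? step : 0;
        this.origin = { x: this.origin.x + dx, y: this.origin.y + dy };
      }

      // Drag (in moving mode) or two-finger swipe: move freely, clamped to the screen.
      moveBy(dx: number, dy: number, maxX: number, maxY: number): void {
        if (!this.movable) return;
        this.origin = {
          x: Math.min(Math.max(this.origin.x + dx, 0), maxX),
          y: Math.min(Math.max(this.origin.y + dy, 0), maxY),
        };
      }

      currentOrigin(): Point {
        return this.origin;
      }
    }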
  • FIG. 10 is a block diagram schematically illustrating an exemplary computing device for performing a method of providing a keyboard according to an embodiment of the present invention.
  • the example computing device 300 may be provided as a computing device such as a smartphone, tablet, laptop, desktop, or the like. However, the present invention is not limited thereto, and the computing device 300 may be provided as any computing device, which is not illustrated, capable of processing, storing, and transmitting information and data.
  • an exemplary computing device 300 includes a wireless communication unit 310, an A/V input unit 320, a user input unit 330, a sensing unit 340, an output unit 350, a storage unit 360, an interface unit 370, a control unit 380, and a power supply unit 390.
  • the wireless communication unit 310 wirelessly communicates with an external device or computer system.
  • the wireless communication unit 310 wirelessly communicates using a wireless communication scheme such as mobile communication, WiBro, Bluetooth, Wi-Fi, Zigbee, ultrasound, infrared, RF, and the like.
  • the wireless communication scheme of the computing device 300 is not limited to a particular scheme.
  • the wireless communication unit 310 transmits the information received from the external device to the control unit 380, and transmits the information transmitted from the control unit 380 to the external device.
  • the wireless communication unit 310 may include a mobile communication module 311 and a short range communication module 312.
  • the wireless communication unit 310 also includes a location information module 313 to obtain location information of the computing device 300.
  • Location information of the computing device 300 may be provided from, for example, a GPS positioning system, a WiFi positioning system, a cellular positioning system, or a beacon positioning system, but is not limited to any particular positioning system.
  • the wireless communication unit 310 transmits the location information received from the positioning system to the control unit 380.
  • the A / V input unit 320 is for inputting a video or audio signal and may include a camera module 321 and a microphone module 322.
  • the user input unit 330 receives various information from a user.
  • the user input unit 330 includes input means such as a keyboard, a button, a switch, a touch sensing surface, and a jog wheel.
  • when the touch sensing surface forms a mutual layer structure with the display module 351 described later, a touch-sensitive display may be configured.
  • the sensing unit 340 detects a state of the computing device 300 or a state of a user.
  • the sensing unit 340 may include a touch sensor, a proximity sensor, a pressure sensor, a vibration sensor, a geomagnetic sensor, a gyro sensor, a speed sensor, an acceleration sensor, a gravity sensor, a temperature sensor, an optical sensor, a humidity sensor, and a biometric sensor. In some embodiments, the sensing unit 340 is used for user input.
  • the output unit 350 notifies the user of various kinds of information.
  • the output unit 350 outputs information in the form of text, video or audio.
  • the output unit 350 may include a display module 351 and a speaker module 352.
  • the display module 351 may be provided as a plasma display panel (PDP), a liquid crystal display (LCD), a thin film transistor (TFT) LCD, an organic light emitting diode (OLED) display, a flexible display, a three-dimensional display, an electronic ink display, or any other form well known in the art to which the present invention pertains.
  • the storage unit 360 stores various commands, information, data, and the like.
  • the storage 360 stores system software, various programs, and applications for the operation of the computing device 300.
  • the storage unit 360 may include a memory such as a random access memory (RAM), a read only memory (ROM), an erasable programmable ROM (EPROM), an electrically erasable programmable ROM (EEPROM), or a flash memory, a hard disk, a removable disk, or any type of computer-readable recording medium well known in the art to which the present invention pertains.
  • the interface unit 370 serves as a path to an external device connected to the computing device 300.
  • the interface unit 370 receives information or power from an external device and transfers it to the components inside the computing device 300, or transmits information from inside the computing device 300 to an external device or supplies power to it.
  • the interface unit 370 may include, for example, a wired/wireless headset port, a charging port, a wired/wireless data port, a memory card port, a universal serial bus (USB) port, a port for connecting a device equipped with an identification module, an audio input/output (I/O) port, a video input/output (I/O) port, or the like.
  • the controller 380 controls other components to control the overall operation of the computing device 300.
  • the controller 380 executes the system software, various programs, and applications stored in the storage unit 360.
  • the controller 380 includes a processor such as a central processing unit (CPU), a micro processor unit (MPU), a micro controller unit (MCU), an application processor (AP), and the like.
  • the power supply unit 390 supplies the power required for the operation of the wireless communication unit 310, the A/V input unit 320, the user input unit 330, the sensing unit 340, the output unit 350, the storage unit 360, the interface unit 370, and the controller 380.
  • the power supply unit 390 may include an internal battery.
  • the computing device 300 may be modified to further include components not shown in FIG. 10 or to omit some of the components shown in FIG. 10.
  • the storage unit 360 of the computing device 300 stores one or more computer programs including instructions for performing the method for providing a keyboard according to an embodiment of the present invention described with reference to FIGS. 1 to 9, and the controller 380 is configured to execute the computer programs stored in the storage unit 360.
  • the steps of a method or algorithm described in connection with an embodiment of the invention may be implemented directly in hardware, such as an application specific integrated circuit (ASIC), in a software module such as a computer program, or by a combination thereof.
  • A software module may reside in a random access memory (RAM), a read only memory (ROM), an erasable programmable ROM (EPROM), an electrically erasable programmable ROM (EEPROM), a flash memory, a hard disk, a removable disk, a CD-ROM, or any form of computer-readable recording medium well known in the art.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Input From Keyboards Or The Like (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Disclosed are a method for providing an on-screen keyboard and a computing device for performing the same. The method comprises the steps of: displaying a keyboard having a predetermined layout; detecting a touch by a user on a touch sensing surface in a predetermined area of the keyboard; determining a predetermined key input intended by the user on the basis of a result of the detection; and performing a process corresponding to the determined predetermined key input.
PCT/KR2019/000364 2018-01-30 2019-01-10 Method for providing an on-screen keyboard and computing device for performing the same WO2019151669A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1020180011066A KR20190091914A (ko) 2018-01-30 2018-01-30 Method for providing an on-screen keyboard and computing device performing the same
KR10-2018-0011066 2018-01-30

Publications (1)

Publication Number Publication Date
WO2019151669A1 true WO2019151669A1 (fr) 2019-08-08

Family

ID=67478795

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2019/000364 WO2019151669A1 (fr) 2018-01-30 2019-01-10 Procédé de fourniture de clavier sur écran et dispositif informatique permettant d'effectuer cela

Country Status (2)

Country Link
KR (1) KR20190091914A (fr)
WO (1) WO2019151669A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100933891B1 (ko) * 2007-12-18 2009-12-28 엔에이치엔(주) Method for inputting Korean characters using a touch screen
KR20130026646A (ko) * 2011-09-06 2013-03-14 선문대학교 산학협력단 Touch-based mobile terminal and method for controlling a soft keyboard in the touch-based mobile terminal
KR20150047413A (ko) * 2013-10-24 2015-05-04 윤경숙 Method for converting the assignment of characters of a specific group assigned to a button
KR20150129345A (ko) * 2014-05-12 2015-11-20 김여일 Multilingual character input keypad and character arrangement method maximizing the usable display area of a communication terminal having a touch-type keypad
KR101663909B1 (ko) * 2015-09-01 2016-10-07 한국과학기술원 Electronic device and operating method thereof

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DK179329B1 (en) 2016-06-12 2018-05-07 Apple Inc Handwriting keyboard for monitors


Also Published As

Publication number Publication date
KR20190091914A (ko) 2019-08-07


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19747071

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19747071

Country of ref document: EP

Kind code of ref document: A1