KR20150001990A - Device and method for information processing providing letter and character - Google Patents

Device and method for information processing providing letter and character

Info

Publication number
KR20150001990A
Authority
KR
South Korea
Prior art keywords
character
gesture
input
matching
keyboard interface
Prior art date
Application number
KR1020130075195A
Other languages
Korean (ko)
Other versions
KR101511132B1 (en)
Inventor
조현중
Original Assignee
고려대학교 산학협력단
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 고려대학교 산학협력단
Priority to KR20130075195A
Priority to PCT/KR2014/005790
Publication of KR20150001990A
Application granted
Publication of KR101511132B1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04886Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14Digital output to display device ; Cooperation and interconnection of the display device with other functional units

Abstract

According to the present invention, an information processing device for providing text and characters includes a storage device in which a gesture-based text input application is stored, and a processor which executes the gesture-based text input application. The storage device stores characters (avatars) and related data matched to gestures and text. While the gesture-based text input application runs, the processor displays a gesture-based keyboard interface in a predetermined area of the display of the device, displays the text matched to a gesture input by the user, reads from the storage device the character or related data matched to the gesture and text entered, and displays that character or related data in the remaining area outside the area where the keyboard interface is displayed.

Description

TECHNICAL FIELD [0001] The present invention relates to an information processing apparatus and method for providing letters and characters.

2. Description of the Related Art [0002] With the recent spread of mobile devices based on a touch screen, interfaces that provide various user experiences using the characteristics of a touch screen have become a main component of mobile devices. In particular, a virtual keyboard for inputting characters is of great importance as an interface that provides convenience to the user, because characters can be displayed and input on the touch screen without a separate physical keyboard. However, a conventional virtual keyboard places its keys in a limited area of the touch screen and thus frequently causes typographical errors. Accordingly, keyboard interfaces that overcome this shortcoming have been actively developed in recent years.

In particular, gesture-based character input methods are being actively developed. A gesture-based character input method is advantageous in that it can reduce typing errors, because key positions are not confined to a small, fixed area of the touch screen. However, interfaces that take the characteristics of gesture-based input into account still need to be developed. That is, a gesture-based character input method needs interaction elements that attract the user's attention and enhance the user experience.

In this regard, U.S. Patent Application Publication No. 2011-0296324 (entitled "Avatars Reflecting User States") proposes a method and system in which a user-defined avatar reflects the current status of a user. A specialized avatar instance, with a facial expression, body language, accessories, and presentation design reflecting each user state, is created using trigger events based on data, emoticons, or the state of the device; when one or more trigger events indicating the occurrence of a particular user state are detected at the device, the avatar displayed on the device is updated to reflect that state.

SUMMARY OF THE INVENTION The present invention has been made to solve the above problems of the prior art, and it is an object of some embodiments of the present invention to provide various information through gesture input.

According to an aspect of the present invention, there is provided an information processing apparatus including a storage device storing a gesture-based character input application and a processor executing the gesture-based character input application. The storage device stores characters and association information matched to gestures and letters. When executing the gesture-based character input application, the processor displays a gesture-based keyboard interface on a predetermined area of the display of the information processing apparatus, displays the letter matching a gesture input by the user, reads from the storage device the character or association information matching the gesture and letters entered, and displays it in the remaining area outside the area where the keyboard interface is displayed.

According to another aspect of the present invention, there is provided a method of providing letters and characters through an information processing apparatus, the method including: displaying a gesture-based keyboard interface on a predetermined area of a display of the information processing apparatus; displaying the letter matching a gesture input by the user; and displaying the character or association information matching the gesture and letters entered in the remaining area outside the area where the keyboard interface is displayed.

According to an embodiment of the present invention, letters are input through gestures, and a character or association information matching the input gestures and letters is displayed in the spare space of the display, so that the character input process itself can arouse the user's interest.

FIG. 1 is a block diagram for explaining a configuration of an information processing apparatus according to an embodiment of the present invention.
FIG. 2 is a diagram for explaining a keyboard interface and a gesture input area displayed on a display of an information processing apparatus according to an embodiment of the present invention.
FIG. 3 is a diagram for explaining a keyboard interface and a character area displayed on a display of an information processing apparatus according to an embodiment of the present invention.
FIG. 4 is a view for explaining a keyboard interface according to an embodiment of the present invention.
FIG. 5 is a view for explaining display of advertisement information using a character according to an embodiment of the present invention.
FIG. 6 is a view for explaining display of advertisement information using a character according to an embodiment of the present invention.
FIG. 7 is a flowchart illustrating a method of providing letters and characters through an information processing apparatus according to an embodiment of the present invention.

Hereinafter, embodiments of the present invention will be described in detail with reference to the accompanying drawings so that those skilled in the art can readily carry them out. The present invention may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein. In order to clearly illustrate the present invention, parts not related to the description are omitted, and similar parts are denoted by like reference characters throughout the specification.

Throughout the specification, when a part is referred to as being "connected" to another part, this includes not only being "directly connected" but also being "electrically connected" with another part in between. Also, when a part is said to "comprise" an element, this means that it may include other elements as well, rather than excluding them, unless specifically stated otherwise.

FIG. 1 is a block diagram for explaining a configuration of an information processing apparatus according to an embodiment of the present invention.

In the embodiments of the present invention, it is assumed that the information processing apparatus 100 is a mobile device (for example, a smartphone).

FIG. 1 shows the configuration of the information processing apparatus 100 according to an embodiment of the present invention; other processing units (not shown) may be further included depending on the type of the apparatus.

In addition, the information processing apparatus 100 according to an exemplary embodiment of the present invention is not limited to a mobile device, and may include various types of processing units depending on the type and purpose of each device.

Also, the information processing apparatus 100 may be implemented as a portable terminal. Here, the portable terminal is a portable communication device with guaranteed portability and mobility, and may include, for example, handsets based on PCS (Personal Communication System), GSM (Global System for Mobile communications), PDC (Personal Digital Cellular), PDA (Personal Digital Assistant), IMT (International Mobile Telecommunication)-2000, CDMA (Code Division Multiple Access)-2000, W-CDMA (Wideband Code Division Multiple Access), and WiBro (Wireless Broadband Internet), as well as smartphones, smart pads, and the like.

In addition, the information processing apparatus 100 may include a display capable of receiving input through touch and gesture.

The information processing apparatus 100 according to an embodiment of the present invention includes a storage device 110 and a processor 120.

The storage device 110 stores a gesture-based character input application, together with characters and association information matched to gestures and letters. Here, the character is an avatar presented in response to gesture input, and it can be stored matched to letters and association information. The character may be displayed with changes in its expression, posture, color, and the like according to the input letters, and it is not limited to any one character: characters may be added or updated by downloading. The association information is information related to the input letters, and may include additional information such as advertisement information or guidance information.

The processor 120 executes the gesture-based character input application. When the application is executed, the processor 120 displays a gesture-based keyboard interface on a predetermined area of the display of the information processing apparatus 100, displays the letter matching a gesture input by the user, reads from the storage device 110 the character or association information matching the gesture and letters entered, and displays it in the remaining area outside the area where the keyboard interface is displayed. The keyboard interface may be superimposed on the character input window. A character matching the input gesture may be displayed in a gesture input area, which is an area outside the area where the keyboard interface is displayed; in that case, the character or the related association information can be displayed in the gesture input area. However, the processor 120 need not define a separate gesture input area: the user may touch and drag directly on the area where the keyboard interface is displayed. When no separate gesture input area is displayed, the character or association information may be displayed in the remaining area outside the area where the keyboard interface is displayed.
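As an illustration of this storage/processor division, the following TypeScript sketch models a store that maps gestures and input words to character reactions and association information, and a processor that commits each matched letter and updates the character. All names, example data, and the console-based rendering are assumptions for illustration; this is a minimal sketch of the described flow, not the patented implementation.

```typescript
// Hypothetical data model: reactions and association info keyed by
// gesture identifiers and input words.
type Reaction = { expression: string; color?: string; emoticon?: string };

interface CharacterStore {
  wordReactions: Map<string, Reaction>;    // e.g. "love" -> blushing face
  gestureReactions: Map<string, Reaction>; // e.g. repeated drag -> smile
  associationInfo: Map<string, string>;    // special word -> ad/guide text
}

const store: CharacterStore = {
  wordReactions: new Map([["love", { expression: "blush", color: "red" }]]),
  gestureReactions: new Map([
    ["repeat-vertical", { expression: "smile", emoticon: "^^" }],
  ]),
  associationInfo: new Map([["concert", "advertisement: concert details"]]),
};

class GestureInputProcessor {
  private buffer = ""; // contents of the character input window

  constructor(private readonly storage: CharacterStore) {}

  /** Commit the letter matched to a recognized gesture, then react. */
  onGesture(gestureId: string, matchedLetter: string): void {
    this.buffer += matchedLetter;
    this.react(this.storage.gestureReactions.get(gestureId));
    const lastWord = this.buffer.split(/\s+/).pop() ?? "";
    this.react(this.storage.wordReactions.get(lastWord));
    const info = this.storage.associationInfo.get(lastWord);
    if (info) console.log(`[association info] ${info}`);
  }

  private react(reaction?: Reaction): void {
    if (reaction) console.log(`[character] ${reaction.expression}`, reaction);
  }
}

const processor = new GestureInputProcessor(store);
// Typing "love" letter by letter triggers the blush reaction at the end.
"love".split("").forEach((ch) => processor.onGesture("drag-to-center", ch));
```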

For example, referring to FIGS. 2 and 3, the keyboard interface, the gesture input area, and the character area displayed on the display will be described below.

FIG. 2 is a diagram for explaining a keyboard interface and a gesture input area displayed on a display of an information processing apparatus according to an embodiment of the present invention.

When a search web page is displayed on the display 130 of the information processing apparatus 100, the keyboard interface 210 may be displayed so as to overlap with the character input window 230. In addition, a gesture input area 220 may be created in the remaining area outside the area where the keyboard interface 210 is displayed. The processor 120 may detect a gesture input through the gesture input area 220 and display the letter matching the gesture on the character input window 230. The character can also be displayed, as shown in FIG. 3.

FIG. 3 is a view for explaining a keyboard interface and a character displayed on a display of an information processing apparatus according to an embodiment of the present invention.

The keyboard interface 210 and the character input window 230 may be superimposed on the display 130 of the information processing apparatus 100, as illustrated in FIG. 3. As shown there, the character 240 can be displayed in the remaining area outside the area where the keyboard interface 210 is displayed. The character 240 is displayed matched to the gesture and letters, and can express changes of expression, posture, color, and the like in reaction to a specific input word or a specific gesture. This can make the character input process more interesting and enjoyable for the user, providing a differentiated interaction and user experience.

Here, the keyboard interface 210 includes a center item and a plurality of peripheral items arranged around the center item. When the user inputs a gesture from one of the plurality of peripheral items toward the center item, the individual letter assigned to the item selected by the gesture is inserted into the character string of the character input window 230. The gesture moving the selected peripheral item toward the center item may follow a straight or curved path. The keyboard interface 210 may include guide lines for guiding the gesture input, or the guide lines may be omitted, as in the keyboard interface 210 shown in FIGS. 2 and 3. With the guide lines omitted, a desired peripheral item can still be selected by a straight-line gesture toward the center item. The keyboard interface 210 will be described in more detail with reference to FIG. 4.

FIG. 4 is a view for explaining a keyboard interface according to an embodiment of the present invention. In the following description, the keyboard interface 210 including the guide lines is described, but the present invention is not limited thereto.

The keyboard interface 210 includes a center item 21, a plurality of peripheral items 22a to 22j in each of which at least one distinct item is disposed, and a plurality of guide lines 23a to 23j corresponding to the plurality of peripheral items 22a to 22j. The items arranged in the keyboard interface 210 of FIG. 4 may be numerals or letters, but are not limited thereto. The center item 21 may be blank, with no item arranged in it before a gesture selecting one of the peripheral items 22a to 22j is input, or it may hold an item representing at least one of the items. The gesture for selecting an item may include a drag operation along the guide line corresponding to the desired peripheral item among the plurality of guide lines 23a to 23j, or a click operation on the center item 21. For example, when numerals are arranged in the peripheral items 22a to 22j as shown in FIG. 4, the user can input the number '2' simply by a straight downward drag along the guide line 23b corresponding to the peripheral item 22b to which '2' is assigned.
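The selection logic for a guideline-free variant of such an interface can be illustrated with a short sketch: since each peripheral item sits in a known direction from the center, a straight drag toward the center can be resolved from the drag angle alone. The ten-digit layout, the 20-pixel threshold, and the function names below are assumptions for illustration, not the patented geometry.

```typescript
// Assumed layout: ten digit items spread evenly around the center item.
const PERIPHERAL_ITEMS = ["1", "2", "3", "4", "5", "6", "7", "8", "9", "0"];

/** Resolve which peripheral item a straight drag toward the center selects. */
function resolveItem(
  start: { x: number; y: number },
  end: { x: number; y: number },
): string | null {
  const dx = end.x - start.x;
  const dy = end.y - start.y;
  if (Math.hypot(dx, dy) < 20) return null; // too short: treat as a tap
  // The selected item lies opposite the drag direction, because the user
  // drags from the peripheral item toward the center.
  const angle = Math.atan2(-dy, -dx); // radians in (-PI, PI]
  const sector = (2 * Math.PI) / PERIPHERAL_ITEMS.length;
  const index =
    Math.round(((angle + 2 * Math.PI) % (2 * Math.PI)) / sector) %
    PERIPHERAL_ITEMS.length;
  return PERIPHERAL_ITEMS[index];
}

// A drag from the right side toward the center selects the item placed to
// the right of the center under this assumed layout.
console.log(resolveItem({ x: 150, y: 100 }, { x: 100, y: 100 })); // "1"
```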

Referring again to FIG. 1, the processor 120 may read from the storage device 110 and display the character 240 matching the direction of a drag input, the input speed, or the input word. That is, the character 240 may change according to the drag direction, the input speed, or the input word. For example, when the input word is 'love', the face of the character 240 may turn red to show a reaction to the word 'love'.

In addition, when the processor 120 detects a drag input repeated consecutively a predetermined number of times up and down or left and right, it can read the matching character from the storage device 110 and display it. For example, when a drag is repeated in the vertical direction, the character 240 can make a smiling face while an emoticon such as '^^' is inserted into the character input window 230. Likewise, when a drag is repeated in the horizontal direction, the character 240 can make a sad face while an emoticon such as 'ㅠㅠ' is inserted into the character input window 230.
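One plausible way to detect such a repeated drag, sketched below under assumed thresholds (four direction reversals within 1.5 seconds), is to count sign reversals of the drag delta along one axis; when the detector fires, the application would show the character's reaction and insert the matching emoticon.

```typescript
type Axis = "vertical" | "horizontal";

// Counts direction reversals of drag movement along one axis; the
// reversal count and time window are assumed values, not from the patent.
class RepeatDragDetector {
  private lastSign = 0;
  private reversals = 0;
  private windowStart = 0;

  constructor(
    private readonly axis: Axis,
    private readonly needed = 4,    // reversals required to trigger
    private readonly windowMs = 1500,
  ) {}

  /** Feed one drag delta; returns true when the repeat gesture completes. */
  update(dx: number, dy: number, now: number): boolean {
    const d = this.axis === "vertical" ? dy : dx;
    const sign = Math.sign(d);
    if (sign === 0) return false;
    if (now - this.windowStart > this.windowMs) {
      this.reversals = 0;           // stale movement: restart the window
      this.windowStart = now;
    }
    if (this.lastSign !== 0 && sign !== this.lastSign) this.reversals++;
    this.lastSign = sign;
    return this.reversals >= this.needed;
  }
}

const smileDetector = new RepeatDragDetector("vertical");
// When update(...) returns true: display the smiling character and insert
// "^^" into the character input window, per the behavior described above.
```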

In addition, when a predetermined special character or special gesture is input, the processor 120 may read the matching association information from the storage device 110 and display it. That is, if letters set as a special character are input, the related association information can be displayed. The processor 120 may first output notification information prompting for the special character; this notification, presented through the character 240, can lead the user to input the special character. The user can also input a special gesture using the character 240 and confirm the association information matching that gesture. In this case, the association information may be advertisement information. The display of advertisement information using the character 240 according to an embodiment of the present invention will be described with reference to FIGS. 5A to 6B.

Referring to FIG. 5A, the character 240 is superimposed on the gesture input area 220. Notification information prompting for a special character may then be displayed on the character 240: for example, the color of the character 240 may change, or an alarm may be output. Next, as shown in FIG. 5B, the predetermined special character 410 may be shown on the display 130, for example as 'April 30th fighting concert'. The user can stop the alarm or the change of the character 240 by inputting the special character 410, and can then confirm the advertisement information, which is the association information.

Next, referring to FIG. 6A, the user can see the special gesture guide 221 displayed superimposed on the character 240 on the display 130. When the information processing apparatus 100 detects the special gesture input along the special gesture guide 221, it may output the association information matching that gesture. As shown in FIG. 6B, the association information may be advertisement information: for example, advertisement information such as 'Daejeon Sai Concert' may be displayed along with a changed expression of the character 241.
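The notification-then-advertisement flow described above can be sketched as a small state holder: the character first signals that a special character is pending, and entering that character clears the alert and reveals the association (advertisement) information. The class, method names, and console messages are hypothetical illustrations of the described behavior.

```typescript
type AdPromo = { trigger: string; ad: string };

class PromoNotifier {
  private pending?: AdPromo;

  /** Begin notifying: the character changes color or an alarm sounds. */
  start(promo: AdPromo): void {
    this.pending = promo;
    console.log("[character] color change / alarm output");
  }

  /** Called for every word committed through the keyboard interface. */
  onInput(word: string): void {
    if (this.pending && word === this.pending.trigger) {
      console.log("[character] notification stops");
      console.log(`[ad] ${this.pending.ad}`); // the association information
      this.pending = undefined;
    }
  }
}

const notifier = new PromoNotifier();
notifier.start({ trigger: "concert", ad: "Daejeon Sai Concert" });
notifier.onInput("concert"); // entering the special character reveals the ad
```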

Referring again to FIG. 1, the processor 120 may output the letters entered through the keyboard interface 210 as the voice of the character 240. Because the keyboard interface 210 is operated by gestures, input can be performed without looking at the display 130; the letters input through the keyboard interface 210 can therefore be announced by voice so that the user can confirm whether they were input correctly. The character 240 can deliver this voice output in an interesting and entertaining way.
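In a browser-based implementation, this eyes-free voice confirmation could be approximated with the standard Web Speech API, as in the sketch below. The pitch and rate values that suggest a character-like voice are assumptions; a production character would more likely use recorded voice clips.

```typescript
// Echo each committed letter aloud so gesture typing can be confirmed
// without looking at the display (browser environment assumed).
function speakAsCharacter(text: string): void {
  const utterance = new SpeechSynthesisUtterance(text);
  utterance.pitch = 1.6; // playful, character-like delivery (assumed value)
  utterance.rate = 0.9;  // slightly slow for clarity (assumed value)
  window.speechSynthesis.speak(utterance);
}

// speakAsCharacter("l"); // call after each letter is inserted
```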

In addition, the processor 120 can link the character 240, matched to the gestures and letters input by the user, with a game or a social network service (hereinafter, SNS) when one is executed. For example, the character 240 can be linked with a typing game: a game that compares typing speed, typo rate, and the like with other users, or an SNS game based on typing.
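Below is a sketch of the statistics such a typing game might compare between users: typing speed in characters per minute and typo rate. The session-log shape is hypothetical; the formulas are simple ratios.

```typescript
// Hypothetical per-session log: letters committed, letters later corrected
// (typos), and elapsed time in milliseconds.
interface TypingSession {
  committed: number;
  corrected: number;
  ms: number;
}

/** Typing speed in characters per minute. */
function typingSpeedCpm(s: TypingSession): number {
  return s.committed / (s.ms / 60000);
}

/** Fraction of committed letters that were typos. */
function typoRate(s: TypingSession): number {
  return s.committed === 0 ? 0 : s.corrected / s.committed;
}

const session: TypingSession = { committed: 120, corrected: 6, ms: 60000 };
console.log(typingSpeedCpm(session), typoRate(session)); // 120 cpm, 0.05
```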

FIG. 7 is a flowchart illustrating a method of providing letters and characters through an information processing apparatus according to an embodiment of the present invention.

In step S110, a keyboard interface based on gesture input may be displayed in a predetermined area of the display of the information processing apparatus. Here, the keyboard interface includes a center item, a plurality of peripheral items arranged at a distance from each other around the center item, and a plurality of guide lines corresponding to the plurality of peripheral items, and the guide lines may differ in shape or orientation. When the user inputs a character selection gesture selecting any one item of the keyboard interface, the individual letter assigned to the selected item can be inserted into the character string of the character input window.

In step S120, the letter matching the gesture input by the user can be displayed. That is, when a gesture selecting a peripheral item along a guide line is input through the keyboard interface, the letter matching the gesture can be inserted into the character string of the character input window. When a drag input is detected repeatedly a predetermined number of times in the up-down or left-right direction, the matching character can also be displayed. For example, if a drag input repeated up and down is detected, an emoticon such as '^^' can be inserted into the string while the character's smiling face is displayed.

In step S130, the character or association information matching the gestures and letters entered can be displayed in the remaining area outside the area where the keyboard interface is displayed. The character matching the direction of the drag input, the input speed, or the input word can be displayed; that is, the expression, posture, color, and the like of the character can change according to the gestures and letters input by the user. In addition, when a predetermined special character or special gesture is input, the association information matching it can be displayed. Notification information prompting for the special character can be output first: for example, when the color of the character changes and an alarm sounds, the user can input the special character to stop the notification and check the association information. Also, if a special gesture is input along the special gesture guide, the character can change in response while the association information is output. The association information may be advertisement information.

Further, in step S130, the letters inserted through the keyboard interface can be output as the voice of the character. In other words, because the keyboard interface is operated through gesture input, input is possible without checking the display; the character can then read the input letters aloud so that the user can confirm whether they were input correctly.

Further, in step S130, the character matching the gestures and letters input by the user can be linked with a game or a social network service when one is executed. For example, the character can be linked with a typing game: a game that compares typing speed, typo rate, and the like with other users, or an SNS game based on typing.

It will be understood by those skilled in the art that the foregoing description of the present invention is for illustrative purposes only, and that various changes and modifications may be made without departing from the spirit or essential characteristics of the present invention. It is therefore to be understood that the above-described embodiments are illustrative in all aspects and not restrictive. For example, each component described as a single entity may be implemented in a distributed manner, and components described as distributed may be implemented in a combined form.

The scope of the present invention is defined by the appended claims rather than by the detailed description, and all changes or modifications derived from the meaning and scope of the claims and their equivalents are to be construed as being included within the scope of the present invention.

100: information processing apparatus 110: storage device
120: Processor 130: Display
210: keyboard interface 220: gesture input area
230: character input window 240, 241: character

Claims (16)

An information processing apparatus comprising:
a storage device storing a gesture-based character input application; and
a processor for executing the gesture-based character input application,
wherein the storage device stores characters and association information matched to gestures and letters, and
wherein the processor, when the gesture-based character input application is executed, displays a gesture-based keyboard interface on a predetermined area of the display of the information processing apparatus, displays the letter matching a gesture input by the user, reads from the storage device the character or association information matching the gesture and letters entered, and displays it in the remaining area outside the area where the keyboard interface is displayed.
The information processing apparatus according to claim 1,
wherein the keyboard interface comprises:
a center item; and
a plurality of peripheral items arranged around the center item,
and wherein, when the user inputs a gesture from any one of the plurality of peripheral items toward the center item, the processor inserts the individual letter assigned to the item selected by the gesture into the character string of a character input window.
The information processing apparatus according to claim 1,
Wherein the processor reads from the storage device a character that matches the direction of the drag input, the input speed, or the input word, and displays the character.
The information processing apparatus according to claim 1,
wherein the processor reads the matching character from the storage device and displays it when a drag input is detected consecutively a predetermined number of times in the up-down or left-right direction.
The information processing apparatus according to claim 1,
wherein the processor reads the matching association information from the storage device and displays it when a predetermined special character or special gesture is input.
The information processing apparatus according to claim 5,
wherein the processor first outputs notification information prompting for the special character.
The information processing apparatus according to claim 1,
wherein the processor outputs the letters entered through the keyboard interface as the voice of the character.
The information processing apparatus according to claim 1,
wherein the processor links the character matching the gestures and letters input by the user with a game or a social network service when one is executed.
A method for providing letters and characters through an information processing apparatus, the method comprising:
displaying a gesture-based keyboard interface on a predetermined area of a display of the information processing apparatus;
displaying the letter matching a gesture input by a user; and
displaying the character or association information matching the gestures and letters entered in the remaining area outside the area where the keyboard interface is displayed.
The method according to claim 9,
wherein the keyboard interface comprises:
a center item; and
a plurality of peripheral items arranged around the center item,
the method further comprising inserting the individual letter assigned to the item selected by a gesture into the character string of a character input window when the user inputs a gesture from any one of the plurality of peripheral items toward the center item.
The method according to claim 9,
wherein the step of displaying the character or association information displays the character matching the direction of a drag input, the input speed, or the input word.
The method according to claim 11,
wherein the step of inserting the letter displays the matching character when a drag input is detected consecutively a predetermined number of times in the up-down or left-right direction.
The method according to claim 9,
wherein the step of displaying the character or association information displays the association information matching a predetermined special character or special gesture when the special character or special gesture is input.
The method according to claim 13,
wherein the step of displaying the character or association information first outputs notification information prompting for the special character.
The method according to claim 9,
wherein the step of displaying the character or association information comprises outputting the letters entered through the keyboard interface as the voice of the character.
The method according to claim 9,
wherein the step of displaying the character or association information links the character matching the gestures and letters input by the user with a game or a social network service when one is executed.
KR20130075195A 2013-06-28 2013-06-28 Device and method for information processing providing letter and character KR101511132B1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
KR20130075195A KR101511132B1 (en) 2013-06-28 2013-06-28 Device and method for information processing providing letter and character
PCT/KR2014/005790 WO2014209079A1 (en) 2013-06-28 2014-06-30 Information processing device and method for providing text and characters

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR20130075195A KR101511132B1 (en) 2013-06-28 2013-06-28 Device and method for information processing providing letter and character

Publications (2)

Publication Number Publication Date
KR20150001990A (en) 2015-01-07
KR101511132B1 KR101511132B1 (en) 2015-04-10

Family

ID=52142317

Family Applications (1)

Application Number Title Priority Date Filing Date
KR20130075195A KR101511132B1 (en) 2013-06-28 2013-06-28 Device and method for information processing providing letter and character

Country Status (2)

Country Link
KR (1) KR101511132B1 (en)
WO (1) WO2014209079A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20160146474A (en) * 2015-06-12 2016-12-21 스피어다인 주식회사 Input device and setting and executing method for ui
US10635457B2 (en) 2015-06-12 2020-04-28 Tyrenn Co., Ltd. Input device and UI configuration and execution method thereof

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20030039019A (en) * 2001-11-09 2003-05-17 신선혜 Medium storing a Computer Program with a Function of Lip-sync and Emotional Expression on 3D Scanned Real Facial Image during Realtime Text to Speech Conversion, and Online Game, Email, Chatting, Broadcasting and Foreign Language Learning Method using the Same
KR101304461B1 (en) * 2006-12-04 2013-09-04 삼성전자주식회사 Method and apparatus of gesture-based user interface
KR20100033879A (en) * 2008-09-22 2010-03-31 오의진 Character inputting device
KR101162243B1 (en) * 2009-12-01 2012-07-04 박철 Method for inputting information of touch screen panal
JP5270606B2 (en) * 2010-03-30 2013-08-21 ヤフー株式会社 Portable terminal, information processing system, display method, program, and information processing apparatus
KR101799315B1 (en) * 2011-09-26 2017-11-20 엘지전자 주식회사 Method for operating an Image display apparatus

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20160146474A (en) * 2015-06-12 2016-12-21 스피어다인 주식회사 Input device and setting and executing method for ui
US10635457B2 (en) 2015-06-12 2020-04-28 Tyrenn Co., Ltd. Input device and UI configuration and execution method thereof

Also Published As

Publication number Publication date
WO2014209079A1 (en) 2014-12-31
KR101511132B1 (en) 2015-04-10

Similar Documents

Publication Publication Date Title
US10795574B2 (en) Input method, input apparatus, and terminal device
US11743213B2 (en) User interfaces for messages
KR101717805B1 (en) Systems and methods for haptically-enhanced text interfaces
US10097494B2 (en) Apparatus and method for providing information
US9256366B2 (en) Systems and methods for touch-based two-stage text input
ES2778863T3 (en) Graphical user interface for a game system
US8893054B2 (en) Devices, systems, and methods for conveying gesture commands
US20140055381A1 (en) System and control method for character make-up
KR20160088620A (en) Virtual input apparatus and method for receiving user input using thereof
US20150012874A1 (en) Electronic device and method for character deletion
US9507386B2 (en) Dual pivot mechanical hinge with discreet wiring
KR102053196B1 (en) Mobile terminal and control method thereof
JP6139728B2 (en) Chat room management method and terminal
CN109542323A (en) Interaction control method and device, storage medium, electronic equipment
KR101511132B1 (en) Device and method for information processing providing letter and character
JP5704655B2 (en) Display device and program
CN108572744B (en) Character input method and system and computer readable recording medium
Lin et al. Establishing interaction specifications for online-to-offline (O2O) service systems
KR102308927B1 (en) Method for outputting screen and display device for executing the same
US20150113398A1 (en) Method for inputting characters, terminal, and recording medium
KR20210124950A (en) System and method for terminal device control
KR102204599B1 (en) Method for outputting screen and display device for executing the same
KR101755807B1 (en) Method and storage medium for displaying character in portable terminal screen
KR101653102B1 (en) Method for inputting korean/english/number/symbol characters using simplified qwerty software keypad
JP6109397B1 (en) Computer mounting method

Legal Events

Date Code Title Description
A201 Request for examination
E902 Notification of reason for refusal
E90F Notification of reason for final refusal
E701 Decision to grant or registration of patent right
FPAY Annual fee payment

Payment date: 20180406

Year of fee payment: 4