KR101511132B1 - Device and method for information processing providing letter and character - Google Patents
- Publication number
- KR101511132B1
- Authority
- KR
- South Korea
- Prior art keywords
- character
- gesture
- input
- keyboard interface
- matching
- Prior art date
Links
Images
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
An information processing apparatus according to the present invention includes a storage device storing a gesture-based character input application and a processor that executes the application. The storage device stores a character and association information matched to gestures and characters. When the gesture-based character input application is executed, the processor displays a gesture-based keyboard interface on a predetermined area of the display of the information processing apparatus, displays the characters matching the gestures input by the user, reads from the storage device the character or association information matching the gesture and characters performed at the time of input, and displays it in the remaining area outside the area where the keyboard interface is displayed.
Description
The present invention relates to an information processing apparatus and method for providing letters and characters.
Description of the Related Art: With the recent spread of touch-screen-based mobile devices, interfaces that provide various user experiences using the characteristics of a touch screen have become a main component of mobile devices. In particular, a virtual keyboard for inputting characters is of great importance as an interface that provides convenience to the user, because it can be displayed and operated on a touch screen without a separate physical keyboard. However, a conventional virtual keyboard places its keys in a limited area of the touch screen, which frequently causes typos. Accordingly, keyboard interfaces that overcome this shortcoming have been actively developed in recent years.
In particular, gesture-based character input methods are being actively developed. A gesture-based character input method is advantageous in that it can reduce mistyping, because key positions are not confined to a narrow area of the touch screen. However, an interface that takes the characteristics of gesture-based character input into account still needs to be developed. That is, a gesture-based character input method is needed that strengthens the interactive elements that attract the user's attention and enhance the user's experience.
In this regard, U.S. Patent Application Publication No. 2011-0296324 (entitled "Avatars Reflecting User States") proposes a method and system that generate a user-defined avatar reflecting the current status of a user. For each user state, a specialized avatar instance having a facial expression, body language, accessories, and presentation design reflecting that state can be created using trigger events based on data, emoticons, or the state of the device. When one or more trigger events indicating the occurrence of a particular user state are detected at the device, the avatar displayed on the device is updated to reflect that user state.
SUMMARY OF THE INVENTION The present invention has been made to solve the above problems of the prior art, and it is an object of some embodiments of the present invention to provide various information through a gesture input.
According to an aspect of the present invention, there is provided an information processing apparatus including a storage device storing a gesture-based character input application and a processor executing the application. The storage device stores a character and association information matched to gestures and characters. When the gesture-based character input application is executed, the processor displays a gesture-based keyboard interface on a predetermined area of the display, displays the characters matching the gestures input by the user, reads from the storage device the character or association information matching the gesture and characters performed when the characters were input, and displays it in the remaining area outside the area where the keyboard interface is displayed.
According to another aspect of the present invention, there is provided a method of providing letters and characters through an information processing apparatus, the method comprising: displaying a gesture-based keyboard interface on a predetermined area of a display of the information processing apparatus; displaying the characters matching gestures input by a user; and displaying the character or association information matching the gesture and characters performed at the time of input in the remaining area outside the area where the keyboard interface is displayed.
According to an embodiment of the present invention, characters are input through gesture input, and a character or association information matching the input gestures and characters is displayed in the extra space of the display, so that character input can be provided in a way that arouses the user's interest.
FIG. 1 is a block diagram for explaining the configuration of an information processing apparatus according to an embodiment of the present invention.
FIG. 2 is a diagram for explaining a keyboard interface and a gesture input area displayed on a display of an information processing apparatus according to an embodiment of the present invention.
FIG. 3 is a diagram for explaining a keyboard interface and a character area displayed on a display of an information processing apparatus according to an embodiment of the present invention.
FIG. 4 is a view for explaining a keyboard interface according to an embodiment of the present invention.
FIG. 5 is a view for explaining the display of advertisement information using a character according to an embodiment of the present invention.
FIG. 6 is a view for explaining the display of advertisement information using a character according to an embodiment of the present invention.
FIG. 7 is a flowchart illustrating a method of providing letters and characters through an information processing apparatus according to an embodiment of the present invention.
Hereinafter, embodiments of the present invention will be described in detail with reference to the accompanying drawings so that those skilled in the art can readily practice them. The present invention may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein. In order to clearly illustrate the present invention, parts not related to the description are omitted, and like parts are denoted by like reference characters throughout the specification.
Throughout the specification, when a part is referred to as being "connected" to another part, this includes not only being "directly connected" but also being "electrically connected" with another element in between. Also, when a part is described as "comprising" an element, this means that it may include other elements as well, rather than excluding them, unless specifically stated otherwise.
FIG. 1 is a block diagram for explaining the configuration of an information processing apparatus according to an embodiment of the present invention.
At this time, in the embodiment of the present invention, it is assumed that the
1, the configuration of the
In addition, the
Also, the
In addition, the
The
The
For example, referring to FIGS. 2 and 3, the keyboard interface, the gesture input area, and the character area displayed on the display will be described below.
FIG. 2 is a diagram for explaining a keyboard interface and a gesture input area displayed on a display of an information processing apparatus according to an embodiment of the present invention.
When the search web page is displayed on the
FIG. 3 is a view for explaining a keyboard interface and a character displayed on a display of an information processing apparatus according to an embodiment of the present invention.
The
Here, the
FIG. 4 is a view for explaining a keyboard interface according to an embodiment of the present invention. In the following description, the
The
Referring again to FIG. 1, the
In addition, when the
In addition, the
Referring to FIG. 5A, it can be seen that the
Next, referring to FIG. 6A, the user can confirm the
Referring again to FIG. 1, the
In addition, the
FIG. 7 is a flowchart illustrating a method of providing letters and characters through an information processing apparatus according to an embodiment of the present invention.
In step S110, a keyboard interface based on gesture input may be displayed in a predetermined area of the display of the information processing apparatus. Here, the keyboard interface includes a center item, a plurality of peripheral items arranged at a distance from each other around the center item, and a plurality of guide lines corresponding to the peripheral items, each of which may have a different shape or orientation. When the user inputs a character selection gesture selecting any one item of the keyboard interface, the individual character assigned to the selected item can be inserted into the character string input window.
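As an illustration, selecting a peripheral item by dragging from the center item can be sketched as follows. The eight-item layout, the character assignments, and the distance threshold are assumptions made for the sketch, not values from the specification.

```python
import math

# Hypothetical layout: one center item surrounded by eight peripheral
# items, each assigned an individual character (assignments are
# illustrative only).
PERIPHERAL_CHARS = ["a", "b", "c", "d", "e", "f", "g", "h"]

def select_character(drag_dx, drag_dy, min_distance=30.0):
    """Map a drag starting on the center item to the peripheral item
    lying in the drag's direction, and return its assigned character."""
    if math.hypot(drag_dx, drag_dy) < min_distance:
        return None  # drag too short: no peripheral item selected
    # Angle of the drag, normalized to [0, 2*pi).
    angle = math.atan2(drag_dy, drag_dx) % (2 * math.pi)
    # Eight guide lines at 45-degree intervals around the center item.
    index = int((angle + math.pi / 8) / (math.pi / 4)) % 8
    return PERIPHERAL_CHARS[index]
```

A drag of (100, 0) selects the item on the first guide line, while a drag shorter than the threshold selects nothing.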
In step S120, a character matching the gesture input by the user can be displayed. That is, when a gesture selecting a peripheral item along a guide line is input using the keyboard interface, the character matching the gesture can be inserted into the character string of the character input window. At this time, when a drag input is detected consecutively a predetermined number of times in the up, down, left, or right direction, a matching character can be displayed. For example, if a sequential drag input that repeats up and down is detected, an emoticon such as '^^' can be inserted into the string while the character's smiling face is displayed.
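The repeated-drag detection described above can be sketched as a simple scan over the sequence of drag directions; the direction labels and the default repeat count are assumptions for the sketch.

```python
def detect_repeat_gesture(drag_directions, required_repeats=2):
    """Return True when consecutive drags alternate between opposite
    directions (e.g. up, down, up) at least `required_repeats` times;
    the text maps such a repeated up/down drag to inserting an
    emoticon such as '^^'."""
    opposites = {"up": "down", "down": "up", "left": "right", "right": "left"}
    repeats = 0
    for prev, cur in zip(drag_directions, drag_directions[1:]):
        if opposites.get(prev) == cur:
            repeats += 1
            if repeats >= required_repeats:
                return True
        else:
            repeats = 0  # sequence broken: start counting again
    return False
```

An up-down-up sequence triggers the match, while unrelated drags do not.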
In step S130, the character or association information matching the gesture and characters performed at the time of input can be displayed in the remaining area outside the area where the keyboard interface is displayed. A character matching the direction of the drag input, the input speed, or the input word can be displayed. That is, the character's expression, posture, color, and so on can be changed according to the gesture and characters input by the user, depending on the direction of the drag input, the input speed, or the input word. In addition, when a predetermined special character or special gesture is input, association information matching it can be displayed. Notification information prompting the input of the special character can be output first. For example, when the character's color changes and alarm-style notification information is displayed, the user can input the special character to stop the notification and check the association information. Also, if a special gesture is input following a special gesture guide, the character changed in association with the association information can be output. At this time, the association information may be advertisement information.
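A minimal sketch of matching the character's state to the drag direction and input speed might look like the following; the state table, the state names, and the 3 characters-per-second threshold are all illustrative assumptions, not values from the text.

```python
# Illustrative state table mapping (speed class, drag direction) to a
# character expression; keys and values are assumptions for the sketch.
CHARACTER_STATES = {
    ("fast", "up"): "excited",
    ("slow", "down"): "sleepy",
}

def character_state(input_speed_cps, drag_direction):
    """Pick the character's expression from the drag direction and the
    typing speed in characters per second, falling back to neutral."""
    speed = "fast" if input_speed_cps >= 3.0 else "slow"
    return CHARACTER_STATES.get((speed, drag_direction), "neutral")
```

Fast upward drags yield an excited character, and unmatched combinations fall back to a neutral state.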
Further, in step S130, the characters inserted through the keyboard interface can be output as the character's voice. In other words, since the keyboard interface is operated through gesture input, input is possible even without looking at the display. In such a case, the character can output the input characters by voice so that the user can confirm whether they were input correctly.
Further, in step S130, the character matching the gesture and characters input by the user can be executed in cooperation with a game or a social network service. For example, the character can be linked to a game through typing: the character can run in conjunction with a game that compares typing speed, mistyping rate, and the like with other users, or with an SNS game based on typing.
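The two per-user metrics the text suggests comparing in such a game can be computed as below; the function name and the characters-per-minute unit are assumptions for the sketch.

```python
def typing_stats(total_chars, typo_count, elapsed_seconds):
    """Compute typing speed in characters per minute and the mistyping
    rate, the two metrics suggested for comparison between users."""
    speed_cpm = total_chars / elapsed_seconds * 60.0 if elapsed_seconds else 0.0
    typo_rate = typo_count / total_chars if total_chars else 0.0
    return speed_cpm, typo_rate
```

For example, 120 characters with 6 typos over one minute gives a speed of 120 cpm and a mistyping rate of 5%.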
It will be understood by those skilled in the art that the foregoing description of the present invention is for illustrative purposes only, and that various changes and modifications may be made without departing from the spirit or essential characteristics of the present invention. It is therefore to be understood that the above-described embodiments are illustrative in all aspects and not restrictive. For example, each component described as a single entity may be distributed and implemented, and components described as distributed may be implemented in combined form.
The scope of the present invention is defined by the appended claims rather than by the detailed description, and all changes or modifications derived from the meaning and scope of the claims and their equivalents are to be construed as being included within the scope of the present invention.
100: information processing apparatus 110: storage device
120: Processor 130: Display
210: keyboard interface 220: gesture input area
230:
Claims (16)
A storage device storing a gesture-based character input application; and
A processor for executing the gesture-based character input application,
The storage device stores character and association information matching the gesture and character,
Wherein the processor displays a keyboard interface based on a gesture on a predetermined area of the display of the information processing apparatus when the gesture-based character input application is executed, displays characters matching the gesture input by the user, A character or association information matching the gesture and character performed at the time of input is read out from the storage device and displayed in a remaining area outside the area where the keyboard interface is displayed,
The processor first outputs notification information for receiving a predetermined special character or a special gesture,
The keyboard interface
Center item;
A plurality of peripheral items arranged to be spaced apart from each other around the center item; And
Wherein the information processing apparatus includes a plurality of guide lines having different shapes or directions and corresponding to the plurality of peripheral items.
The keyboard interface
Wherein when the gesture from each of the plurality of peripheral items is input by the user, the processor inserts the individual character assigned to the item selected by the gesture into the character string of the character input window.
Wherein the processor reads from the storage device a character that matches the direction of the drag input, the input speed, or the input word, and displays the character.
Wherein the processor reads characters from the storage device when the drag input is detected consecutively for a predetermined number of times up or down or right and left.
Wherein the processor reads association information from the storage device and displays the matching information when the predetermined special character or special gesture is input.
Wherein the processor outputs characters displayed through the keyboard interface as a voice of the character.
Wherein the processor cooperates with the character matching the gesture and character input by the user when the game or the social network service is executed.
Displaying a gesture-based keyboard interface on a predetermined area of a display of the information processing apparatus;
Displaying a character matching a gesture input by a user; And
Displaying a character or association information matching a gesture and a character performed at the time of character input in a remaining area outside the area where the keyboard interface is displayed,
The step of displaying the character or association information may first output notification information for receiving a predetermined special character,
The keyboard interface
Center item,
A plurality of peripheral items arranged to be spaced apart from each other around the center item and
And a plurality of guide lines having different shapes or directions and corresponding to the plurality of peripheral items.
The keyboard interface
And inserting individual characters assigned to the item selected by the gesture into a character string of a character input window when a gesture from the plurality of peripheral items is input by the user to the center item.
Wherein the step of displaying the character or the association information indicates a character matching the direction of the drag input, the input speed, or the input word.
Wherein the step of displaying the character includes displaying characters matched to the drag when a continuous drag input is detected a predetermined number of times up or down or right and left.
Wherein the step of displaying the character or the association information is to display association information that matches the predetermined special character or the special gesture when the special character or the special gesture is input.
Wherein the step of displaying the character or association information comprises outputting characters displayed through the keyboard interface as a voice of the character.
Wherein the step of displaying the character or the association information is performed by interlocking the character matching the gesture and the character input by the user when the game or the social network service is executed.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR20130075195A KR101511132B1 (en) | 2013-06-28 | 2013-06-28 | Device and method for information processing providing letter and character |
PCT/KR2014/005790 WO2014209079A1 (en) | 2013-06-28 | 2014-06-30 | Information processing device and method for providing text and characters |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR20130075195A KR101511132B1 (en) | 2013-06-28 | 2013-06-28 | Device and method for information processing providing letter and character |
Publications (2)
Publication Number | Publication Date |
---|---|
KR20150001990A KR20150001990A (en) | 2015-01-07 |
KR101511132B1 true KR101511132B1 (en) | 2015-04-10 |
Family
ID=52142317
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
KR20130075195A KR101511132B1 (en) | 2013-06-28 | 2013-06-28 | Device and method for information processing providing letter and character |
Country Status (2)
Country | Link |
---|---|
KR (1) | KR101511132B1 (en) |
WO (1) | WO2014209079A1 (en) |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR101643711B1 (en) * | 2015-06-12 | 2016-07-28 | 스피어다인 주식회사 | Smart display apparatus and setting and executing method for ui |
WO2017065482A1 (en) | 2015-06-12 | 2017-04-20 | 스피어다인 주식회사 | Input device and ui configuration and execution method thereof |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20030039019A (en) * | 2001-11-09 | 2003-05-17 | 신선혜 | Medium storing a Computer Program with a Function of Lip-sync and Emotional Expression on 3D Scanned Real Facial Image during Realtime Text to Speech Conversion, and Online Game, Email, Chatting, Broadcasting and Foreign Language Learning Method using the Same |
KR20100033879A (en) * | 2008-09-22 | 2010-03-31 | 오의진 | Character inputting device |
KR20110061433A (en) * | 2009-12-01 | 2011-06-09 | 박철 | Method for inputting information of touch screen panal |
JP2011210002A (en) * | 2010-03-30 | 2011-10-20 | Yahoo Japan Corp | Mobile terminal, information processing system, display method, program, and information processing apparatus |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR101304461B1 (en) * | 2006-12-04 | 2013-09-04 | 삼성전자주식회사 | Method and apparatus of gesture-based user interface |
KR101799315B1 (en) * | 2011-09-26 | 2017-11-20 | 엘지전자 주식회사 | Method for operating an Image display apparatus |
- 2013-06-28 KR KR20130075195A patent/KR101511132B1/en active IP Right Grant
- 2014-06-30 WO PCT/KR2014/005790 patent/WO2014209079A1/en active Application Filing
Also Published As
Publication number | Publication date |
---|---|
WO2014209079A1 (en) | 2014-12-31 |
KR20150001990A (en) | 2015-01-07 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10795574B2 (en) | Input method, input apparatus, and terminal device | |
US11743213B2 (en) | User interfaces for messages | |
KR101375166B1 (en) | System and control method for character make-up | |
CN106687889B (en) | Display portable text entry and editing | |
KR101717805B1 (en) | Systems and methods for haptically-enhanced text interfaces | |
US9256366B2 (en) | Systems and methods for touch-based two-stage text input | |
ES2778863T3 (en) | Graphical user interface for a game system | |
US8893054B2 (en) | Devices, systems, and methods for conveying gesture commands | |
US9740400B2 (en) | Electronic device and method for character deletion | |
KR20190053207A (en) | Creation of messaging streams using animated objects | |
JP6139728B2 (en) | Chat room management method and terminal | |
US20170045917A1 (en) | Dual pivot mechanical hinge with discreet wiring | |
KR102053196B1 (en) | Mobile terminal and control method thereof | |
KR20210124950A (en) | System and method for terminal device control | |
KR101511132B1 (en) | Device and method for information processing providing letter and character | |
KR102204599B1 (en) | Method for outputting screen and display device for executing the same | |
CN104123070A (en) | Information processing method and electronic device | |
JP5704655B2 (en) | Display device and program | |
Lin et al. | Establishing interaction specifications for online-to-offline (O2O) service systems | |
JP5926755B2 (en) | Object display system for relationship graph | |
US20150262419A1 (en) | Stereoscopic 3D display model and mobile device user interface systems and methods | |
KR102308927B1 (en) | Method for outputting screen and display device for executing the same | |
CN109947511A (en) | Interactive interface determines method and device, electronic equipment and storage medium | |
US20150113398A1 (en) | Method for inputting characters, terminal, and recording medium | |
CN104423773B (en) | A kind of display methods and electronic equipment |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
A201 | Request for examination | ||
E902 | Notification of reason for refusal | ||
E90F | Notification of reason for final refusal | ||
E701 | Decision to grant or registration of patent right | ||
FPAY | Annual fee payment |
Payment date: 20180406 Year of fee payment: 4 |