KR20170093427A - Method and system for implementing a story keyboard for a social network service - Google Patents

Method and system for implementing a story keyboard for a social network service

Info

Publication number
KR20170093427A
KR20170093427A
Authority
KR
South Korea
Prior art keywords
story
image data
keyboard
information
code
Prior art date
Application number
KR1020160014801A
Other languages
Korean (ko)
Inventor
이명호
Original Assignee
이미지랩409(주)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 이미지랩409(주) filed Critical 이미지랩409(주)
Priority to KR1020160014801A priority Critical patent/KR20170093427A/en
Publication of KR20170093427A publication Critical patent/KR20170093427A/en

Links

Images

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/02 Input arrangements using manually operated switches, e.g. using keyboards or dials
    • G06F 3/023 Arrangements for converting discrete items of information into a coded form, e.g. arrangements for interpreting keyboard generated codes as alphanumeric codes, operand codes or instruction codes
    • G06F 3/0233 Character input methods
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q 50/00 Systems or methods specially adapted for specific business sectors, e.g. utilities or tourism
    • G06Q 50/30 Transportation; Communications

Abstract

Disclosed are a method and a system for implementing a story keyboard for a social network service. The method is performed by a user terminal and comprises: (a) driving an application program for executing a social network service; (b) receiving a command to execute story configuration data while the application program is running; (c) referring to previously stored correspondence setting information to recognize, among the image data previously stored in the user terminal, the image data matching the code identifier specified by each piece of code information included in the selected story configuration data, composing a story in which the recognized image data is inserted in place of the code information specifying the matched code identifier, and displaying the story on a story keyboard, which is a display area formed on one side of the execution screen of the application program; (d) when a touch at a position inside the story keyboard on which the story is displayed is sensed, generating selection information that is either the object (an image or text) corresponding to the touch position or instruction information including an identifier specifying the story and the position coordinate values in the story corresponding to the touch position; and (e) transmitting the selection information to a terminal device of a conversation partner via a communication network in response to a user command.

Description

[0001] The present invention relates to a method and system for implementing a story keyboard for a social network service.

More specifically, the present invention relates to a method and system for implementing a story keyboard for a social network service that enable a user to access various kinds of information through an instant messenger and to utilize individual display areas for various purposes.

Recently, a variety of social network services (SNS) have been provided. Among them, instant messengers such as Telegram, WhatsApp, KakaoTalk, and Naver LINE are the most widely used: users connected to a real-time conversation page carry on conversations by exchanging photographs, emoticons, and text.

As illustrated in FIGS. 1(a) and 1(b), the execution screen of an instant messenger generally includes a name display area 10 showing the conversation partner, an input area 15, and a display area 11 in which text or emoticons entered by the user and sent to the conversation partner, as well as text or emoticons received from the conversation partner, are displayed.

The input area 15 includes an emoticon display area 14 and a dialog window 12 in which text entered by the user, or an emoticon selected from the emoticon display area 14, is held in a transmission standby state. Naturally, the emoticon display area 14 can be switched to a keyboard presentation area, a virtual keyboard for text input, when the cursor is positioned in the dialog window 12 so that the user can enter text.

In recent years, the emoticons that can be transmitted and received through an instant messenger have been extended from still-image emoticons to moving-image emoticons, as illustrated in FIG. 1(b). When the user selects an arbitrary moving-image emoticon, the selected emoticon is displayed in a transmission standby state in a preview window 16 temporarily formed in the display area 11, so that the user can check in advance what action the selected moving-image emoticon will perform.

The user can transmit the text and/or emoticon entered in the dialog window 12, or the emoticon displayed in the preview window 16, to the conversation partner by pressing the transmission button.

However, the conventional instant messenger has only the input area 15 and the display area 11; that is, it is focused solely on the purpose of real-time conversation, and there is a limitation in that it cannot be utilized for allowing the user to access various kinds of information or for other purposes.

Also, in the conventional instant messenger, text can only be entered using the keyboard presentation area, and an emoticon can only be chosen from among the emoticons displayed at a small size in a lattice-like arrangement at the lower end of the execution screen.

This limitation makes it inconvenient for the user to select text or emoticons. In the case of keyboard selection for text input, the size and spacing of the selectable objects (for example, the characters or English letters listed in the keyboard presentation area) are so small that typos occur frequently. In the case of emoticons, because they must be arranged as small and simply as possible owing to the limited arrangement area, it is difficult to grasp the emoticon images in detail, and the feelings, emotions, and stories that the emoticons carry cannot be conveyed.

In addition, the emoticon gift-giving function of existing instant messengers is limited to merely transmitting emoticons to other users, so it is difficult to attract users, and it is also difficult to share emotions with other users even when emoticons are presented to them.

Korean Patent Publication No. 2009-0075397 (Method of transmitting and receiving communication messages including emoticons)

An object of the present invention is to provide a method and system for implementing a story keyboard for a social network service that enable an instant messenger to offer various additional functions in addition to a real-time messaging function.

Another object of the present invention is to provide a method and system for implementing a story keyboard for a social network service in which story configuration data can be reassembled and displayed differently for each user, using the image data stored in each terminal in correspondence with code information. Here, image data refers to data stored in a terminal equipped with a storage device, in the form of electronic data such as still-image or moving-image emoticons, pictures, and photographs.

Another object of the present invention is to provide a method and system for implementing a story keyboard for a social network service in which the story keyboard displayed in the execution screen of the instant messenger can be used at various sizes.

Another object of the present invention is to provide a method and system for implementing a story keyboard for a social network service that enable the story keyboard, which is the region in which a story is displayed, to be used also as a virtual input device.

Another object of the present invention is to provide a story keyboard implementation method and system for a social network service that can contribute to strengthening copyright protection for the image data reconstructed into a story and to promoting electronic commerce.

Another object of the present invention is to provide a story keyboard implementation method and system for a social network service that allow users to share emotions through empathy with a story by presenting story configuration data and image data to other users.

Other objects of the present invention will become readily apparent from the following description.

According to an aspect of the present invention, there is provided a story keyboard implementation method for a social network service performed by a user terminal, the method comprising: (a) driving an application program for executing a social network service; (b) receiving a command to execute story configuration data while the application program is running; and (c) recognizing, with reference to previously stored correspondence setting information, the image data corresponding to the code identifier specified in each piece of code information included in the selected story configuration data among the image data stored in advance in the user terminal, constructing a story in which the recognized image data is inserted in place of the code information specifying the corresponding code identifier, and displaying the story on a story keyboard, which is a display area formed on one side of the execution screen of the application program.

The step (c) may include: determining whether image data corresponding to each of the code identifiers is stored in the user terminal; and inserting a predetermined error image in place of any code information for which no matching image data is stored. Here, when a user selection of the error image is input, the user terminal may access a web page for purchasing the image data corresponding to that code information.

The code information may include a code identifier that identifies the image data to be inserted, and the image data corresponding to the code identifier may be inserted in place of the code information in the story configuration data. The code information may further include at least one of information about the insertion size of the image data corresponding to the code identifier, information about a rotation angle of the image data, and a characteristic value corresponding to the image data.

The story keyboard implementation method for a social network service may further comprise: when a touch at a position inside the story keyboard on which the story is displayed is detected, generating selection information that is either an object, namely text or an image matching the touch position, or instruction information including an identifier specifying the story and a position coordinate value in the story corresponding to the touch position; and transmitting the selection information through a communication network to a terminal device of a conversation partner specified by a user command.

The application program may be one or more of a real time instant messenger program and a blog service program.

The story configuration data may comprise a combination of text and the code information. One or more of the image data and the story configuration data may be transmitted at the request of a third party, for example as a gift, and stored in the user terminal. Also, one or more of the text and images included in the story displayed on the story keyboard may be enlarged or reduced by the user's touch operation on the execution screen.

According to another aspect of the present invention, there is provided a story keyboard implementation system for a social network service, comprising: an SNS service server for providing a social network service; a content service server for providing story configuration data and image data; and a user terminal that stores one or more pieces of story configuration data and one or more pieces of image data provided from the content service server and, when any story configuration data is selected by the user, recognizes, with reference to previously stored correspondence setting information, the image data corresponding to the code identifier specified by each piece of code information included in the selected story configuration data, constructs a story in which the recognized image data is inserted in place of the code information specifying the corresponding code identifier, and displays the story on a story keyboard.

The user terminal may replace, with a predetermined error image, any code information included in the selected story configuration data for which the matching image data is not stored, and, when a user selection of the error image is input, may access a web page for purchasing the matching image data.

The code information may include a code identifier that identifies the image data to be inserted, and the image data corresponding to the code identifier may be inserted in place of the code information in the story configuration data. The code information may further include at least one of information about the insertion size of the image data corresponding to the code identifier, information about a rotation angle of the image data, and a characteristic value corresponding to the image data.

When a touch at a position inside the story keyboard on which the story is displayed is detected, the user terminal may generate selection information that is either an object, namely the text or image corresponding to the touch position, or instruction information including an identifier specifying the story and a position coordinate value in the story corresponding to the touch position, and may transmit the selection information through a communication network to the terminal device of a conversation partner designated by a user command.

The story keyboard may be formed on one side of an execution screen of a social network service program, which is one or more of a real-time instant messenger program and a blog service program driven by the user terminal. One or more of the image data and the story configuration data may be transmitted at the request of a third party, for example as a gift, and stored in the user terminal. Also, one or more of the text and images included in the story displayed on the story keyboard may be enlarged or reduced by the user's touch operation on the execution screen.

Other aspects, features, and advantages will become apparent from the following drawings, claims, and detailed description of the invention.

According to the embodiments of the present invention, the instant messenger is given a function of executing a story keyboard (for example, an e-book execution area) that presents a story reconstructed from story configuration data such as a picture book, a comic book, a newspaper article, or an advertisement, so that the user of the instant messenger can access a chat room for real-time conversation with a chat partner, or can execute the story keyboard to read a story even without accessing any chat room.

In addition, when a story reconstructed on the basis of story configuration data stored in the terminal or received via the communication network is displayed on the story keyboard, the story is formed by inserting the image data stored in the terminal in correspondence with the code information embedded in the story configuration data. For example, if the same code identifier is assigned to different image data in each of a plurality of terminals, a story recombined with different images can be displayed through each story keyboard even though the story is reconstructed from the same story configuration data. In this case, although the same story is displayed, terminal A shows the image of character a as the main character while terminal B shows the image of character b as the main character.

In addition, since the size of the story keyboard displayed in the execution screen of the instant messenger can be adjusted in various ways, the user can browse the story on a story keyboard of the desired size. The user can therefore select objects conveniently and grasp an image in a shorter time than with the emoticons displayed in the conventional emoticon display area, and can also make out finer detail when the displayed image is a detailed picture or photograph.

Furthermore, when any object (for example, a character, a word, a sentence, a picture, a photograph, or an emoticon) constituting a story displayed on the story keyboard is touched, the touched object enters a transmission standby state, so the selected object can conveniently be transmitted to the conversation partner simply by pressing the transmission button.

If the image data to be inserted into the story in correspondence with the code information embedded in the story configuration data is not stored in the terminal, the missing image data is replaced with an error image (for example, a rectangular icon or an X-shaped icon), and the corresponding image data can be purchased by touching the error image, thereby contributing to copyright protection and to the promotion of electronic commerce.

In addition, since story configuration data and image data can be presented to other users as gifts, emotions can be shared with other users through empathy with the story.

FIG. 1 is a diagram illustrating an execution screen of an instant messenger according to the related art.
FIG. 2 is a diagram schematically illustrating a social network service system according to an embodiment of the present invention.
FIG. 3 is a block diagram schematically illustrating the configuration of a user terminal according to an embodiment of the present invention.
FIG. 4 is a diagram illustrating the configuration of correspondence setting information between code identifiers and image data according to an embodiment of the present invention.
FIG. 5 is a diagram for explaining a story reconstruction process according to an embodiment of the present invention.
FIG. 6 is a flowchart illustrating a method of implementing a story keyboard according to an embodiment of the present invention.
FIG. 7 is a diagram illustrating a story display state using a story keyboard according to an embodiment of the present invention.
FIG. 8 is a flowchart illustrating a method of transmitting an object selected in a story keyboard according to an embodiment of the present invention.
FIG. 9 is a diagram for explaining an object selection process using a story keyboard according to an embodiment of the present invention.
FIG. 10 is a diagram illustrating an emoticon collection in a story display state of a story keyboard according to an embodiment of the present invention.
FIG. 11 is a diagram illustrating a story and emoticon gift function in a story keyboard according to an embodiment of the present invention.

While the invention is susceptible to various modifications and alternative forms, specific embodiments thereof are shown by way of example in the drawings and will herein be described in detail. It is to be understood, however, that the invention is not to be limited to the specific embodiments, but includes all modifications, equivalents, and alternatives falling within the spirit and scope of the invention.

It is to be understood that when an element is referred to as being "connected" or "coupled" to another element, it may be directly connected or coupled to that element, or intervening elements may be present. On the other hand, when an element is referred to as being "directly connected" or "directly coupled" to another element, it should be understood that no other elements are present in between.

The terms first, second, etc. may be used to describe various components, but the components should not be limited by the terms. The terms are used only for the purpose of distinguishing one component from another.

The terminology used herein is for the purpose of describing particular embodiments only and is not intended to limit the invention. Singular expressions include plural expressions unless the context clearly dictates otherwise. In this specification, terms such as "comprises" or "having" specify the presence of stated features, integers, steps, operations, elements, components, or combinations thereof, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, or combinations thereof.

Also, terms such as "part," "unit," and "module" used herein refer to a unit that processes at least one function or operation, and such a unit may be implemented in hardware, in software, or in a combination of hardware and software.

The components described in the embodiments with reference to the drawings are not limited to those embodiments and may be included in other embodiments without departing from the spirit of the invention, and, even where a separate description is omitted, a plurality of embodiments may also be implemented as one integrated embodiment.

Hereinafter, preferred embodiments of the present invention will be described in detail with reference to the accompanying drawings, in which like reference numerals refer to like elements throughout. In the following description, well-known functions or constructions are not described in detail, since doing so would obscure the invention with unnecessary detail.

FIG. 2 is a schematic diagram of a social network service system according to an embodiment of the present invention, and FIG. 3 is a block diagram schematically showing a configuration of a user terminal according to an embodiment of the present invention. FIG. 4 is a view illustrating a configuration of corresponding setting information between a code identifier and image data according to an embodiment of the present invention. FIG. 5 is a view for explaining a story reconstruction process according to an embodiment of the present invention.

Referring to FIG. 2, the social network system may include a user terminal 110, a server group 120, and a counterpart terminal 130.

The user terminal 110 and the counterpart terminal 130 are communication terminals that can be connected to one or more service servers included in the server group 120 through a communication network. For example, smart phones, tablet PCs, notebook computers, personal computers, and the like, each having an application program downloading and driving function and an Internet access function, may be included.

The server group 120 collectively refers to service servers for performing a normal social network service, for example, an instant messaging service and the like.

The server group 120 may include a content service server 122 for providing story configuration data, image data, and the like, which will be described later, an SNS service server 124 for providing an application program for a social network service and related services, and so on. In addition, any other server that the communication terminals access in order to use the Internet and various communication services may also be included.

Hereinafter, for convenience of explanation, an example will be described in which the SNS service server 124 is an instant messaging service server capable of providing real-time conversation by simultaneously connecting the user terminal 110 and the counterpart terminal 130. Of course, the scope of application of the present invention is not limited to the instant messaging service, and it is obvious that the present invention can be applied to various SNS services, including blog services such as Facebook.

Also, the case where the content service server 122 and the SNS service server 124 are separately implemented will be described as an example, but it is needless to say that they may be implemented to provide a plurality of services simultaneously in one server.

The user terminal 110 can receive a content such as story configuration data, image data, and the like by accessing the content service server 122 by driving a predetermined application program. Also, the user terminal 110 can perform a real-time conversation with the counterpart terminal 130 connected to the SNS service server 124 by driving a predetermined application program.

At this time, the application program for connecting to the content service server 122 and the application program for connecting to the SNS service server 124 may be implemented as a single integrated program; alternatively, the individually implemented application programs may invoke one another so as to be executed as if they were integrated.

Referring to FIG. 3, in which the configuration of the user terminal 110 is schematically shown as a block diagram, the user terminal 110 may include a hardware unit 210 and a software unit 220. Although the user terminal 110 and the counterpart terminal 130 are referred to by different names in FIG. 2 for the sake of clarity, the counterpart terminal 130 may have the same configuration as the user terminal 110 illustrated in FIG. 3.

The hardware unit 210 may include a communication unit 212, a storage unit 214, and a touch screen unit 216.

The communication unit 212 functions to connect the user terminal 110 to the content service server 122 and/or the SNS service server 124 through a communication network. Using the network connection function provided by the communication unit 212, the user terminal 110 can receive story configuration data and/or image data from the connected content service server 122, and can receive a real-time conversation service with a conversation partner from the connected SNS service server 124.

The storage unit 214 stores at least one of an operating program for driving the user terminal 110, image data, story configuration data, program data to be driven by the software unit 220, and the like.

As illustrated in FIG. 4, each piece of image data may be stored in the storage unit 214 so as to correspond to a pre-designated code identifier 410, and the relationship between the image data and the code identifiers 410 may be managed separately as correspondence setting information. The code identifier 410 may be assigned, for example, by the content service server 122 that provides the image data, and whether the user may modify or deactivate the code identifier 410 corresponding to particular image data may be predefined.
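
For illustration, the correspondence setting information of FIG. 4 can be thought of as a lookup table from code identifiers to locally stored image data. The following minimal Kotlin sketch assumes a simple in-memory map; the identifier values and file paths are hypothetical and are not taken from the patent.

```kotlin
// Minimal sketch of the correspondence setting information (FIG. 4):
// each pre-designated code identifier 410 maps to image data stored on the terminal.
// Identifiers and file paths below are illustrative assumptions only.
class CorrespondenceSettings {
    private val table = mutableMapOf<String, String>()   // code identifier -> local image path

    // Register (or overwrite) the image data assigned to a code identifier.
    fun register(codeIdentifier: String, imagePath: String) {
        table[codeIdentifier] = imagePath
    }

    // Returns the stored image path, or null when no matching image data exists
    // (the caller then falls back to the predetermined error image 910).
    fun lookup(codeIdentifier: String): String? = table[codeIdentifier]
}

fun main() {
    val settings = CorrespondenceSettings()
    settings.register("0x1006", "/storage/emoticons/main_character.png")
    println(settings.lookup("0x1006"))   // /storage/emoticons/main_character.png
    println(settings.lookup("0x2001"))   // null -> not stored, show the error image instead
}
```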

Referring again to FIG. 3, the storage unit 214 may be divided into a permanent memory area, in which data is stored and held until the user performs an active deletion operation, and a temporary memory area, in which data for driving an application program is temporarily stored and then deleted. Of course, temporary files that are created for temporary use while the Internet or application programs are running in the temporary memory area, but are not automatically deleted, may also be stored and maintained.

The touch screen unit 216 is a display device with a built-in virtual input device that receives a position when it is contacted (i.e., touched) by a hand or a stylus pen. When a certain area is touched, the touch screen unit 216 may provide information on the touch position to the software unit 220.

The software unit 220 is a conceptual, virtual unit representing the area in which an application program (for example, an instant messaging application or a music playback application) operates by running program data stored in the storage unit 214. Hereinafter, for convenience of explanation and understanding, the case where an instant messaging application is driven by the software unit 220 will be described as an example.

The software unit 220 may include an input module 222, a story keyboard processing module 224, and a display module 226.

The input module 222 displays, on one side of the touch screen unit 216, a virtual input area 15 (see FIG. 1) for entering text or selecting an image such as an emoticon to be sent to the conversation partner in order to conduct a real-time conversation. The text entered and/or the image selected using the input area 15 displayed by the input module 222 are transmitted to the counterpart terminal 130 via the communication unit 212 in response to the user's operation.

The display module 226 outputs, to the display area 11 formed on one side of the touch screen unit 216, the text and/or images transmitted to the counterpart terminal 130 and the text and/or images received from the counterpart terminal 130.

The story keyboard processing module 224 outputs the story keyboard 720 (see FIG. 7) to one side of the display area of the touch screen unit 216, and outputs, within the story keyboard, a story reconstructed on the basis of the story configuration data selected by the user.

The story configuration data is illustrated in FIG. 5(b); it is obtained by replacing specific portions of the story data illustrated in FIG. 5(a) with respective pieces of code information.

With reference to the correspondence setting information stored in advance (see FIG. 4), the story keyboard processing module 224 recognizes the code identifier 410 specified by each piece of code information 520 (see FIG. 5) included in the story configuration data selected by the user, reconstructs the story as shown in FIG. 5(c) by replacing each piece of code information with the image data stored in the user terminal 110 that corresponds to that code identifier 410, and outputs the reconstructed story within the area defined by the story keyboard 720.

FIG. 5(b) shows an example, [[0x1006]], in which the code information 520 includes only the code identifier 410; however, the code information 520 may further include a scale value for the image data to be inserted in its place. In this case, the story keyboard processing module 224 may recognize the image data corresponding to the code identifier 410 and then insert the recognized image data at the corresponding position at a size corresponding to the designated scale value.

The code information 520 may further include a rotation angle value. In this case, the story keyboard processing module 224 may rotate the image data by the rotation angle value when inserting it in place of the code information.

In addition, the code information 520 may further include characteristic values for adjusting, in various ways, how the image that replaces it is presented (for example, characteristic values for implementing a flickering effect, a sound effect, or a vibration effect), and more varied effects can thus be given when a story is reconstructed from story configuration data using such code information.
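
The paragraphs above describe code information that carries a code identifier plus optional scale, rotation-angle, and characteristic (effect) values. The sketch below shows one way such tokens could be parsed and substituted during story reconstruction, with the predetermined error image as the fallback for identifiers whose image data is not stored. The [[...]] token syntax, the attribute names (scale, rotate, effect), and the placeholder markup are assumptions made for this example; the patent does not define a concrete encoding.

```kotlin
// Hypothetical textual encoding of code information 520, e.g.:
//   "Scrooge opened the door. [[0x1006 scale=1.5 rotate=90 effect=blink]]"
// Only the code identifier is required; scale, rotation and effect are optional.
data class CodeInfo(
    val identifier: String,
    val scale: Double = 1.0,
    val rotationDegrees: Int = 0,
    val effect: String? = null          // e.g. "blink", "sound", "vibration"
)

private val CODE_TOKEN = Regex("""\[\[([^\]\s]+)([^\]]*)]]""")

fun parseCodeInfo(identifier: String, attributes: String): CodeInfo {
    var info = CodeInfo(identifier)
    for (attr in attributes.trim().split(' ').filter { it.isNotEmpty() }) {
        val parts = attr.split('=', limit = 2)
        val value = parts.getOrElse(1) { "" }
        info = when (parts[0]) {
            "scale"  -> info.copy(scale = value.toDoubleOrNull() ?: 1.0)
            "rotate" -> info.copy(rotationDegrees = value.toIntOrNull() ?: 0)
            "effect" -> info.copy(effect = value)
            else     -> info
        }
    }
    return info
}

// Reconstructs a story: each code-information token is replaced by the image data
// registered for its code identifier; unknown identifiers become the error image 910.
fun reconstructStory(storyConfigurationData: String, images: Map<String, String>): String =
    CODE_TOKEN.replace(storyConfigurationData) { match ->
        val info = parseCodeInfo(match.groupValues[1], match.groupValues[2])
        val path = images[info.identifier]
        if (path == null)
            "<img src=\"error_image.png\" data-missing-code=\"${info.identifier}\"/>"
        else
            "<img src=\"$path\" data-scale=\"${info.scale}\" data-rotate=\"${info.rotationDegrees}\" data-effect=\"${info.effect ?: ""}\"/>"
    }
```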

For example, an image in which a characteristic value for a blinking effect is set may be displayed in a blinking shape in a story.

As another example, while the display contents of the story keyboard 720 are being scrolled upward or downward to browse the story, when an image for which a characteristic value for a sound effect or a vibration effect is set reaches a predetermined position (for example, a certain height within the story keyboard 720), the designated sound may be output through the speaker of the user terminal 110, or a vibration of a predetermined pattern may be generated by the user terminal 110.

In addition, the story configuration data illustrated in FIG. 5(b) exemplifies a form in which the text and the code information 520 are mixed within the same paragraph. However, the story configuration data may also be configured to form a story in which text and images are arranged individually at independent locations, as in a children's picture book, or in which text and images follow no particular rule, as in a comic book or a photo album; text and images can thus be arranged in various structures.

In reconstructing the story using the story configuration data and the image data previously stored in the user terminal 110, if image data corresponding to a piece of code information included in the story configuration data is not stored in the user terminal 110, the story keyboard processing module 224 may insert a predetermined error image 910 (see FIG. 9) in place of the corresponding image.

The size of the story keyboard 720 displayed on the touch screen unit 216 by the story keyboard processing module 224 may be defined as a part of the display screen area, as illustrated in FIG. 7(b), or as substantially all of the display screen area, as illustrated in FIG. 7(c). The displayed size of the story keyboard 720 may be adjusted by the user, for example through selection of a predetermined function button.

The story keyboard processing module 224 can change the display form of the story shown in the story keyboard 720 in various ways using the touch information provided from the touch screen unit 216.

For example, if two of the user's fingers remain touched and are spread apart, the text and images that make up the story can be enlarged as a whole; if the spacing between the fingers is narrowed, the whole can be reduced so that more content is displayed within the story keyboard 720.

As another example, if three of the user's fingers remain touched and the distance between them is increased, only the text constituting the story may be enlarged and displayed, so that the text is easier to read and can be used for learning; if the distance between the fingers is narrowed, only the text constituting the story may be reduced in size.

As yet another example, if four of the user's fingers remain touched and the distance between them is increased, only the images constituting the story may be enlarged and displayed, so that an effect similar to appreciating the pictures on their own can be obtained; when the distance between the fingers is narrowed, only the images constituting the story may be reduced in size.

In addition, it is of course possible to scroll the story by dragging a finger upward or downward while it is touching an arbitrary position inside the story keyboard 720, without using the scroll bar displayed on one side of the story keyboard 720.
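
A schematic sketch of the gesture handling just described: the number of fingers that remain touched selects what a pinch gesture scales, and a one-finger drag scrolls the story. The finger-count mapping follows the paragraphs above; the class and function names, and the idea of plain scale factors, are illustrative assumptions.

```kotlin
// Sketch of the touch-gesture handling described above: the number of fingers that stay
// touched selects what a pinch scales; a one-finger drag scrolls the story instead.
enum class ZoomTarget { WHOLE_STORY, TEXT_ONLY, IMAGES_ONLY }

data class StoryView(var textScale: Double = 1.0, var imageScale: Double = 1.0)

fun targetFor(fingerCount: Int): ZoomTarget? = when (fingerCount) {
    2 -> ZoomTarget.WHOLE_STORY    // text and images enlarged or reduced together
    3 -> ZoomTarget.TEXT_ONLY      // easier reading, e.g. for learning
    4 -> ZoomTarget.IMAGES_ONLY    // closer appreciation of pictures
    else -> null                   // e.g. one finger: drag to scroll instead of zooming
}

fun applyPinch(view: StoryView, fingerCount: Int, spreadFactor: Double) {
    when (targetFor(fingerCount)) {
        ZoomTarget.WHOLE_STORY -> { view.textScale *= spreadFactor; view.imageScale *= spreadFactor }
        ZoomTarget.TEXT_ONLY   -> view.textScale *= spreadFactor
        ZoomTarget.IMAGES_ONLY -> view.imageScale *= spreadFactor
        null -> Unit
    }
}
```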

Referring to FIG. 10, part of the story configuration data may include an area 1010 in which the emoticons available to the user of the instant messaging service are displayed together. This can be implemented by arranging the code information 520 corresponding to each of the displayable emoticons according to a predetermined rule; using this area, the user can grasp the needed emoticons at a glance and can easily select and use them.
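
Following the same hypothetical [[...]] token syntax used in the story-reconstruction sketch above, the emoticon display area 1010 could be generated simply by arranging one code-information token per available emoticon according to a fixed rule; the row-major grid below is an assumed rule used only for illustration.

```kotlin
// Builds the emoticon area 1010 as story-configuration text: one code-information
// token per emoticon, laid out in a simple grid (the grid rule is an assumption).
fun buildEmoticonArea(codeIdentifiers: List<String>, columns: Int = 4): String =
    codeIdentifiers
        .chunked(columns)
        .joinToString("\n") { row -> row.joinToString(" ") { id -> "[[$id]]" } }

// Example: buildEmoticonArea(listOf("0x1006", "0x1007", "0x1008", "0x1009", "0x100A"))
// yields two lines of tokens that the story keyboard then replaces with emoticon images.
```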

FIG. 6 is a flowchart illustrating a method of implementing a story keyboard according to an embodiment of the present invention, and FIG. 7 is a view illustrating a story display state using a story keyboard according to an embodiment of the present invention.

Referring to FIG. 6, at step 610 one piece of story configuration data is selected by the user and recognized by the story keyboard processing module 224.

As illustrated in FIG. 7(a), when a story function button 710 arranged on one side of the execution screen of the instant messaging application program is touched, a list of selectable story configuration data is output to one side of the execution screen, and the user can select one piece of story configuration data by touching it.

Referring back to FIG. 6, in step 620 the story keyboard processing module 224 recognizes, among the image data stored in the user terminal 110, the image data corresponding to the code identifier 410 specified in each piece of code information included in the selected story configuration data. As described above, the story configuration data may be composed of text and code information, or, if necessary, of code information only.

In step 630, the story keyboard processing module 224 reconstructs the story by replacing each piece of code information included in the story configuration data with the corresponding image data, and outputs the reconstructed story to the story keyboard 720 formed on one side of the execution screen.

FIG. 7(b) shows a display example in which the story configuration data selected by the user has been reconstructed as a story and output to the story keyboard 720. The story keyboard 720 may be resized, as illustrated in FIG. 7(c), by touching a predetermined function button or by touching and dragging the boundary of the story keyboard 720 displayed on the screen.

FIGS. 7(a) to 7(c) illustrate a case in which the story keyboard 720 is executed while the user is connected to a chat room with the chat partner "Hong Kil-dong". However, it is natural that the story keyboard 720 can also be executed in a state in which the user is not connected to any chat room (for example, in the basic execution screen of the application program).

FIG. 7 shows a case in which 'A Christmas Carol' is selected from three pieces of story configuration data corresponding to three books, and a story in which text and images are mixed within the same paragraph is displayed. However, the story that can be displayed may take various forms, such as a picture book or comic book composed only of image data, or an advertisement in which text and images are arranged independently in their respective areas.

FIG. 8 is a flowchart illustrating a method of transmitting an object selected in a story keyboard according to an embodiment of the present invention, and FIG. 9 illustrates an object selection process using a story keyboard according to an embodiment of the present invention.

The story displayed in the story keyboard 720 described with reference to FIGS. 8 and 9 may be a story reconstructed on the basis of story configuration data as described above, or an e-book file (for example, a PDF file, a word-processor document file, or an image file) downloaded through a communication network so as to be executed in the terminal may be displayed and executed instead.

FIG. 9 shows a case in which the story keyboard 720 is executed while the user is connected to a dialog window for a real-time conversation with a specific conversation partner; however, as described above, the story keyboard 720 can also be executed when no dialog window is open.

Referring to FIG. 8, at step 810 the story keyboard processing module 224 recognizes that the user has touched a position within the area of the story keyboard 720. By comparing the touch position information provided from the touch screen unit 216 with the range of coordinate values representing the area of the story keyboard 720, the story keyboard processing module 224 can determine whether the user has touched the area defined by the story keyboard 720 and, if so, where within that area the touch occurred.

In step 820, the story keyboard processing module 224 specifies selection information corresponding to the touch position. Here, the selection information may be the object (for example, image data or text) matching the touch position, as illustrated in FIG. 9(a), or instruction information that includes an identifier specifying the story configuration data (or e-book file, hereinafter the same) and the position coordinate value within that story configuration data corresponding to the touch position.

Then, in step 830, the story keyboard processing module 224 enters a transmission standby state for transmitting the selection information corresponding to the touch position to the counterpart terminal 130, and when a transmission command (for example, a touch on the 'Send' button) is input, the selection information is transmitted to the counterpart terminal 130 through the communication unit 212. If the selection information is a text object, for example, the object itself may be transmitted, or an ASCII code value, a Unicode code value, or the like corresponding to the object may be transmitted instead.
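
Steps 810 to 830 can be summarized, under assumed names and types, as: hit-test the touch against the story keyboard area, hold the resulting selection information in a transmission standby state, and send it when the transmission command arrives. Whether the object itself or instruction information (story identifier plus position coordinates) is produced is left to configuration here; this is an illustrative sketch, not the patent's actual implementation.

```kotlin
// Sketch of steps 810-830: hit-test a touch against the story keyboard area, hold the
// resulting selection information in a transmission standby state, and send it on command.
// Class and field names are assumptions; coordinate hit-testing is left abstract.
sealed class SelectionInfo {
    data class TouchedObject(val payload: String) : SelectionInfo()                 // text or image reference
    data class Instruction(val storyId: String, val x: Int, val y: Int) : SelectionInfo()
}

class StoryKeyboardArea(
    private val storyId: String,
    private val left: Int, private val top: Int,
    private val right: Int, private val bottom: Int,
    private val sendAsInstruction: Boolean = false,          // which form of selection info to produce
    private val objectAt: (x: Int, y: Int) -> String?        // hit test into the reconstructed story
) {
    var pending: SelectionInfo? = null                        // transmission standby state (step 830)
        private set

    // Steps 810-820: recognize a touch inside the keyboard area and specify selection info.
    fun onTouch(x: Int, y: Int): Boolean {
        if (x !in left..right || y !in top..bottom) return false      // touch is outside the story keyboard
        pending = if (sendAsInstruction)
            SelectionInfo.Instruction(storyId, x - left, y - top)
        else
            objectAt(x, y)?.let { SelectionInfo.TouchedObject(it) }
        return true
    }

    // Step 830: the transmission command (e.g. the 'Send' button) sends whatever is waiting.
    fun onSendCommand(transmit: (SelectionInfo) -> Unit) {
        pending?.let(transmit)
        pending = null
    }
}
```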

As described above, the story keyboard 720 according to the present embodiment is used not only as a space for browsing a story but also as a virtual input device: by touching an object included in the story displayed in that space, the user can select it and transmit it to a designated conversation partner.

In this way, image data such as emoticons can be laid out freely within the space defined by the story keyboard 720, rather than in the conventional lattice arrangement of uniform size, so that the user can touch them accurately. In the case of Korean character input, there is also the advantage that a completed syllable can be transmitted as an object simply by touching that syllable within a displayed word, without having to compose it from its individual jamo.

Referring to FIG. 9, when the user touches the image data 420, which is an object, as illustrated in FIG. 9(a), the object is placed in a transmission standby state as shown in FIG. 9(b) and is transmitted to the counterpart terminal 130 upon the user's transmission command.

As another example, when the user touches the text 510 'about', which is another object, that object enters the transmission standby state as shown in FIG. 9(c) and is transmitted to the counterpart terminal 130 upon the user's transmission command. At this time, the touch may be set to be recognized as input of the word unit corresponding to the touch position (for example, 'about'), or it may instead be set to be recognized as input of the character unit corresponding to the touch position (for example, the letter 'a' in 'about').

Also, depending on the settings, the length of the text to be recognized may be selected according to the duration of the touch. For example, a touch of 0.5 seconds may be set to select one character, a touch of 1 second one word, and a touch of 2 seconds or more one sentence.
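
The character/word/sentence selection rule based on touch duration could be expressed as a simple threshold mapping; the thresholds follow the example values in the paragraph above, and the names are assumptions.

```kotlin
// Selection granularity as a function of touch duration, using the example thresholds above
// (0.5 s -> one character, 1 s -> one word, 2 s or more -> one sentence).
enum class TextGranularity { CHARACTER, WORD, SENTENCE }

fun granularityFor(touchDurationSeconds: Double): TextGranularity = when {
    touchDurationSeconds >= 2.0 -> TextGranularity.SENTENCE
    touchDurationSeconds >= 1.0 -> TextGranularity.WORD
    else -> TextGranularity.CHARACTER
}
```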

The user can also touch a character and drag along the sentence to select a desired sentence.

On the other hand, when the instruction information held in the transmission standby state is transmitted to the counterpart terminal 130, the counterpart terminal 130 specifies the story configuration data using the identifier of the story configuration data included in the instruction information, reads out the text or code information corresponding to the position coordinate value within that story configuration data, and outputs the corresponding text or image data to its execution screen. That is, when the instruction information is transmitted to the counterpart terminal 130 and the corresponding story configuration data is stored in the counterpart terminal 130, the same object as the object touched on the user terminal 110 can be output on the counterpart terminal 130.

If the story configuration data and/or the image data are not stored in the counterpart terminal 130, the object specified by the instruction information cannot be identified in the counterpart terminal 130 and may not be displayed properly. To prevent this, when the instruction information is transmitted to the counterpart terminal 130, the story configuration data and/or image data needed to reconstruct the story may also be transmitted to the counterpart terminal 130 using the gift function described above.
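
On the receiving side, the counterpart terminal would resolve instruction information against its own copy of the story configuration data; when that data is not stored, the object cannot be reproduced, which is what the gift function is meant to avoid. The sketch below assumes hypothetical names and leaves the coordinate-to-object lookup abstract, since the patent does not specify it.

```kotlin
// Receiver-side sketch: the counterpart terminal resolves instruction information against its
// locally stored story configuration data. Names are assumptions made for illustration.
data class InstructionInfo(val storyId: String, val x: Int, val y: Int)

class CounterpartResolver(
    private val localStories: Map<String, String>,                       // storyId -> story configuration data
    private val objectAt: (storyData: String, x: Int, y: Int) -> String?
) {
    fun resolve(info: InstructionInfo): String {
        val storyData = localStories[info.storyId]
            ?: return "[story ${info.storyId} is not stored; it can be received via the gift function]"
        return objectAt(storyData, info.x, info.y)
            ?: "[no object found at (${info.x}, ${info.y})]"
    }
}
```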

When the story keyboard processing module 224 reconstructs the story based on the story configuration data, if the image data corresponding to a code identifier 410 is not yet stored in the storage unit 214, an error image 910 is output in place of the image data corresponding to that code identifier 410.

In this case, when the user touches the displayed error image 910, the story keyboard processing module 224 connects to the content service server 122 through the communication network so that the image data corresponding to the code identifier 410 can be purchased. For this purpose, the story keyboard processing module 224 may invoke a separately implemented application program capable of connecting to the content service server 122.

For example, if each piece of image data included in the story configuration data of an entertainer's photo album is configured to be purchased individually, then when the user displays the story corresponding to that story configuration data on the story keyboard 720, the pieces of image data that have not yet been purchased are not stored in the storage unit 214 and are therefore each replaced by an error image 910 when displayed.

In this situation, by touching a specific error image 910, the user can purchase the corresponding image data from the content service server 122 connected via the communication network, or download it free of charge. Once the image data has been downloaded and stored in the storage unit 214, the user can view the story with the error image replaced by the appropriate image data.
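
The purchase path triggered by touching an error image could look roughly like the sketch below, where the missing code identifier is handed to a purchase page on the content service server 122. The server address and query format are hypothetical placeholders, not part of the patent.

```kotlin
// Sketch of the purchase path triggered by touching an error image 910.
// The server address and query format are hypothetical placeholders.
const val CONTENT_SERVICE_SERVER = "https://content-service.example.com"

fun purchaseUrlFor(codeIdentifier: String): String =
    "$CONTENT_SERVICE_SERVER/purchase?code=$codeIdentifier"

// Called when the user touches the error image shown for a missing code identifier;
// opening the page lets the user buy (or download free of charge) the matching image data.
fun onErrorImageTouched(codeIdentifier: String, openWebPage: (String) -> Unit) {
    openWebPage(purchaseUrlFor(codeIdentifier))
}
```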

FIG. 11 illustrates a screen on which the story configuration data and/or image data for forming a story to be displayed through the story keyboard 720, formed on one side of the display screen of the user terminal 110, can be presented to another user by using a gift function 1110.

As described above, the story keyboard implementation method for a social network service according to the present embodiment may be implemented as a software program or the like. The code and code segments constituting the program can easily be deduced by a computer programmer in the field. The program is stored on a computer-readable medium and, when read and executed by a computer, implements the method. The information storage medium includes magnetic recording media, optical recording media, and carrier wave media.

It will be apparent to those skilled in the art that various modifications and variations can be made in the present invention without departing from the spirit and scope of the invention as defined in the following claims.

110: user terminal 120: server group
130: counterpart terminal 122: content service server
124: SNS service server 210: hardware unit
212: communication unit 214: storage unit
216: touch screen unit 220: software unit
222: input module 224: story keyboard processing module
226: Display module 720: Story keyboard

Claims (20)

1. A story keyboard implementation method for a social network service performed by a user terminal,
(a) driving an application program for executing a social network service;
(b) receiving execution of story configuration data in an execution state of the application program;
(c) recognizing, with reference to previously stored correspondence setting information, image data corresponding to the code identifier specified in each piece of code information included in the selected story configuration data among the image data stored in advance in the user terminal, constructing a story in which the recognized image data is inserted in place of the code information specifying the corresponding code identifier, and displaying the story on a story keyboard, which is a display area formed on one side of the execution screen of the application program;
(d) when a touch at a position inside the story keyboard on which the story is displayed is detected, generating selection information that is either an object, namely text or an image corresponding to the touch position, or instruction information including an identifier specifying the story and a position coordinate value in the story corresponding to the touch position; and
(e) transmitting the selection information through a communication network to a terminal device of a conversation partner by a user command.
The method according to claim 1,
Wherein the step (c) comprises:
determining whether image data corresponding to each of the code identifiers is stored in the user terminal; and
inserting a predetermined error image in place of any code information for which the matching image data is not stored,
and wherein, when a user selection of the error image is input, the user terminal accesses a web page for purchasing the image data corresponding to the code information.
The method according to claim 1,
Wherein the code information includes a code identifier that identifies image data to be inserted,
Wherein the image data corresponding to the code identifier is inserted in place of the code information in the story configuration data.
The method of claim 3,
Wherein the code information further includes at least one of information about an insertion size of the image data corresponding to the code identifier, information about a rotation angle of the image data, and a characteristic value corresponding to the image data.
The method according to claim 1,
Further comprising: when a touch at a position inside the story keyboard on which the story is displayed is detected, generating selection information that is either an object, namely text or an image corresponding to the touch position, or instruction information including an identifier specifying the story and a position coordinate value in the story corresponding to the touch position; and
transmitting the selection information via a communication network to a terminal device of a conversation partner specified by a user command.
The method according to claim 1,
Wherein the application program is at least one of a real-time messenger program and a blog service program.
The method according to claim 1,
Wherein the story configuration data comprises a combination of text and the code information.
The method according to claim 1,
Wherein the story displayed in the step (c) is a story displayed by execution of an e-book file.
The method according to claim 1,
Wherein, when the instruction information is transmitted to the terminal device of the conversation partner as the selection information, the terminal device of the conversation partner specifies the story corresponding to the identifier, identifies the object corresponding to the position coordinate value in that story, and outputs the object on its screen.
The method according to claim 1,
Wherein at least one of the image data and the story configuration data is transmitted at the request of a third party and stored in the user terminal.
The method according to claim 1,
Wherein at least one of the text and images included in the story displayed on the story keyboard is enlarged or reduced by a touch operation of the user on the execution screen.
12. A recording medium on which a program readable by a digital processing apparatus is recorded for performing a story keyboard implementation method for a social network service according to any one of claims 1 to 11.
A story keyboard implementation system for a social network service,
An SNS service server providing a social network service;
A content service server for providing story configuration data and image data; And
a user terminal that stores one or more pieces of story configuration data and one or more pieces of image data provided from the content service server and, when arbitrary story configuration data is selected by the user, recognizes, with reference to previously stored correspondence setting information, the image data corresponding to the code identifier specified by each piece of code information included in the selected story configuration data, constructs a story in which the recognized image data is inserted in place of the code information specifying the corresponding code identifier, and displays the story on a story keyboard formed on one side of an execution screen.
14. The system of claim 13,
Wherein the user terminal:
replaces, with a predetermined error image, any code information included in the selected story configuration data for which the matching image data is not stored, and
accesses, when a user selection of the error image is input, a web page for purchasing the image data corresponding to the code information.
15. The system of claim 13,
Wherein the code information includes a code identifier that identifies image data to be inserted,
Wherein the image data corresponding to the code identifier is inserted in place of the code information in the story configuration data.
16. The system of claim 15,
Wherein the code information further includes at least one of information about an insertion size of the image data corresponding to the code identifier, information about a rotation angle of the image data, and a characteristic value corresponding to the image data.
17. The system of claim 13,
Wherein the user terminal:
generates, when a touch at a position inside the story keyboard on which the story is displayed is detected, selection information that is either an object, namely text or an image corresponding to the touch position, or instruction information including an identifier specifying the story and a position coordinate value in the story corresponding to the touch position, and
transmits the selection information through a communication network to a terminal device of a conversation partner specified by a user command.
18. The system of claim 13,
Wherein the story keyboard is formed at one side of an execution screen of a social network service program that is at least one of a real time instant messenger program and a blog service program driven by the user terminal.
19. The system of claim 13,
Wherein at least one of the image data and the story configuration data is transmitted at the request of a third party and stored in the user terminal.
20. The system of claim 13,
Wherein at least one of text and images included in a story displayed on the story keyboard is enlarged or reduced by a user's touch operation on the display screen.
KR1020160014801A 2016-02-05 2016-02-05 Implement the method and system of the story keyboard for a social network service KR20170093427A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020160014801A KR20170093427A (en) 2016-02-05 2016-02-05 Implement the method and system of the story keyboard for a social network service

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020160014801A KR20170093427A (en) 2016-02-05 2016-02-05 Implement the method and system of the story keyboard for a social network service

Publications (1)

Publication Number Publication Date
KR20170093427A true KR20170093427A (en) 2017-08-16

Family

ID=59752544

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020160014801A KR20170093427A (en) 2016-02-05 2016-02-05 Implement the method and system of the story keyboard for a social network service

Country Status (1)

Country Link
KR (1) KR20170093427A (en)

Legal Events

Date Code Title Description
N231 Notification of change of applicant