WO2016036218A1 - Message service method using character, user terminal for performing same, and message application including same - Google Patents
Message service method using character, user terminal for performing same, and message application including same
- Publication number
- WO2016036218A1 (PCT/KR2015/009402)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- character
- conversation
- user terminal
- conversation mode
- chat room
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q50/00—Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
- G06Q50/40—Business processes related to the transportation industry
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q10/00—Administration; Management
- G06Q10/10—Office automation; Time management
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L51/00—User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail
-
- G—PHYSICS
- G04—HOROLOGY
- G04G—ELECTRONIC TIME-PIECES
- G04G13/00—Producing acoustic time signals
- G04G13/02—Producing acoustic time signals at preselected times, e.g. alarm clocks
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/003—Navigation within 3D models or images
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L51/00—User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail
- H04L51/04—Real-time or near real-time messaging, e.g. instant messaging [IM]
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L67/00—Network arrangements or protocols for supporting network services or applications
- H04L67/01—Protocols
- H04L67/131—Protocols for games, networked simulations or virtual reality
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L67/00—Network arrangements or protocols for supporting network services or applications
- H04L67/2866—Architectures; Arrangements
- H04L67/30—Profiles
- H04L67/306—User profiles
Definitions
- the present invention relates to a message service method, and more particularly, to a method for more intuitively expressing a conversation content between a user terminal and a counterpart terminal using a 2D character or a 3D character.
- in the past, message services generally displayed conversation content simply through text, images, and video. However, with the emergence of various kinds of message services, there is a demand for functions that can make the service more interesting to users beyond the basic messaging functions. In addition, providers of message services need a way to generate separate revenue through messaging.
- the present invention provides a message service method that can induce the interest of a user by providing a conversation mode that expresses conversation content through a 3D character.
- the present invention also provides a message service method that enables a user to communicate through a unique character by allowing the user to change the emotions, actions, and accessories of the 3D character by selecting character stickers.
- a message service method performed by the user terminal comprises: identifying a conversation mode associated with how conversation content is expressed; and displaying the conversation content of the user terminal and the counterpart terminal in a chat room according to the conversation mode, wherein the conversation mode includes a first conversation mode that uses 3D characters corresponding to the user terminal and the counterpart terminal and a second conversation mode that does not use the 3D characters.
- the displaying may include providing, in the first conversation mode, an interface including a character sticker associated with a visual effect to be applied to the 3D character, or providing, in the second conversation mode, an interface including a character sticker associated with a visual effect applied to a 2D character associated with the 3D character.
- the displaying may include applying a visual effect corresponding to the selected character sticker to the 3D character in the chat room in the first conversation mode, and displaying the selected character sticker in the chat room in the second conversation mode.
- the character sticker may be added or updated based on identification information or image information of a character, and may be selected by the user from among the character stickers provided through the interface, or selected when the user inputs, in the chat window, a combination of at least one of a word, number, or symbol corresponding to the character sticker.
- the word may be composed of one letter or a plurality of letters, and may be expressed in various languages such as Korean, English, Japanese, and Chinese.
- for example, a character sticker associated with the emotion of joy may be stored matched with keywords such as "joy", "funny", " ⁇ ", and the like.
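The keyword-to-sticker matching described above can be modeled as a simple lookup from input tokens to sticker identifiers. The following TypeScript sketch is illustrative only; the names (`StickerCatalog`, `match`) and the sample keywords are assumptions, not part of the patent.

```typescript
// Minimal sketch of keyword-to-character-sticker matching (illustrative only).
interface CharacterSticker {
  id: string;
  emotion?: string;        // e.g. "joy"
  action?: string;         // e.g. "running"
  keywords: string[];      // words, numbers, or symbols that select this sticker
}

class StickerCatalog {
  private byKeyword = new Map<string, CharacterSticker>();

  add(sticker: CharacterSticker): void {
    for (const kw of sticker.keywords) {
      this.byKeyword.set(kw.toLowerCase(), sticker);
    }
  }

  // Returns the sticker matched by any keyword contained in the chat input, if any.
  match(chatInput: string): CharacterSticker | undefined {
    const text = chatInput.toLowerCase();
    for (const [kw, sticker] of this.byKeyword) {
      if (text.includes(kw)) return sticker;
    }
    return undefined;
  }
}

// Usage: a "joy" sticker stored against several keywords, as in the description.
const catalog = new StickerCatalog();
catalog.add({ id: "sticker-joy-01", emotion: "joy", keywords: ["joy", "funny"] });
console.log(catalog.match("that was so funny!")?.id); // -> "sticker-joy-01"
```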
- when the first conversation mode and the second conversation mode are switched, the identifying may identify the switched conversation mode.
- the contents of the conversation may be displayed in a chat room having a virtual screen or a real world screen set in advance or determined according to a user's selection.
- the message service method may further include: storing the conversation content of the chat room when the user terminal and the counterpart terminal leave the chat room; displaying, in response to a request to review the conversation content, a 3D character associated with the user terminal and an identifier for the review; and, when the identifier is selected, reproducing the conversation content in chronological order according to the first conversation mode.
- the message service method may further include: displaying a 3D character associated with the user terminal or the counterpart terminal according to a profile view request associated with the user terminal or the counterpart terminal; and controlling the 3D character to describe personal information of the user terminal or the counterpart terminal, or to express preset emotions or actions.
- a message application for the message service method is stored in a medium of the user terminal and executed by a processor of the user terminal, and performs: identifying a conversation mode associated with how conversation content is expressed; and displaying the conversation content of the user terminal and the counterpart terminal in a chat room according to the conversation mode, wherein the conversation mode includes a first conversation mode that uses 3D characters corresponding to the user terminal and the counterpart terminal and a second conversation mode that does not use the 3D characters.
- the displaying may include providing, in the first conversation mode, an interface including a character sticker associated with a visual effect to be applied to the 3D character, or providing, in the second conversation mode, an interface including a character sticker associated with a visual effect applied to a 2D character associated with the 3D character.
- the displaying may include applying a visual effect corresponding to the selected character sticker to the 3D character in the chat room in the first conversation mode, and displaying the selected character sticker in the chat room in the second conversation mode.
- the method may further include switching between a first conversation mode and a second conversation mode, wherein the identifying may identify the switched conversation mode.
- the contents of the conversation may be displayed in a chat room having a virtual screen or a real world screen set in advance or determined according to a user's selection.
- the message application may further perform: storing the conversation content of the chat room when the user terminal and the counterpart terminal leave the chat room; displaying, in response to a request to review the conversation content, a 3D character associated with the user terminal and an identifier for the review; and, when the identifier is selected, reproducing the conversation content in chronological order according to the first conversation mode.
- the message application may further perform: displaying a 3D character associated with the user terminal or the counterpart terminal according to a profile view request associated with the user terminal or the counterpart terminal; and controlling the 3D character to describe personal information of the user terminal or the counterpart terminal, or to express preset emotions or actions.
- a user terminal includes: a processor for identifying a conversation mode associated with how conversation content is expressed; and a display for displaying the conversation content of the user terminal and the counterpart terminal in a chat room according to the conversation mode, wherein the conversation mode includes a first conversation mode that uses 3D characters corresponding to the user terminal and the counterpart terminal and a second conversation mode that does not use the 3D characters.
- the first chat mode and the second chat mode may be switched between each other.
- a message service method performed by the user terminal comprises: displaying, in response to a request to enter a chat room, a 3D character in the chat room against the background of a real-world screen or a virtual screen; and displaying the conversation content of the user terminal and the counterpart terminal in correspondence with the 3D character.
- the displaying of the 3D character may include displaying the 3D character in the chat room when a predetermined object or movement of the object is detected on the real world screen or the virtual screen.
- the method may further include switching the conversation mode, and the displaying may display the conversation content of the user terminal and the counterpart terminal without using the 3D character, according to the switched conversation mode.
- the displaying may include providing an interface including a character sticker associated with a visual effect to be applied to the 3D character.
- the character sticker may be added or updated based on identification information or image information of a character, and may be selected by the user from among the character stickers provided through the interface, or selected when the user inputs, in the chat window, a combination of at least one of a word, number, or symbol corresponding to the character sticker.
- the message application for the message service method is stored in a medium of the user terminal and executed by a processor of the user terminal, and performs: displaying, in response to a request to enter a chat room, a 3D character in the chat room against the background of a real-world screen or a virtual screen; and displaying the conversation content of the user terminal and the counterpart terminal in correspondence with the 3D character.
- an interest of the user may be induced by providing a dialogue mode for expressing the dialogue contents through the 3D character.
- the emotion, motion, accessories, etc. of the 3D character can be replaced by selecting a character sticker, so that the user can communicate through a unique character.
- FIG. 1 is a view showing the overall configuration according to an embodiment.
- FIG. 2 is a diagram illustrating a process of selecting a conversation mode according to an exemplary embodiment.
- FIG. 3 is a diagram illustrating a conversation mode for representing a 3D character using a marker, according to an exemplary embodiment.
- FIG. 4 is a diagram illustrating a process of displaying conversation contents using a 3D character according to an exemplary embodiment.
- FIG. 5 is a diagram illustrating a process of applying a visual effect to a 3D character according to an embodiment.
- FIG. 6 is a diagram illustrating a process of switching from a first conversation mode to a second conversation mode, according to an exemplary embodiment.
- FIG. 7 is a diagram illustrating a process of switching from a second conversation mode to a first conversation mode according to an embodiment.
- FIG. 8 is a diagram illustrating a process of re-viewing a conversation content, according to an exemplary embodiment.
- FIG. 1 is a view showing the overall configuration according to an embodiment.
- the user terminal 101 and the counterpart terminal 103 may exchange contents of a conversation through a chat room generated by the message server 102.
- the user terminal 101 and the counterpart terminal 103 may include a medium capable of storing a processor, a display, and a message application.
- the user terminal 101 and the counterpart terminal 103 may include all types of electronic devices capable of transmitting and receiving conversation contents through a network.
- when there is only one counterpart terminal 103, it may mean a 1:1 conversation, and when there are a plurality of counterpart terminals 103, it may mean a 1:N conversation (group chat).
- the user terminal 101 and the counterpart terminal 103 may display a chat room through a background screen such as a real world screen or a virtual screen.
- the background screen may include an image captured by the user terminal 101 or the counterpart terminal 103 in real time, or an image edited, processed, or newly generated on a computer.
- the user terminal 101 or the counterpart terminal 103 may be operated through an interface displayed on a display or through an interface displayed on a virtual screen.
- the user terminal 101 or the counterpart terminal 103 may be a device that operates without being attached to a body part of the user, or may be a device that is attached and operated to a body part (arm, hand, head, body, etc.) of the user.
- the user terminal 101 or the counterpart terminal 103 may input a specific command according to the movement of the hand, eye, or body, or may input a specific command through a separate device such as a pen, a mouse, a keyboard, or the like.
- the user terminal 101 and the counterpart terminal 103 may be a calling terminal or a receiving terminal according to the development of conversation contents.
- a description will be given focusing on the user terminal 101, and the description of the user terminal 101 may be equally applied to the counterpart terminal 103.
- the user terminal 101 may identify a conversation mode related to a method of expressing a conversation content. For example, when a user of the user terminal 101 selects a conversation mode through an interface or starts a conversation for the first time in the user terminal 101, the conversation mode may be set in advance. Then, the user terminal 101 can identify the conversation mode selected by the user or the preset conversation mode.
- the conversation mode may mean an expression method indicating with which character the conversation content is expressed.
- the conversation mode may include a first conversation mode using a 3D character or a second conversation mode not using a 3D character.
- the 3D character is a basic character represented in a 3D form and corresponds to the user terminal 101 and the counterpart terminal 103, respectively.
- the user terminal 101 and the counterpart terminal 103 may designate a 3D character in advance before talking through a chat room.
- the second conversation mode refers to a conventional conversation method that does not use a 3D character.
- the second conversation mode may basically mean a conversation method in which content such as text, images, or video is attached and displayed without displaying the 3D character.
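As a rough illustration of the two conversation modes, the sketch below models mode identification and display dispatch in TypeScript. The type and function names (`ConversationMode`, `identifyMode`, `renderMessage`) are hypothetical and only mirror the behavior described above, not the patented implementation.

```typescript
// Illustrative sketch of the two conversation modes (assumed names and shapes).
type ConversationMode = "3D" | "2D"; // first mode uses 3D characters, second does not

interface Message {
  senderId: string;   // user terminal or counterpart terminal
  text: string;
}

interface ChatRoom {
  mode: ConversationMode;
  background: "real-world" | "virtual"; // only meaningful in the 3D mode
}

// Identify the mode selected by the user, falling back to a preset default.
function identifyMode(selected?: ConversationMode, preset: ConversationMode = "2D"): ConversationMode {
  return selected ?? preset;
}

// Display dispatch: the 3D mode shows the message via the sender's 3D character,
// the 2D mode shows plain text/image/video content without a 3D character.
function renderMessage(room: ChatRoom, msg: Message): string {
  return room.mode === "3D"
    ? `[3D character of ${msg.senderId} on ${room.background} background] says: ${msg.text}`
    : `${msg.senderId}: ${msg.text}`;
}

const room: ChatRoom = { mode: identifyMode("3D"), background: "real-world" };
console.log(renderMessage(room, { senderId: "user-101", text: "hello" }));
```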
- the background screen of the chat room in which the chat contents are displayed may be a real world screen or a virtual screen.
- the real-world screen may include a screen captured in real time by the user terminal 101 through its camera, or a screen on which the user terminal 101 reproduces a still image or a video of a place or a specific object that exists in reality.
- the real world screen may be a hologram image.
- regardless of the current time and the current location of the user terminal 101, the user terminal 101 may play a still image, a slideshow composed of still images, or a video that actually photographed a specific place or object. For example, even if the user terminal 101 plays, in Seoul on August 20, 2014, a video of the Statue of Liberty in New York shot on July 30, 2014, the video is still a real-world screen.
- the virtual screen may include a screen on which the user terminal 101 reproduces a still image or a video produced for a virtual world.
- the virtual screen may mean separately produced content, such as a game screen or an animation, rather than an actual object.
- the virtual screen may be a screen obtained by processing a real world screen.
- the virtual screen may be a screen generated by rendering the real world screen in three dimensions.
- the virtual screen may be a hologram image virtually generated by the hologram device.
- when the background screen of the chat room is a real-world screen and an object set in advance by the user appears on the screen, the 3D character may be displayed corresponding to the object.
- the object may be a marker for displaying a 3D character in augmented reality.
- the message service method may be performed regardless of the presence of a marker.
- when displaying a 3D character, if the real-world screen or virtual screen shows a flat surface such as a desk or a floor, the 3D character may be displayed as originally stored. However, if the real-world screen or virtual screen does not show a flat surface such as a desk or a floor, additional accessories such as "wings" may be displayed on the 3D character so that it appears to float in the air. That is, in the first conversation mode, the user terminal 101 may display the originally stored 3D character according to the background of the chat room, or display additional accessories such as "wings" together with the 3D character, thereby expressing the space in which the 3D character is located.
- the user terminal 101 may additionally display a flat object at the position of the 3D character.
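A minimal sketch of the background-dependent presentation just described (flat surface versus floating with a "wings" accessory). The surface-classification input is assumed to come from elsewhere, and all names here are hypothetical.

```typescript
// Illustrative: choose how to place the 3D character depending on the background.
interface CharacterPresentation {
  pose: "standing" | "floating";
  extraAccessories: string[];
}

function presentCharacter(backgroundHasFlatSurface: boolean): CharacterPresentation {
  // Flat surface (desk, floor): show the character as originally stored.
  if (backgroundHasFlatSurface) {
    return { pose: "standing", extraAccessories: [] };
  }
  // No flat surface: add "wings" so the character appears to float in the air.
  return { pose: "floating", extraAccessories: ["wings"] };
}

console.log(presentCharacter(false)); // { pose: "floating", extraAccessories: ["wings"] }
```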
- the user terminal 101 may display the conversation contents of the user terminal 101 and the counterpart terminal 103 in the chat room according to the conversation mode. For example, when the first conversation mode is selected, the 3D character corresponding to the user terminal and the 3D character corresponding to the counterpart terminal may express conversation contents with each other so that the characters may have a conversation. The user terminal 101 and the counterpart terminal 103 may change the 3D character upon request.
- the first conversation mode and the second conversation mode may be switched to each other according to the user's selection while the contents of the conversation are transmitted and received in the chat room. Then, the user terminal 101 may display the contents of the conversation in the chat room using the character determined according to the switched conversation mode.
- the 3D character described in FIG. 1 may be directly registered with the message server 102 or may be registered with the message server 102 through the character server 104.
- the user terminal 101 may be provided with a basic character representing a preset basic accessory, basic operation, or basic emotion.
- the basic character corresponds to the 3D character.
- the user terminal 101 may purchase a character sticker related to a visual effect to be applied to the 3D character in order to indicate an accessory, an action, or an emotion to the basic character.
- These visual effects relate to accessories, actions or emotions.
- Character stickers related to accessories refer to content that can change the basic character, such as its clothes, shoes, jewelry, or body parts (eyes, nose, mouth, chin, eyebrows, etc.). For example, when the user terminal 101 purchases and selects a character sticker related to an accessory called "clothes", the user terminal 101 may apply a visual effect related to "clothes" to the basic character.
- the character sticker related to an action refers to a content that allows a basic character to perform an action other than the basic action for a preset time. For example, when the user terminal 101 purchases and selects a character sticker such as "running", the user terminal 101 may apply a visual effect related to "running" to the basic character for a preset time.
- the character sticker associated with the emotion refers to content that allows the basic character to perform other emotions for a predetermined time in addition to the basic emotion. For example, when the user terminal 101 purchases and executes a character sticker of “joy”, the user terminal 101 may apply a visual effect related to “joy” to the basic character for a preset time.
- the character stickers related to the accessory, the action, or the emotion may exist independently of each other or may be present in combination with each other.
- the accessory and the emotion may be present in the form of a combination of the accessory and the emotion, such as a character sticker expressing the emotion when the accessory is worn on the basic character in the 3D form.
- a character that wears an accessory or expresses an action may exist separately from the basic character in the 3D form.
- the character sticker may be added or updated based on the identification information or the image information of the character provided online or offline.
- FIG. 2 is a diagram illustrating a process of selecting a conversation mode according to an exemplary embodiment.
- the conversation mode may be selected in various situations.
- the friend's profile 201 may be displayed as shown in FIG. 2.
- the user may select a 3D conversation meaning a first conversation mode or a 2D conversation meaning a second conversation mode in the profile 201. If the user selects the 3D conversation, the user terminal 101 may display the conversation content through the 3D character in the chat room 202 according to the first conversation mode.
- the user terminal 101 can display the chat content through text, images, video, etc. without using the 3D character in the chat room 203 according to the second chat mode.
- the user terminal 101 may provide an identifier displayed in the chat room to allow the user to switch the conversation mode.
- the identifier may refer to an interface such as an icon or a button for providing the user terminal 101 with a command to switch between the first conversation mode and the second conversation mode.
- the conversation mode may also be switched based on a user operation applied to the user terminal 101 itself, or a user operation applied to the display of the user terminal 101 on which the chat room is displayed; that is, the conversation mode may be switched when an operation associated with switching the conversation mode is performed.
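The switching trigger (tapping a displayed identifier, or a user operation recognized on the terminal or its display) could be wired up roughly as follows. The event names and the `toggleMode` helper are assumptions for illustration only.

```typescript
// Illustrative sketch of switching between the first (3D) and second (2D) conversation modes.
type ConversationMode = "3D" | "2D";

type SwitchTrigger =
  | { kind: "identifier-selected" }      // e.g. the "2D"/"3D" button shown in the chat room
  | { kind: "gesture"; name: string };   // e.g. an operation applied to the terminal or its display

function toggleMode(current: ConversationMode): ConversationMode {
  return current === "3D" ? "2D" : "3D";
}

function handleSwitch(current: ConversationMode, trigger: SwitchTrigger,
                      recognizedGestures: Set<string>): ConversationMode {
  if (trigger.kind === "identifier-selected") return toggleMode(current);
  // Only operations registered as "related to switching" cause a switch.
  if (trigger.kind === "gesture" && recognizedGestures.has(trigger.name)) {
    return toggleMode(current);
  }
  return current;
}

const gestures = new Set(["shake", "two-finger-swipe"]);
console.log(handleSwitch("3D", { kind: "gesture", name: "shake" }, gestures)); // "2D"
```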
- FIG. 3 is a diagram illustrating a conversation mode for representing a 3D character using a marker, according to an exemplary embodiment.
- the screen 301 of FIG. 3 shows a situation in which a message including the contents of the conversation of the counterpart terminal 103 is delivered to the user terminal 101.
- as shown in the screen 302 of FIG. 3, the user terminal 101 may switch to a conversation mode that displays a real-world screen or a virtual screen. In this case, the 3D character corresponding to the user terminal 101 or the counterpart terminal 103 may be displayed in response to a marker, so that the 3D character appears in the chat room as in augmented reality.
- when a message including the conversation content of the counterpart terminal 103 is selected on the screen 301 of FIG. 3, the user terminal 101 may provide an interface for receiving a selection of either the first conversation mode, which displays the 3D character, or the second conversation mode, which does not display the 3D character. For example, the user terminal 101 may display a "3D conversation button" for selecting the first conversation mode and a "2D conversation button" for selecting the second conversation mode, allowing the user to select a particular conversation mode.
- when a marker appears on the real-world screen or the virtual screen, the user terminal 101 may display the 3D character in the chat room in the manner of augmented reality.
- in addition to the real-world screen, a preset virtual screen or a virtual screen selected by the user may be determined as the background screen of the chat room.
- when the user requests to enter the chat room as shown in the screen 302 of FIG. 3, the 3D character may be displayed through augmented reality. To this end, the user terminal 101 may turn on the camera, and the 3D character can be expressed in augmented reality by displaying it in a chat room whose background is a real-world screen or a virtual screen.
- the content of the conversation between the user terminal 101 and the counterpart terminal 103 may be displayed through the 3D character of the user terminal 101 and the 3D character of the counterpart terminal 103 expressed through augmented reality.
- the user terminal 101 may display a 3D character in the chat room.
- the object may serve as a marker for displaying a 3D character through augmented reality.
- a 3D character may be displayed corresponding to the object.
- the 3D character may be displayed regardless of whether the object, or the movement of the object, is present.
- the user terminal 101 may display the 3D character in the chat room in the manner of augmented reality, whether or not a marker is present, and the conversation content of the user terminal 101 and the counterpart terminal 103 may be displayed through the 3D character.
- the user terminal 101 may be able to switch to the conversation mode not using the 3D character at the request of the user.
- the user terminal 101 may switch from the conversation mode not using the 3D character back to the conversation mode using the 3D character according to augmented reality.
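The marker-based (and marker-less) display behavior discussed above could look roughly like this sketch. The frame-analysis inputs are placeholders, since the patent does not specify a particular detection technique.

```typescript
// Illustrative sketch: decide when to show the 3D character on the camera/virtual background.
interface Frame {
  detectedObjects: string[];   // objects recognized in the real-world or virtual screen
}

interface ArConfig {
  markerObject?: string;       // a pre-designated object acting as a marker (optional)
  requireMarker: boolean;      // the method may also work without any marker
}

function shouldDisplayCharacter(frame: Frame, config: ArConfig): boolean {
  if (!config.requireMarker) return true;                     // marker-less operation
  return config.markerObject !== undefined
    && frame.detectedObjects.includes(config.markerObject);   // marker-based operation
}

const frame: Frame = { detectedObjects: ["desk", "business-card"] };
console.log(shouldDisplayCharacter(frame, { markerObject: "business-card", requireMarker: true })); // true
console.log(shouldDisplayCharacter(frame, { requireMarker: false }));                               // true
```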
- FIG. 4 is a diagram illustrating a process of displaying conversation contents using a 3D character according to an exemplary embodiment.
- the 3D character corresponding to the user terminal 101 and the 3D character corresponding to the counterpart terminal 103 may be displayed in the chat room according to the first conversation mode.
- the conversation contents input by the user terminal 101 and the counterpart terminal 103 may be displayed corresponding to each 3D character.
- the dialogue content may be displayed through a speech bubble connected to the 3D character.
- the speech bubble may disappear over time.
- alternatively, the speech bubble may remain displayed even after time elapses so that the user terminal 101 or the counterpart terminal 103 can check the conversation content.
- the 3D character may perform a predetermined operation such as shaking a hand, winking, or moving a head.
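One way to realize the speech-bubble behavior above (disappear after a while, or persist so the content can be checked later) is sketched below; the timeout value and helper names are illustrative assumptions.

```typescript
// Illustrative sketch of speech bubbles attached to a 3D character.
interface SpeechBubble {
  characterId: string;
  text: string;
  createdAt: number;        // epoch milliseconds
  ttlMs?: number;           // undefined => bubble persists so the content can be reviewed
}

function visibleBubbles(bubbles: SpeechBubble[], now: number): SpeechBubble[] {
  return bubbles.filter(b => b.ttlMs === undefined || now - b.createdAt < b.ttlMs);
}

const bubbles: SpeechBubble[] = [
  { characterId: "user-101", text: "hi!", createdAt: 0, ttlMs: 5_000 },   // fades out
  { characterId: "peer-103", text: "hello", createdAt: 0 },               // stays visible
];
console.log(visibleBubbles(bubbles, 10_000).map(b => b.text)); // ["hello"]
```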
- FIG. 5 is a diagram illustrating a process of applying a visual effect to a 3D character according to an embodiment.
- the 3D character displayed in the chat room in the first conversation mode may be a basic character set by the user.
- the basic character may mean a 3D character expressing a preset emotion or motion.
- an interface for selecting a character sticker related to a visual effect of an emotion, an action, or an accessory to be applied to the basic character may be provided in a part of the chat room.
- the interface may be displayed by overlapping a part of the chat room.
- the interface may be provided by selecting a button or icon displayed in the chat room.
- the character sticker may be associated with a visual effect to be applied to the 3D character which is the basic character.
- a visual effect corresponding to the character sticker may be applied to the basic character as illustrated in the screen 502 of FIG. 5.
- Character stickers may be transferable and giftable.
- the character sticker purchased by the user terminal 101 may be transferred to the counterpart terminal 103 or a third party's terminal in the form of a transfer or a gift.
- the character sticker may be continuously added or updated according to the purchase or activity of the user online or offline.
- the purchase or activity of the user offline will be described as an example. For example, suppose an event is run that provides a dinosaur character to a customer who enters a theme park associated with dinosaurs or purchases a hamburger at a store in the theme park.
- a user may receive a real dinosaur character as a gift.
- identification information (such as a serial number) or image information (such as an image related to the dinosaur character) may be displayed in various areas, such as the admission ticket purchased to enter the theme park, a pamphlet of the theme park received upon entry, structures installed at designated places in the theme park, the receipt for the hamburger purchase, a separate coupon provided with the hamburger, or the wrapping paper of the hamburger. Such identification information or image information may also be displayed on the dinosaur character itself.
- when the user registers the identification information or the image information on a page provided by the message server 102 or the character server 104, the dinosaur character, or a character sticker associated with the dinosaur character, may be displayed in the interface provided through the chat room of the user's message application.
- the user may access a page provided by the message server 102 or the character server 104 through the user terminal 101 or other terminal.
- the dinosaur character, or a character sticker associated with the dinosaur character, may then be automatically displayed in the interface provided through the chat room of the user's message application.
- when a user performs online activities such as joining a specific online site, purchasing a product, or participating in an event, the user may be provided with identification information or image information related to a character set in advance for the corresponding site. The user can then register the identification information or the image information with the message server 102 or the character server 104 to use, in the message application, the character sticker associated with the character corresponding to that identification information or image information.
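A rough sketch of registering offline or online identification information (such as a serial number on a receipt or coupon) to unlock a character sticker follows. The server interaction is reduced to an in-memory lookup, and every name here is hypothetical.

```typescript
// Illustrative sketch: redeem identification info (e.g. a serial number) for a character sticker.
interface UnlockableSticker {
  stickerId: string;
  characterName: string;    // e.g. a promotional "dinosaur" character
}

class StickerRegistry {
  // Stands in for the message/character server's mapping of codes to stickers.
  private codes = new Map<string, UnlockableSticker>();
  private unlockedByUser = new Map<string, Set<string>>();

  registerCode(code: string, sticker: UnlockableSticker): void {
    this.codes.set(code, sticker);
  }

  // Called when the user submits identification or image-derived information.
  redeem(userId: string, code: string): UnlockableSticker | undefined {
    const sticker = this.codes.get(code);
    if (!sticker) return undefined;
    const owned = this.unlockedByUser.get(userId) ?? new Set<string>();
    owned.add(sticker.stickerId);
    this.unlockedByUser.set(userId, owned);
    return sticker;  // now shown in the sticker interface of the user's message application
  }

  stickersOf(userId: string): string[] {
    return [...(this.unlockedByUser.get(userId) ?? [])];
  }
}

const registry = new StickerRegistry();
registry.registerCode("DINO-2014-001", { stickerId: "sticker-dino", characterName: "dinosaur" });
registry.redeem("user-101", "DINO-2014-001");
console.log(registry.stickersOf("user-101")); // ["sticker-dino"]
```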
- Such a character sticker may be expressed differently according to the current conversation mode.
- the user terminal 101 may apply a visual effect related to the character sticker selected by the user to the basic character.
- the user terminal 101 may display the character sticker selected by the user in the chat room as it is.
- the character sticker selected by the user represents a result in which the visual effect is already reflected on the 2D character associated with the 3D character.
- the character sticker may be generated in the character server 104 and registered in the message server 102.
- the user may purchase a character sticker related to visual effects such as accessories, emotions, and actions through a purchase page provided to the message server 102.
- the purchased character stickers may be determined differently for each user.
- if the user terminal 101 purchases character sticker A and applies it to its basic character, the counterpart terminal 103 can display, through the chat room, the visual effect related to character sticker A applied to the 3D character of the user terminal 101, even if the counterpart terminal 103 has not purchased character sticker A.
- Character stickers can be used to represent visual effects applied to 3D characters and visual effects applied to 2D characters associated with the 3D character.
- the character sticker purchased by the user may be shared in the first conversation mode and the second conversation mode.
- the character sticker representing the visual effect related to the accessory may be semi-permanently applied to the basic character.
- a character sticker representing a visual effect related to an emotion or motion may be applied to the basic character in a temporary form.
- the character sticker applied in the first conversation mode may be stored as a history in chronological order. Then, when switching from the first conversation mode to the second conversation mode, the character stickers used in the first conversation mode may be displayed, in chronological order, in the chat room after it is switched to the second conversation mode. This will be described in detail with reference to FIG. 6.
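The chronological sticker history shared between the two modes might be kept as a simple time-ordered log, as in this sketch. The rendering distinction (apply the effect to the 3D character versus show the 2D sticker) follows the description; the data shapes and names are assumptions.

```typescript
// Illustrative: time-ordered history of applied character stickers, reused across modes.
interface StickerEvent {
  stickerId: string;
  appliedAt: number;   // epoch milliseconds
}

class StickerHistory {
  private events: StickerEvent[] = [];

  record(stickerId: string, appliedAt: number): void {
    this.events.push({ stickerId, appliedAt });
    this.events.sort((a, b) => a.appliedAt - b.appliedAt);  // keep chronological order
  }

  // In the first (3D) mode the effect is applied to the 3D character;
  // in the second (2D) mode the same events are shown as 2D sticker images, in order.
  render(mode: "3D" | "2D"): string[] {
    return this.events.map(e =>
      mode === "3D" ? `apply effect ${e.stickerId} to 3D character`
                    : `show 2D sticker image ${e.stickerId}`);
  }
}

const history = new StickerHistory();
history.record("sticker-joy-01", 2_000);
history.record("sticker-run-02", 1_000);
console.log(history.render("2D")); // shown in chronological order after switching modes
```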
- referring to FIG. 5, a process of selecting a character sticker directly from an interface including character stickers, in order to apply a visual effect related to a desired emotion or action to the 3D character, has been described.
- however, the present invention is not limited thereto; when the user inputs, in the chat window, a preset word, symbol, or number corresponding to a character sticker, the user terminal 101 may determine the character sticker corresponding to the input word, symbol, or number, and apply the visual effect related to the emotion or action of that character sticker to the 3D character.
- for example, when the user inputs " ⁇ " in the chat window, the user terminal 101 may make the 3D character in the chat room express a feeling of joy, or an action related to joy, through the character sticker corresponding to " ⁇ ".
- the character sticker may be selected by the user from among the character stickers provided through the interface, or selected when the user inputs, in the chat window, a combination of at least one of a word, number, or symbol corresponding to the character sticker. Then, the user terminal 101 may apply a visual effect such as an emotion or action corresponding to the selected character sticker to the 3D character and display it in the chat room.
- FIG. 6 is a diagram for describing a process of switching from a first conversation mode to a second conversation mode, according to an exemplary embodiment.
- a process of displaying a conversation content in a chat room through a 3D character according to the first conversation mode is illustrated.
- an identifier ("2D button") associated with the transition from the first conversation mode to the second conversation mode may be displayed in the chat room.
- the chat room may be switched from the first conversation mode to the second conversation mode as shown in the screen 601 of FIG. 6.
- the chat content is displayed through the 3D character in the chat room.
- the counterpart terminal 103 may switch the conversation mode or may not switch the conversation mode.
- when the user terminal 101 requests switching to the second conversation mode while the user terminal 101 and the counterpart terminal 103 are talking in the first conversation mode, the counterpart terminal 103 may either maintain the first conversation mode or switch to the second conversation mode in response to the switching request from the user terminal 101.
- the visual effect applied through the character sticker to the 3D-type basic character in the first conversation mode may be similarly applied to the second conversation mode.
- the character sticker selected in the first conversation mode may be displayed in the order of time as it is in the second conversation mode.
- the character sticker means a result that the visual effect is already applied to the 2D character associated with the 3D character.
- FIG. 7 is a diagram illustrating a process of switching from a second conversation mode to a first conversation mode according to an embodiment.
- an identifier (“3D button”) related to the transition from the second conversation mode to the first conversation mode may be displayed in the chat room.
- the user terminal 101 may display the conversation content through the 3D character according to the first conversation mode.
- the visual effect of the character sticker selected in the second conversation mode may not be applied to the 3D character displayed in the first conversation mode.
- the dialogue contents displayed in the second dialogue mode may be displayed through a speech bubble of the 3D character in history form in chronological order.
- a process is illustrated in which a conversation displayed in the 2D-form second conversation mode transitions to a conversation between 3D characters according to the 3D-form first conversation mode. Then, as shown in the screen 702 of FIG. 7, a character sticker applied to the 3D character may be displayed differently from the character sticker applied to the 2D character in the second conversation mode.
- FIG. 8 is a diagram illustrating a process of re-viewing a conversation content, according to an exemplary embodiment.
- the user terminal 101 may store the conversation contents in the chat room.
- the exit from the chat room may mean a case where the message application is terminated, another application is executed, or the chat room is terminated.
- the user may request to review the contents of the conversation exchanged with the counterpart terminal 103.
- an identifier for viewing again (“story view”) may be displayed in association with the 3D character.
- the conversation contents exchanged with the counterpart terminal 103 may be reproduced in chronological order through the 3D character according to the first conversation mode.
- the 3D character means a 3D character corresponding to each of the user terminal 101 and the counterpart terminal 103 related to the conversation.
- during the reproduction, a visual effect corresponding to the character sticker selected at a specific time may also be applied to the 3D character and displayed.
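The "story view" replay (reproducing the stored conversation in chronological order through the 3D characters, re-applying stickers at the moments they were selected) could be organized as below. The timing model and names are illustrative only.

```typescript
// Illustrative sketch of replaying a stored conversation in chronological order.
interface ReplayEntry {
  at: number;                       // milliseconds from the start of the conversation
  characterId: string;              // 3D character of the user terminal or counterpart terminal
  text?: string;                    // conversation content shown in a speech bubble
  stickerId?: string;               // visual effect re-applied at this moment
}

function replay(entries: ReplayEntry[], show: (line: string) => void): void {
  const ordered = [...entries].sort((a, b) => a.at - b.at);
  for (const e of ordered) {
    if (e.stickerId) show(`${e.characterId}: apply sticker ${e.stickerId}`);
    if (e.text) show(`${e.characterId} says: ${e.text}`);
  }
}

replay(
  [
    { at: 3_000, characterId: "peer-103", text: "see you later" },
    { at: 1_000, characterId: "user-101", text: "hi!", stickerId: "sticker-joy-01" },
  ],
  console.log,
);
```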
- the user may transmit a profile view request associated with the user terminal 101 or the counterpart terminal 103. Then, the user terminal 101 may display the 3D character associated with the user terminal 101 or the counterpart terminal 103 whose profile was requested. Thereafter, the 3D character may be controlled to describe personal information of the user terminal 101 or the counterpart terminal 103 or to express a preset emotion or action.
- Methods according to an embodiment of the present invention can be implemented in the form of program instructions that can be executed by various computer means and recorded in a computer readable medium.
- the computer readable medium may include program instructions, data files, data structures, etc. alone or in combination.
- Program instructions recorded on the media may be those specially designed and constructed for the purposes of the present invention, or they may be of the kind well-known and available to those having skill in the computer software arts.
Landscapes
- Engineering & Computer Science (AREA)
- Business, Economics & Management (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Human Resources & Organizations (AREA)
- Strategic Management (AREA)
- Theoretical Computer Science (AREA)
- Marketing (AREA)
- General Business, Economics & Management (AREA)
- Tourism & Hospitality (AREA)
- Economics (AREA)
- Entrepreneurship & Innovation (AREA)
- Signal Processing (AREA)
- Computer Networks & Wireless Communication (AREA)
- General Health & Medical Sciences (AREA)
- Health & Medical Sciences (AREA)
- Primary Health Care (AREA)
- Software Systems (AREA)
- Quality & Reliability (AREA)
- Data Mining & Analysis (AREA)
- Operations Research (AREA)
- Remote Sensing (AREA)
- General Engineering & Computer Science (AREA)
- Computer Hardware Design (AREA)
- Computer Graphics (AREA)
- Radar, Positioning & Navigation (AREA)
- Information Transfer Between Computers (AREA)
- User Interface Of Digital Computer (AREA)
Claims (23)
- A message service method performed by a user terminal, the method comprising: identifying a conversation mode associated with how conversation content is expressed; and displaying the conversation content of the user terminal and a counterpart terminal in a chat room according to the conversation mode, wherein the conversation mode includes a first conversation mode that uses 3D characters corresponding to the user terminal and the counterpart terminal, and a second conversation mode that does not use the 3D characters.
- The method of claim 1, wherein the displaying comprises: in the first conversation mode, providing an interface including a character sticker associated with a visual effect to be applied to the 3D character; or, in the second conversation mode, providing an interface including a character sticker associated with a visual effect applied to a 2D character associated with the 3D character.
- The method of claim 2, wherein the displaying comprises: in the first conversation mode, applying a visual effect corresponding to the selected character sticker to the 3D character in the chat room; and, in the second conversation mode, displaying the selected character sticker in the chat room.
- The method of claim 2, wherein the character sticker is added or updated based on identification information or image information of a character, and is selected by the user from among character stickers provided through the interface, or selected when the user inputs, in a chat window, a combination of at least one of a word, a number, or a symbol corresponding to the character sticker.
- The method of claim 1, further comprising switching between the first conversation mode and the second conversation mode when an identifier displayed in the chat room and associated with switching of the conversation mode is selected, or when a user operation associated with switching of the conversation mode is recognized, wherein the identifying identifies the switched conversation mode.
- The method of claim 1, wherein the displaying comprises, when the first conversation mode is selected, displaying the conversation content in a chat room whose background is a virtual screen or a real-world screen that is set in advance or determined according to the user's selection.
- The method of claim 1, further comprising: storing the conversation content of the chat room when the user terminal and the counterpart terminal leave the chat room; displaying, in response to a request to review the conversation content, a 3D character associated with the user terminal and an identifier for the review; and, when the identifier is selected, reproducing the conversation content in chronological order according to the first conversation mode.
- The method of claim 1, further comprising: displaying a 3D character associated with the user terminal or the counterpart terminal according to a profile view request associated with the user terminal or the counterpart terminal; and controlling the 3D character to describe personal information of the user terminal or the counterpart terminal, or to express a preset emotion or action.
- A message application for a message service method, the message application being stored in a medium of a user terminal and executed by a processor of the user terminal to perform: identifying a conversation mode associated with how conversation content is expressed; and displaying the conversation content of the user terminal and a counterpart terminal in a chat room according to the conversation mode, wherein the conversation mode includes a first conversation mode that uses 3D characters corresponding to the user terminal and the counterpart terminal, and a second conversation mode that does not use the 3D characters.
- The message application of claim 9, wherein the displaying comprises: in the first conversation mode, providing an interface including a character sticker associated with a visual effect to be applied to the 3D character; or, in the second conversation mode, providing an interface including a character sticker associated with a visual effect applied to a 2D character associated with the 3D character.
- The message application of claim 10, wherein the displaying comprises: in the first conversation mode, applying a visual effect corresponding to the selected character sticker to the 3D character in the chat room; and, in the second conversation mode, displaying the selected character sticker in the chat room.
- The message application of claim 9, further comprising switching between the first conversation mode and the second conversation mode when an identifier displayed in the chat room is selected, wherein the identifying identifies the switched conversation mode.
- The message application of claim 9, wherein the displaying comprises, when the first conversation mode is selected, displaying the conversation content in a chat room whose background is a virtual screen or a real-world screen that is set in advance or determined according to the user's selection.
- The message application of claim 9, further comprising: storing the conversation content of the chat room when the user terminal and the counterpart terminal leave the chat room; displaying, in response to a request to review the conversation content, a 3D character associated with the user terminal and an identifier for the review; and, when the identifier is selected, reproducing the conversation content in chronological order according to the first conversation mode.
- The message application of claim 9, further comprising: displaying a 3D character associated with the user terminal or the counterpart terminal according to a profile view request associated with the user terminal or the counterpart terminal; and controlling the 3D character to describe personal information of the user terminal or the counterpart terminal, or to express a preset emotion or action.
- A user terminal comprising: a processor configured to identify a conversation mode associated with how conversation content is expressed; and a display configured to display the conversation content of the user terminal and a counterpart terminal in a chat room according to the conversation mode, wherein the conversation mode includes a first conversation mode that uses 3D characters corresponding to the user terminal and the counterpart terminal, and a second conversation mode that does not use the 3D characters.
- The user terminal of claim 16, wherein the first conversation mode and the second conversation mode are switched when an identifier displayed in the chat room and associated with switching of the conversation mode is selected, or when a user operation associated with switching of the conversation mode is recognized.
- A message service method performed by a user terminal, the method comprising: displaying, in response to a request to enter a chat room, a 3D character in the chat room against the background of a real-world screen or a virtual screen; and displaying the conversation content of the user terminal and a counterpart terminal in correspondence with the 3D character.
- The method of claim 18, wherein the displaying of the 3D character comprises displaying the 3D character in the chat room when a predesignated object, or movement of the object, is detected on the real-world screen or the virtual screen.
- The method of claim 18, further comprising switching the conversation mode when an identifier displayed in the chat room and associated with switching of the conversation mode is selected, or when a user operation associated with switching of the conversation mode is recognized, wherein the displaying displays the conversation content of the user terminal and the counterpart terminal without using the 3D character, according to the switched conversation mode.
- The method of claim 18, wherein the displaying comprises providing an interface including a character sticker associated with a visual effect to be applied to the 3D character.
- The method of claim 21, wherein the character sticker is added or updated based on identification information or image information of a character, and is selected by the user from among character stickers provided through the interface, or selected when the user inputs, in a chat window, a combination of at least one of a word, a number, or a symbol corresponding to the character sticker.
- A message application for a message service method, the message application being stored in a medium of a user terminal and executed by a processor of the user terminal to perform: displaying, in response to a request to enter a chat room, a 3D character in a chat room against the background of a real-world screen or a virtual screen; and displaying the conversation content of the user terminal and a counterpart terminal in correspondence with the 3D character.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201580058773.8A CN107210949A (zh) | 2014-09-05 | 2015-09-07 | 利用角色的消息服务方法、执行所述方法的用户终端、包括所述方法的消息应用程序 |
EP15838514.6A EP3190563A4 (en) | 2014-09-05 | 2015-09-07 | Message service method using character, user terminal for performing same, and message application including same |
JP2017512918A JP2017527917A (ja) | 2014-09-05 | 2015-09-07 | キャラクターを利用するメッセージサービス方法、前記方法を行うユーザ端末、前記方法を含むメッセージアプリケーション |
US15/508,425 US20170323266A1 (en) | 2014-09-05 | 2015-09-07 | Message service method using character, user terminal for performing same, and message application including same |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020140119018A KR101540544B1 (ko) | 2014-09-05 | 2014-09-05 | 캐릭터를 이용한 메시지 서비스 방법, 상기 방법을 수행하는 사용자 단말, 상기 방법을 포함하는 메시지 애플리케이션 |
KR10-2014-0119018 | 2014-09-05 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2016036218A1 true WO2016036218A1 (ko) | 2016-03-10 |
Family
ID=53877003
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/KR2015/009402 WO2016036218A1 (ko) | 2014-09-05 | 2015-09-07 | 캐릭터를 이용한 메시지 서비스 방법, 상기 방법을 수행하는 사용자 단말, 상기 방법을 포함하는 메시지 애플리케이션 |
Country Status (6)
Country | Link |
---|---|
US (1) | US20170323266A1 (ko) |
EP (1) | EP3190563A4 (ko) |
JP (1) | JP2017527917A (ko) |
KR (1) | KR101540544B1 (ko) |
CN (1) | CN107210949A (ko) |
WO (1) | WO2016036218A1 (ko) |
Cited By (28)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110895439A (zh) * | 2016-09-23 | 2020-03-20 | 苹果公司 | 头像创建和编辑 |
US10861248B2 (en) | 2018-05-07 | 2020-12-08 | Apple Inc. | Avatar creation user interface |
US10891013B2 (en) | 2016-06-12 | 2021-01-12 | Apple Inc. | User interfaces for retrieving contextually relevant media content |
US11061372B1 (en) | 2020-05-11 | 2021-07-13 | Apple Inc. | User interfaces related to time |
US11107261B2 (en) | 2019-01-18 | 2021-08-31 | Apple Inc. | Virtual avatar animation based on facial feature movement |
US11178335B2 (en) | 2018-05-07 | 2021-11-16 | Apple Inc. | Creative camera |
EP3920052A1 (en) * | 2016-09-23 | 2021-12-08 | Apple Inc. | Image data for enhanced user interactions |
US11287942B2 (en) | 2013-09-09 | 2022-03-29 | Apple Inc. | Device, method, and graphical user interface for manipulating user interfaces |
US11334209B2 (en) | 2016-06-12 | 2022-05-17 | Apple Inc. | User interfaces for retrieving contextually relevant media content |
US11386189B2 (en) | 2017-09-09 | 2022-07-12 | Apple Inc. | Implementation of biometric authentication |
US11393258B2 (en) | 2017-09-09 | 2022-07-19 | Apple Inc. | Implementation of biometric authentication |
US11468155B2 (en) | 2007-09-24 | 2022-10-11 | Apple Inc. | Embedded authentication systems in an electronic device |
US11481988B2 (en) | 2010-04-07 | 2022-10-25 | Apple Inc. | Avatar editing environment |
US11532112B2 (en) | 2017-05-16 | 2022-12-20 | Apple Inc. | Emoji recording and sending |
US11619991B2 (en) | 2018-09-28 | 2023-04-04 | Apple Inc. | Device control using gaze information |
US11676373B2 (en) | 2008-01-03 | 2023-06-13 | Apple Inc. | Personal computing device control using face detection and recognition |
US11714536B2 (en) | 2021-05-21 | 2023-08-01 | Apple Inc. | Avatar sticker editor user interfaces |
US11722764B2 (en) | 2018-05-07 | 2023-08-08 | Apple Inc. | Creative camera |
US11755712B2 (en) | 2011-09-29 | 2023-09-12 | Apple Inc. | Authentication with secondary approver |
US11776190B2 (en) | 2021-06-04 | 2023-10-03 | Apple Inc. | Techniques for managing an avatar on a lock screen |
US11809784B2 (en) | 2018-09-28 | 2023-11-07 | Apple Inc. | Audio assisted enrollment |
US11836725B2 (en) | 2014-05-29 | 2023-12-05 | Apple Inc. | User interface for payments |
US11921998B2 (en) | 2020-05-11 | 2024-03-05 | Apple Inc. | Editing features of an avatar |
US11928200B2 (en) | 2018-06-03 | 2024-03-12 | Apple Inc. | Implementation of biometric authentication |
US12033296B2 (en) | 2018-05-07 | 2024-07-09 | Apple Inc. | Avatar creation user interface |
US12079458B2 (en) | 2016-09-23 | 2024-09-03 | Apple Inc. | Image data for enhanced user interactions |
US12099586B2 (en) | 2021-01-25 | 2024-09-24 | Apple Inc. | Implementation of biometric authentication |
US12124770B2 (en) | 2023-08-24 | 2024-10-22 | Apple Inc. | Audio assisted enrollment |
Families Citing this family (19)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8584031B2 (en) | 2008-11-19 | 2013-11-12 | Apple Inc. | Portable touch screen device, method, and graphical user interface for using emoji characters |
US9940637B2 (en) | 2015-06-05 | 2018-04-10 | Apple Inc. | User interface for loyalty accounts and private label accounts |
US10445425B2 (en) | 2015-09-15 | 2019-10-15 | Apple Inc. | Emoji and canned responses |
WO2017205647A1 (en) * | 2016-05-27 | 2017-11-30 | Barbuto Joseph | System and method for assessing cognitive and mood states of a real world user as a function of virtual world activity |
US11580608B2 (en) | 2016-06-12 | 2023-02-14 | Apple Inc. | Managing contact information for communication applications |
DK179867B1 (en) | 2017-05-16 | 2019-08-06 | Apple Inc. | RECORDING AND SENDING EMOJI |
WO2019079826A1 (en) | 2017-10-22 | 2019-04-25 | Magical Technologies, Llc | DIGITAL ASSISTANT SYSTEMS, METHODS AND APPARATUSES IN AN INCREASED REALITY ENVIRONMENT AND LOCAL DETERMINATION OF VIRTUAL OBJECT PLACEMENT AND SINGLE OR MULTIDIRECTIONAL OBJECTIVES AS GATEWAYS BETWEEN A PHYSICAL WORLD AND A DIGITAL WORLD COMPONENT OF THE SAME ENVIRONMENT OF INCREASED REALITY |
JP6563466B2 (ja) * | 2017-11-30 | 2019-08-21 | 株式会社サイバーエージェント | チャットシステム、サーバ、チャット方法、端末装置及びコンピュータプログラム |
US11398088B2 (en) | 2018-01-30 | 2022-07-26 | Magical Technologies, Llc | Systems, methods and apparatuses to generate a fingerprint of a physical location for placement of virtual objects |
KR102582745B1 (ko) * | 2018-02-08 | 2023-09-25 | 라인플러스 주식회사 | 대화방을 3차원 형태로 제공하는 방법과 시스템 및 비-일시적인 컴퓨터 판독 가능한 기록 매체 |
US10902659B2 (en) * | 2018-09-19 | 2021-01-26 | International Business Machines Corporation | Intelligent photograph overlay in an internet of things (IoT) computing environment |
CN109885367B (zh) * | 2019-01-31 | 2020-08-04 | 腾讯科技(深圳)有限公司 | 互动聊天实现方法、装置、终端和存储介质 |
US11467656B2 (en) | 2019-03-04 | 2022-10-11 | Magical Technologies, Llc | Virtual object control of a physical device and/or physical device control of a virtual object |
DK201970530A1 (en) | 2019-05-06 | 2021-01-28 | Apple Inc | Avatar integration with multiple applications |
KR20210012562A (ko) * | 2019-07-25 | 2021-02-03 | 삼성전자주식회사 | 아바타를 제공하는 전자 장치 및 그의 동작 방법 |
JP7442091B2 (ja) | 2020-04-30 | 2024-03-04 | グリー株式会社 | 動画配信装置、動画配信方法及び動画配信プログラム |
KR102575771B1 (ko) * | 2021-07-21 | 2023-09-06 | 네이버 주식회사 | 3d 인터랙티브 동영상 생성 방법 및 시스템 |
KR102577132B1 (ko) * | 2021-09-24 | 2023-09-12 | (주)이머시브캐스트 | 3차원 텍스트 메시징 서비스 장치 및 방법 |
WO2023140720A1 (ko) * | 2022-01-24 | 2023-07-27 | 마인드로직 주식회사 | 인공 지능 대화 서비스 시스템 |
Family Cites Families (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20010019337A1 (en) * | 2000-03-03 | 2001-09-06 | Jong Min Kim | System for providing clients with a three dimensional virtual reality |
JP2002288101A (ja) * | 2001-03-23 | 2002-10-04 | Sony Corp | チャット文字表示方法、チャット文字表示プログラム、チャット文字表示プログラム格納媒体、および共有仮想空間表示装置 |
AU2002950502A0 (en) * | 2002-07-31 | 2002-09-12 | E-Clips Intelligent Agent Technologies Pty Ltd | Animated messaging |
JP3930849B2 (ja) * | 2003-11-21 | 2007-06-13 | 株式会社コナミデジタルエンタテインメント | 通信システム、ゲートウェイ装置、データ中継方法、および、プログラム |
KR100731786B1 (ko) * | 2004-11-08 | 2007-06-25 | 백부현 | 양방향 커뮤니케이션을 위한 캐릭터 서비스 방법 |
CN101242373A (zh) * | 2007-11-14 | 2008-08-13 | 李强 | 三维动态网络聊天 |
US20090259937A1 (en) * | 2008-04-11 | 2009-10-15 | Rohall Steven L | Brainstorming Tool in a 3D Virtual Environment |
KR101521332B1 (ko) * | 2011-11-08 | 2015-05-20 | 주식회사 다음카카오 | 인스턴트 메시징 서비스 및 인스턴트 메시징 서비스로부터 확장된 복수의 서비스들을 제공하는 방법 |
CN202587207U (zh) * | 2011-12-09 | 2012-12-05 | 深圳市顶星数码网络技术有限公司 | 3d聊天主体和客体装置、笔记本电脑及3d聊天系统 |
KR101907136B1 (ko) * | 2012-01-27 | 2018-10-11 | 라인 가부시키가이샤 | 유무선 웹을 통한 아바타 서비스 시스템 및 방법 |
CN102708151A (zh) * | 2012-04-16 | 2012-10-03 | 广州市幻像信息科技有限公司 | 一种实现互联网情景论坛方法和装置 |
JP6206676B2 (ja) * | 2012-05-18 | 2017-10-04 | 株式会社コナミデジタルエンタテインメント | メッセージ管理装置、メッセージ管理方法およびプログラム |
CN102685461B (zh) * | 2012-05-22 | 2014-11-05 | 深圳市环球数码创意科技有限公司 | 一种与观众实时交互的实现方法及其系统 |
US9443271B2 (en) * | 2012-08-15 | 2016-09-13 | Imvu, Inc. | System and method for increasing clarity and expressiveness in network communications |
CN103617029A (zh) * | 2013-11-20 | 2014-03-05 | 中网一号电子商务有限公司 | 一种3d即时通讯系统 |
-
2014
- 2014-09-05 KR KR1020140119018A patent/KR101540544B1/ko active IP Right Grant
-
2015
- 2015-09-07 US US15/508,425 patent/US20170323266A1/en not_active Abandoned
- 2015-09-07 EP EP15838514.6A patent/EP3190563A4/en not_active Withdrawn
- 2015-09-07 CN CN201580058773.8A patent/CN107210949A/zh active Pending
- 2015-09-07 WO PCT/KR2015/009402 patent/WO2016036218A1/ko active Application Filing
- 2015-09-07 JP JP2017512918A patent/JP2017527917A/ja active Pending
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR19980072149A (ko) * | 1998-07-27 | 1998-10-26 | 이재각 | 채팅 시스템 |
KR20000036463A (ko) * | 2000-03-15 | 2000-07-05 | 한남용 | 인터넷을 이용한 가상현실 대화 시스템 및 방법 |
KR20060012818A (ko) * | 2004-08-04 | 2006-02-09 | 박홍진 | 입체아바타를 이용한 메신저서비스 시스템 및 서비스방법 |
KR20060104980A (ko) * | 2006-09-26 | 2006-10-09 | 주식회사 비즈모델라인 | 이모티콘과 아바타 연동처리 방법 및 시스템 |
KR20110023962A (ko) * | 2009-09-01 | 2011-03-09 | 엘지전자 주식회사 | 이동 단말기 및 이동 단말기의 메시지 작성 방법 |
Non-Patent Citations (1)
Title |
---|
See also references of EP3190563A4 * |
Cited By (46)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11468155B2 (en) | 2007-09-24 | 2022-10-11 | Apple Inc. | Embedded authentication systems in an electronic device |
US11676373B2 (en) | 2008-01-03 | 2023-06-13 | Apple Inc. | Personal computing device control using face detection and recognition |
US11869165B2 (en) | 2010-04-07 | 2024-01-09 | Apple Inc. | Avatar editing environment |
US11481988B2 (en) | 2010-04-07 | 2022-10-25 | Apple Inc. | Avatar editing environment |
US11755712B2 (en) | 2011-09-29 | 2023-09-12 | Apple Inc. | Authentication with secondary approver |
US11287942B2 (en) | 2013-09-09 | 2022-03-29 | Apple Inc. | Device, method, and graphical user interface for manipulating user interfaces |
US11768575B2 (en) | 2013-09-09 | 2023-09-26 | Apple Inc. | Device, method, and graphical user interface for manipulating user interfaces based on unlock inputs |
US11494046B2 (en) | 2013-09-09 | 2022-11-08 | Apple Inc. | Device, method, and graphical user interface for manipulating user interfaces based on unlock inputs |
US11836725B2 (en) | 2014-05-29 | 2023-12-05 | Apple Inc. | User interface for payments |
US11334209B2 (en) | 2016-06-12 | 2022-05-17 | Apple Inc. | User interfaces for retrieving contextually relevant media content |
US11941223B2 (en) | 2016-06-12 | 2024-03-26 | Apple Inc. | User interfaces for retrieving contextually relevant media content |
US10891013B2 (en) | 2016-06-12 | 2021-01-12 | Apple Inc. | User interfaces for retrieving contextually relevant media content |
US11681408B2 (en) | 2016-06-12 | 2023-06-20 | Apple Inc. | User interfaces for retrieving contextually relevant media content |
AU2021250944B2 (en) * | 2016-09-23 | 2022-11-24 | Apple Inc. | Image data for enhanced user interactions |
EP3920052A1 (en) * | 2016-09-23 | 2021-12-08 | Apple Inc. | Image data for enhanced user interactions |
JP2020091866A (ja) * | 2016-09-23 | 2020-06-11 | Apple Inc. | Avatar creation and editing
CN110895439A (zh) * | 2016-09-23 | 2020-03-20 | Apple Inc. | Avatar creation and editing
JP7138614B2 (ja) | 2016-09-23 | 2022-09-16 | Apple Inc. | Avatar creation and editing
JP2023011639A (ja) * | 2016-09-23 | 2023-01-24 | Apple Inc. | Image data for enhanced user interactions
US12079458B2 (en) | 2016-09-23 | 2024-09-03 | Apple Inc. | Image data for enhanced user interactions |
US12045923B2 (en) | 2017-05-16 | 2024-07-23 | Apple Inc. | Emoji recording and sending |
US11532112B2 (en) | 2017-05-16 | 2022-12-20 | Apple Inc. | Emoji recording and sending |
US11765163B2 (en) | 2017-09-09 | 2023-09-19 | Apple Inc. | Implementation of biometric authentication |
US11393258B2 (en) | 2017-09-09 | 2022-07-19 | Apple Inc. | Implementation of biometric authentication |
US11386189B2 (en) | 2017-09-09 | 2022-07-12 | Apple Inc. | Implementation of biometric authentication |
US11682182B2 (en) | 2018-05-07 | 2023-06-20 | Apple Inc. | Avatar creation user interface |
US11722764B2 (en) | 2018-05-07 | 2023-08-08 | Apple Inc. | Creative camera |
US10861248B2 (en) | 2018-05-07 | 2020-12-08 | Apple Inc. | Avatar creation user interface |
US12033296B2 (en) | 2018-05-07 | 2024-07-09 | Apple Inc. | Avatar creation user interface |
US11178335B2 (en) | 2018-05-07 | 2021-11-16 | Apple Inc. | Creative camera |
US11380077B2 (en) | 2018-05-07 | 2022-07-05 | Apple Inc. | Avatar creation user interface |
US11928200B2 (en) | 2018-06-03 | 2024-03-12 | Apple Inc. | Implementation of biometric authentication |
US11619991B2 (en) | 2018-09-28 | 2023-04-04 | Apple Inc. | Device control using gaze information |
US11809784B2 (en) | 2018-09-28 | 2023-11-07 | Apple Inc. | Audio assisted enrollment |
US12105874B2 (en) | 2018-09-28 | 2024-10-01 | Apple Inc. | Device control using gaze information |
US11107261B2 (en) | 2019-01-18 | 2021-08-31 | Apple Inc. | Virtual avatar animation based on facial feature movement |
US12008230B2 (en) | 2020-05-11 | 2024-06-11 | Apple Inc. | User interfaces related to time with an editable background |
US11921998B2 (en) | 2020-05-11 | 2024-03-05 | Apple Inc. | Editing features of an avatar |
US11822778B2 (en) | 2020-05-11 | 2023-11-21 | Apple Inc. | User interfaces related to time |
US11061372B1 (en) | 2020-05-11 | 2021-07-13 | Apple Inc. | User interfaces related to time |
US12099713B2 (en) | 2020-05-11 | 2024-09-24 | Apple Inc. | User interfaces related to time |
US11442414B2 (en) | 2020-05-11 | 2022-09-13 | Apple Inc. | User interfaces related to time |
US12099586B2 (en) | 2021-01-25 | 2024-09-24 | Apple Inc. | Implementation of biometric authentication |
US11714536B2 (en) | 2021-05-21 | 2023-08-01 | Apple Inc. | Avatar sticker editor user interfaces |
US11776190B2 (en) | 2021-06-04 | 2023-10-03 | Apple Inc. | Techniques for managing an avatar on a lock screen |
US12124770B2 (en) | 2023-08-24 | 2024-10-22 | Apple Inc. | Audio assisted enrollment |
Also Published As
Publication number | Publication date |
---|---|
EP3190563A1 (en) | 2017-07-12 |
US20170323266A1 (en) | 2017-11-09 |
EP3190563A4 (en) | 2018-03-07 |
KR101540544B1 (ko) | 2015-07-30 |
JP2017527917A (ja) | 2017-09-21 |
CN107210949A (zh) | 2017-09-26 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2016036218A1 (ko) | Message service method using a character, user terminal performing the method, and message application including the method | |
WO2023080506A1 (ko) | Augmented reality-based metaverse service device and method for operating the device | |
US7202886B2 (en) | Videophone terminal | |
CN110262715B (zh) | Information processing method and apparatus, computer-readable storage medium, and electronic device | |
CN103442201B (zh) | Enhanced interface for voice and video communication | |
WO2013105760A1 (en) | Contents providing system and operating method thereof | |
CN111050222B (zh) | Virtual item distribution method, apparatus, and storage medium | |
WO2016129910A1 (ko) | Method for providing an advertising service based on a character message service, and device for performing the method | |
WO2023093451A1 (zh) | In-game live-streaming interaction method and apparatus, computer device, and storage medium | |
KR100883352B1 (ko) | Method for expressing emotions and intentions in remote conversation, and real emoticon system therefor | |
CN109785229A (zh) | Blockchain-based intelligent group photo method, apparatus, device, and medium | |
WO2014126331A1 (en) | Display apparatus and control method thereof | |
WO2019221385A1 (ko) | Method for operating a conversation application | |
WO2021167252A1 (ko) | System and method for providing VR content for reducing motion sickness | |
WO2021187647A1 (ko) | Method and system for representing an avatar that imitates a user's motions in a virtual space | |
WO2023128309A1 (ko) | Display control method in a metaverse-based office environment, storage medium on which a program executing the method is recorded, and display control system including the same | |
JP7373599B2 (ja) | System, method, and program for creating a video | |
WO2015142007A1 (ko) | Message display method for displaying extracted messages distinctly in a chat window, and mobile terminal and chat server performing the method | |
JP2005327115A (ja) | Virtual space providing system, virtual space providing server, and virtual space providing method | |
WO2013085166A1 (ko) | Method for providing a soccer game to which a message broadcasting item is applied, soccer game server, soccer game providing system, and recording medium | |
WO2023003141A2 (ko) | Apparatus and method for generating video based on interactive natural language processing | |
WO2024190938A1 (ko) | Method, server, and computer-readable recording medium for providing interaction between game users using screenshots | |
Asiri et al. | The Effectiveness of Mixed Reality Environment-Based Hand Gestures in Distributed Collaboration | |
WO2024075943A1 (ko) | Electronic device for transmitting and receiving messages with a user in a metaverse, and operating method thereof | |
WO2023090565A1 (ko) | Method and apparatus for providing a platform album service | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 15838514; Country of ref document: EP; Kind code of ref document: A1
| ENP | Entry into the national phase | Ref document number: 2017512918; Country of ref document: JP; Kind code of ref document: A
| NENP | Non-entry into the national phase | Ref country code: DE
| REEP | Request for entry into the european phase | Ref document number: 2015838514; Country of ref document: EP
| WWE | Wipo information: entry into national phase | Ref document number: 2015838514; Country of ref document: EP
| WWE | Wipo information: entry into national phase | Ref document number: 15508425; Country of ref document: US