WO2014065551A1 - Device and method for displaying an image in a chat area, and server for managing chat data - Google Patents
- Publication number
- WO2014065551A1, PCT/KR2013/009392, KR2013009392W
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image data
- area
- image
- displayed
- chat
- Prior art date
Links
- 238000000034 method Methods 0.000 title claims abstract description 53
- 238000004891 communication Methods 0.000 claims abstract description 52
- 238000001514 detection method Methods 0.000 claims abstract description 4
- 230000008569 process Effects 0.000 claims description 12
- 238000012217 deletion Methods 0.000 claims description 5
- 230000037430 deletion Effects 0.000 claims description 5
- 238000012545 processing Methods 0.000 claims description 3
- 230000000694 effects Effects 0.000 claims description 2
- 238000013523 data management Methods 0.000 description 55
- 238000010586 diagram Methods 0.000 description 14
- 230000008451 emotion Effects 0.000 description 5
- 238000012986 modification Methods 0.000 description 5
- 230000004048 modification Effects 0.000 description 5
- 230000008859 change Effects 0.000 description 2
- 230000001939 inductive effect Effects 0.000 description 2
- 230000004044 response Effects 0.000 description 2
- 241001465754 Metazoa Species 0.000 description 1
- 230000003213 activating effect Effects 0.000 description 1
- 230000004913 activation Effects 0.000 description 1
- 230000005540 biological transmission Effects 0.000 description 1
- 230000001413 cellular effect Effects 0.000 description 1
- 238000012790 confirmation Methods 0.000 description 1
- 238000005516 engineering process Methods 0.000 description 1
- 230000008921 facial expression Effects 0.000 description 1
- 230000007774 longterm Effects 0.000 description 1
- 238000007726 management method Methods 0.000 description 1
- 230000007246 mechanism Effects 0.000 description 1
- 238000010295 mobile communication Methods 0.000 description 1
- 238000012384 transportation and delivery Methods 0.000 description 1
Images
Classifications
-
- G06Q50/40—
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L51/00—User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail
- H04L51/04—Real-time or near real-time messaging, e.g. instant messaging [IM]
- H04L51/046—Interoperability with other network applications or services
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/0482—Interaction with lists of selectable items, e.g. menus
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04845—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F40/00—Handling natural language data
- G06F40/10—Text processing
- G06F40/166—Editing, e.g. inserting or deleting
- G06F40/169—Annotation, e.g. comment data or footnotes
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F9/00—Arrangements for program control, e.g. control units
- G06F9/06—Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
- G06F9/44—Arrangements for executing specific programs
- G06F9/451—Execution arrangements for user interfaces
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q10/00—Administration; Management
- G06Q10/10—Office automation; Time management
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L51/00—User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail
- H04L51/04—Real-time or near real-time messaging, e.g. instant messaging [IM]
Definitions
- IMS Instant Message Service
- MIM Mobile Instant Messenger
- An emoticon is a compound word of 'emotion' and 'icon', and refers to a pictogram used to convey a user's emotions or intentions in cyberspace.
- These emoticons began with simple images of smiling faces, and their types and forms have expanded to cover facial expressions, occupations, characters, and animals used to convey people's emotions when they chat or send e-mails.
- An emoticon service that expresses the emotions of a chat user through emoticons has been provided to users.
- However, the currently provided emoticon service does not go beyond the simple form of displaying an emoticon in a predetermined area when a user selects one.
- An object of the present invention is to provide an image display device and method for displaying an image in an arbitrary area desired by the user.
- Another object of the present invention is to provide an image display device and method capable of encouraging service participation and use by giving a user participating in a chat control over an image displayed in the chat area, such as deleting it or moving its position.
- the technical problem to be achieved by the present embodiment is not limited to the technical problems as described above, and other technical problems may exist.
- According to an embodiment of the present invention, a device includes a data communication unit for receiving text of an external device through a network, a text display unit for displaying the received text in the chat area, an image retrieval unit for retrieving image data of an image selected through a user interface, a position detection unit for detecting position information of a region selected through the user interface within the chat area, and an image display unit for displaying the retrieved image data in a region corresponding to the detected position information. When the displayed text is moved within the chat area, the displayed image data may also be moved within the chat area.
- Another embodiment of the present invention provides a device including a data communication unit for receiving text of an external device through a network, a text display unit for displaying the received text in the chat area, an image retrieval unit for retrieving image data of an image selected through a user interface, a location detector for detecting location information of a region selected through the user interface within the chat area, an image display unit for displaying the retrieved image data in an area corresponding to the detected location information, and an image deletion unit for no longer displaying the displayed image data when a selection of the area in which the image data is displayed is received through the user interface. When the displayed text is moved within the chat area, the displayed image data is also moved within the chat area.
- Still another embodiment of the present invention provides an image display method including receiving text from an external device via a network, displaying the received text in the chat area, retrieving image data of an image selected through a user interface, detecting location information of a region selected through the user interface within the chat area, displaying the retrieved image data in an area corresponding to the detected location information, and, when the displayed text is moved within the chat area, moving the displayed image data within the chat area as well.
- Still another embodiment of the present invention provides a device including a data communication unit for receiving text, image data, and location information of an external device through a network, a text display unit for displaying the received text in the chat area, and an image display unit for displaying the received image data in an area corresponding to the location information. When the displayed text is moved within the chat area, the displayed image data is also moved within the chat area.
- Another embodiment of the present invention provides a server including a text management unit for receiving text from a first device of a plurality of devices connected via a network and transmitting the received text to a second device, an image data management unit for receiving information about image data from the first device and transmitting the received information to the second device, and a location information management unit for receiving location information associated with the image data from the first device and transmitting the received location information to the second device, wherein the image data is displayed in an area corresponding to the location information in the chat area of the second device.
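The server-side relay described above can be sketched in a few lines. This is an illustrative Python sketch, not the patent's implementation; the class and method names (`ChatDataServer`, `relay`) and the dictionary-based inbox model are assumptions made for clarity.

```python
class ChatDataServer:
    """Minimal sketch of a chat data management server that forwards
    chat data (text, image info, location) from one device to the others."""

    def __init__(self):
        self.inboxes = {}  # device_id -> list of chat data delivered to it

    def register(self, device_id):
        self.inboxes[device_id] = []

    def relay(self, sender_id, chat_data):
        # Forward the sender's chat data to every other connected device.
        for device_id, inbox in self.inboxes.items():
            if device_id != sender_id:
                inbox.append(chat_data)


server = ChatDataServer()
server.register("device_21")
server.register("device_22")
server.relay("device_21", {"text": "hi", "image_id": "A", "position": (120, 340)})
```

After the call, `device_22`'s inbox holds the relayed chat data while the sender's own inbox stays empty, mirroring the first-device-to-second-device flow in the embodiment above.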
- According to the present invention, an image display device and method can be provided that display an image in an arbitrary area desired by the user, and that encourage service participation and use by giving a user participating in a chat control over an image displayed in the chat area, such as deleting it or moving its position.
- FIG. 1 is a configuration of a chat service providing system according to an embodiment of the present invention.
- FIG. 2 is a block diagram of a device 20 according to an embodiment of the present invention.
- FIG. 3 is a diagram for describing an example of an operation of the device 20 of FIG. 2.
- FIG. 4 is a diagram illustrating an example of displaying a notification message by the image display unit 205 of FIG. 2.
- FIG. 5 is a block diagram of a device 50 according to another embodiment of the present invention.
- FIG. 6 is a diagram for describing an example of an operation of the device 50 of FIG. 5.
- FIG. 7 is a block diagram of a device 70 according to another embodiment of the present invention.
- FIG. 8 is a configuration diagram of the chat data management server 10 of FIG. 1.
- FIG. 9 is a view for explaining an example of the operation of the chat data management server 10 of FIG. 1.
- FIG. 10 is a flowchart illustrating an image display method according to an exemplary embodiment.
- FIG. 11 is a flowchart illustrating an image display method according to another exemplary embodiment of the present invention.
- FIG. 12 is a flowchart illustrating a chat data management method according to an embodiment of the present invention.
- a chat service providing system includes a chat data management server 10 and devices 21 to 23.
- The chat service providing system of FIG. 1 is only one embodiment of the present invention, and the present invention is not limited to FIG. 1. That is, according to various embodiments of the present invention, the chat service providing system may be configured differently from FIG. 1.
- the chat service providing system according to an embodiment of the present invention may further include a payment server (not shown) that performs a payment process for the devices 21 to 23.
- the network refers to a connection structure capable of exchanging information between respective nodes such as terminals and servers.
- Examples of such a network include a 3rd generation partnership project (3GPP) network and a long term evolution (LTE) network.
- 3GPP 3rd generation partnership project
- LTE long term evolution
- WiMAX Worldwide Interoperability for Microwave Access
- LAN Local Area Network
- WLAN Wireless Local Area Network
- WAN Wide Area Network
- PAN Personal Area Network
- Bluetooth Bluetooth
- satellite broadcasting networks analog broadcasting networks
- DMB Digital Multimedia Broadcasting
- The first network connecting the chat data management server 10 and the devices 21 to 23 and the second network connecting the device 22 and the device 23 may be different types of networks.
- the first network may be a broadband network and the second network may be a local area network.
- a broadband network is an LTE network
- a local area network is a wireless LAN.
- The chat data management server 10 receives chat data from any one of the plurality of devices 21 to 23 and transmits the received chat data to the others of the plurality of devices 21 to 23. At this time, the chat data management server 10 may receive the chat data through the first network and transmit the received chat data.
- the chat data includes at least one of text, image data and location information associated with the image data.
- The chat data management server 10 receives information (e.g., selection information or identification information) about chat data from the plurality of devices 21 to 23, and may transmit chat data corresponding to the received information to the plurality of devices 21 to 23.
- the chat data management server 10 may receive identification information about the A image data from the device 21 and transmit the A image data to the device 21 based on the received identification information.
- Alternatively, the chat data management server 10 receives information (e.g., selection information or identification information) about chat data from one of the plurality of devices 21 to 23 and transmits the received information to another of the plurality of devices 21 to 23. Then, when that other device requests the chat data using the information, the chat data management server 10 can transmit the chat data to it.
- For example, the chat data management server 10 receives identification information about A image data from the device 21 and passes the received identification information to the device 22; when the device 22 then requests the A image data using the identification information, the server may transmit the A image data to the device 22.
- That is, the chat data management server 10 relays information (e.g., selection information or identification information) about chat data between the devices 21 to 23, and may send the chat data itself to each device only when that device directly requests it.
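The identification-information relay described above can be sketched as a store-then-fetch exchange. This is a minimal Python sketch under assumed names (`upload`, `fetch`); the patent does not prescribe this API.

```python
class ImageRelayServer:
    """Sketch of the server behavior: store image data, relay only its
    identification information, and serve the data on request."""

    def __init__(self):
        self.store = {}  # image_id -> image data

    def upload(self, image_id, data):
        self.store[image_id] = data
        return image_id  # only this identifier is relayed to other devices

    def fetch(self, image_id):
        # A device that received the identifier requests the actual data.
        return self.store.get(image_id)


server = ImageRelayServer()
relayed_id = server.upload("A", b"sticker-bytes")  # from device 21
image = server.fetch(relayed_id)                   # later, by device 22
```

The benefit of this design is that the heavier image payload crosses the network only when a receiving device actually asks for it, while the lightweight identifier is what gets relayed in real time.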
- chat data may be sent directly from device 22 to device 23.
- The chat data may be transmitted directly between the device 22 and the device 23 via the second network connecting them, without passing through the chat data management server 10.
- Each of the devices 21 to 23 displays text and image data input through the user interface in the chat area of the display mounted on the device.
- each of the devices 21 to 23 may display image data in an area corresponding to an arbitrary position selected through the user interface among the chat areas.
- each of the devices 21 to 23 may display image data in an area desired by a user of each of the devices 21 to 23.
- image data is sticker image data.
- Each of the devices 21 to 23 displays text and image data delivered in real time from another device in the chat area.
- each of the devices 21 to 23 may further receive location information about the image data from another device, and display the image data in an area corresponding to the received location information in the chat area.
- each of the devices 21 to 23 may display an image in an arbitrary area desired by a user of each of the devices 21 to 23, or an arbitrary area desired by another user of another device.
- Each of the devices 21 to 23 may control the displayed image data when a control command for the image data displayed in the chat area is input from the user interface or another device. For example, when each of the devices 21 to 23 receives a position shift command as a control command, it may shift the position of the displayed image data. As another example, when each of the devices 21 to 23 receives a selection (e.g., a click or a touch) of the displayed image, it may stop displaying the displayed image data. As such, each of the devices 21 to 23 may encourage participation and use of the service by giving a user who participates in a real-time chat control over an image displayed in the chat area, such as deleting it or moving its position.
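The two control commands just described (position shift, and select-to-remove) can be sketched as a small dispatcher. The command dictionary shape and the name `apply_control` are illustrative assumptions, not the patent's interface.

```python
def apply_control(displayed, command):
    """Apply a control command to image data shown in the chat area.

    displayed: dict mapping image name -> (x, y) position.
    'move' shifts the image's position; 'select' (a click or touch on the
    image) causes it to no longer be displayed.
    """
    target = command["image"]
    if command["type"] == "move":
        displayed[target] = command["to"]
    elif command["type"] == "select":
        displayed.pop(target, None)  # selected image is removed from display
    return displayed


state = {"sticker_A": (10, 20)}
apply_control(state, {"type": "move", "image": "sticker_A", "to": (50, 60)})
after_move = dict(state)
apply_control(state, {"type": "select", "image": "sticker_A"})
```

After the move command the sticker sits at (50, 60); after the selection it is gone from the display state entirely.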
- each of the devices 21 to 23 may be various types of devices.
- the device may be a TV device, a computer or a portable terminal capable of connecting to a remote server via a network.
- a TV device includes a smart TV, an IPTV set-top box, and the like
- An example of a computer is a laptop, a desktop, or the like equipped with a web browser.
- An example of a terminal is a wireless communication device that guarantees portability and mobility, and includes a personal communication system (PCS), a global system for mobile communications (GSM), a personal digital cellular (PDC), a personal handyphone system (PHS), and a personal digital assistant (PDA).
- PCS personal communication system
- GSM global system for mobile communications
- PDC personal digital cellular
- PHS personal handyphone system
- PDA personal digital assistant
- IMT International Mobile Telecommunication
- IMT International Mobile Telecommunication
- CDMA Code Division Multiple Access
- W-CDMA Wideband Code Division Multiple Access
- and all kinds of handheld wireless communication devices such as smartphones and tablet PCs may be included.
- FIG. 2 is a block diagram of a device 20 according to an embodiment of the present invention.
- a device 20 refers to any one of the devices 21 to 23 shown in FIG.
- the present invention is not limited to the form of the devices 21 to 23 shown in FIG. 1.
- the device 20 includes a data communication unit 201, a text display unit 202, an image search unit 203, a location detector 204, an image display unit 205, and a user interface 206.
- the device 20 shown in FIG. 2 is just one implementation example of the present invention, and various modifications are possible based on the components shown in FIG. 2.
- the device 20 may further include a voice output device for outputting a voice and a display for outputting an image.
- the data communication unit 201 receives chat data of the external device 30 through a network. At this time, the data communication unit 201 may receive the chat data from the chat data management server 10. In general, chat data includes at least one of text, image data, and location information associated with image data. In addition, the data communication unit 201 may transmit the chat data to the external device 30 connected through the network. In general, the external device 30 may be any one of the devices 21 to 23 shown in FIG. 1, but is not limited to the form of the devices 21 to 23 shown in FIG. 1.
- the data communication unit 201 receives chat data from the chat data management server 10. At this time, the chat data management server 10 receives information (eg, identification information or selection information) about chat data having the device 20 as a destination from the external device 30 and uses the received information. To transmit the chat data to the data communication unit 201.
- information eg, identification information or selection information
- The data communication unit 201 may receive identification information on chat data from the chat data management server 10 and then receive the chat data itself from the chat data management server 10 using the received identification information. At this time, the identification information for the chat data may be transmitted from the external device 30 to the chat data management server 10. That is, the chat data management server 10 stores the chat data while exchanging only the identification information with the devices, and may transmit the requested chat data when a specific device requests it through the identification information.
- the data communication unit 201 may directly transmit and receive chat data with the external device 30 without passing through the chat data management server 10.
- the text display unit 202 displays the text in the chat area.
- the text is text received from the external device 30 or text input from the user interface 206.
- the chat area is an area displayed on the display of the device 20 and is a graphic area displaying text input from the user interface 206 or received from the external device 30.
- The text display unit 202 may display 'no' 413, which is text input from the user interface 206, in the chat area 412 displayed on the display 411 of the device 20, and may display the text received from the external device 30, 'Busy?' 414 or 'Slimp' 415, in the form of speech bubbles, as shown by reference numeral 41.
- The image retrieval unit 203 retrieves image data of the image selected through the user interface 206. Specifically, when the image display unit 205 receives a command for calling images from the user interface 206 and displays a plurality of images in a predetermined area of the display of the device 20, the image retrieval unit 203 may retrieve the image data of the image selected through the user interface 206 from among the displayed images. In this case, the plurality of images may be arranged in order of the time at which each was last used. For example, the most recently used image may be displayed in the first column of the first row of the image list, ahead of the other images. To this end, the user interface 206 can provide the user with an image list arranged sequentially according to the time at which each image was used.
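The most-recently-used ordering described above amounts to sorting images by their last-use time, descending. A minimal Python sketch (the function name and the dict-of-timestamps input are assumptions for illustration):

```python
def arrange_images(last_used):
    """Return image ids ordered so the most recently used image comes
    first (i.e., lands in the first column of the first row of the list).

    last_used: dict mapping image id -> last-use timestamp.
    """
    return [img for img, _ in
            sorted(last_used.items(), key=lambda kv: kv[1], reverse=True)]


# 'smile' has the latest timestamp, so it is listed first.
order = arrange_images({"heart": 3, "smile": 9, "star": 5})
```

Rendering the resulting list row by row then reproduces the layout described above, with the freshest image in the top-left position.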
- the image retrieval unit 203 retrieves information (eg, identification information or selection information) of image data of the selected image through the user interface 206.
- the device 20 may store information of the image data, and receive image data corresponding to the information from the chat data management server 10 as necessary.
- the selected image is a first type of image.
- the first type of image refers to an image in which the area where the image is displayed is an area corresponding to the detected position information.
- An image of the first type is distinguished from an image of the second type, wherein the area in which the image is displayed is an area determined by a default value.
- one example of the image of the first type is a sticker type image
- one example of the image of the second type is an image of an emoticon type. Since the sticker type image is displayed in the area selected through the user interface 206, the sticker type image may be displayed in any area desired by the user. On the other hand, the image of the emoticon type can be displayed only in the area determined by the default value.
- the first type of image is displayed by the image display unit 205 in a region corresponding to the detected position information.
- the second type of image is displayed by the image display unit 205 in the area determined by the default value.
- Examples of the area determined by the default value include an area within a predetermined distance from the area where the latest text is displayed, an area to the left or right of the latest text, and the area where the latest text is displayed, reduced or enlarged.
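The placement difference between the two image types can be sketched as a single branch: sticker-type (first type) images go wherever the user selected, while emoticon-type (second type) images fall back to a default area. The function name, the tuple layout `(x, y, w, h)`, and the specific default rule (right of the latest text, one of the options listed above) are illustrative assumptions.

```python
def display_area(image_type, selected_pos=None, latest_text_area=None):
    """Return the (x, y) position where the image should be displayed.

    First-type ('sticker') images use the position the user selected
    through the user interface; second-type ('emoticon') images use a
    default area, here chosen as just to the right of the latest text.
    """
    if image_type == "sticker":
        return selected_pos
    x, y, w, h = latest_text_area  # area of the latest displayed text
    return (x + w, y)              # default: to the right of that text


sticker_pos = display_area("sticker", selected_pos=(50, 80))
emoticon_pos = display_area("emoticon", latest_text_area=(10, 200, 120, 30))
```

The sticker lands exactly on the user's selection, while the emoticon ends up at the default spot computed from the latest text's area.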
- the image search unit 203 searches for image data of the selected image 423 through a user interface among a plurality of images.
- the plurality of images 423-425 are displayed in the activated area 421 in response to a call command input from the user interface 206.
- the chat area changes from area 412 to area 422.
- the image 423 of the plurality of images 423 to 425 is the most recently used image and may be displayed in the leftmost column of the uppermost row in comparison with the other images 424 and 425.
- the image retrieval unit 203 retrieves image data from a database (not shown).
- the database (not shown) stores the image data input from the user interface 206 and the image data input from the external device 30.
- a database may store information of the image data, and the image search unit 203 may receive image data corresponding to the information from the chat data management server 10 as necessary.
- a database (not shown) may store text and location information in addition to image data.
- An example of such a database (not shown) includes a hard disk drive, a read only memory (ROM), a random access memory (RAM), a flash memory, a memory card, or the like existing inside or outside the device 20.
- the user of the device 20 selects an area to display the image via the user interface 206.
- the user may select an area by placing his / her finger in a predetermined point area within the chat area through the touch screen interface, which is the user interface 206.
- the point area may be specified as one pixel or at least two pixels.
- the user may select an area by moving his / her finger through the touch screen interface, which is the user interface 206, to specify a predetermined area of circles, triangles, or quadrangles.
- the surface area may be specified as one pixel or at least two pixels.
- the image display unit 205 displays a notification message 431 on the display of the device 20 informing the user to select an area, such as 'Please select a part to attach a sticker'.
- the notification message 431 may not be displayed.
- the location detector 204 detects location information of the area selected through the user interface 206 in the chat area.
- the location information may be coordinate information specifying the selected area.
- the location information may be coordinate information of at least one pixel in the area selected through the user interface 206.
- Such coordinate information may be an identification number specifying each of a plurality of pixels as a single numerical value, 2D coordinates including x-axis and y-axis positions, or 3D coordinates including x-axis, y-axis, and z-axis positions.
- the selected region may be any one of the cell regions selected through the user interface 206 among at least two cell regions separated by the division line.
- the chat area is divided into at least two cell areas through a division line.
- the location detector 204 may detect location information of any one selected cell area.
- The location information may be coordinate information, and the coordinate information may be an identification number specifying each of a plurality of cells as a single numerical value, 2D coordinates including x-axis and y-axis positions, or 3D coordinates including x-axis, y-axis, and z-axis positions.
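Both cell encodings mentioned above (a single identification number, and 2D coordinates) can be derived from a touched pixel with integer division. This is an illustrative Python sketch; the function name and grid parameters are assumptions.

```python
def cell_of(x, y, cell_w, cell_h, cols):
    """Map a touched pixel (x, y) in the chat area to the cell it falls in.

    Returns both encodings of the location information described above:
    a single identification number, and the (col, row) 2D coordinate.
    """
    col = x // cell_w
    row = y // cell_h
    return row * cols + col, (col, row)


# A touch at pixel (95, 130) on a grid of 40x60-pixel cells, 8 per row:
cell_id, coord = cell_of(x=95, y=130, cell_w=40, cell_h=60, cols=8)
```

Here the touch lands in column 2, row 2, so the single-number identification is 2 * 8 + 2 = 18; either value can serve as the location information transmitted with the image data.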
- The image display unit 205 may display a division line separating at least two cells in the chat area. For example, as indicated by reference numeral 44 of FIG. 3, the image display unit 205 may display a division line 433 separating at least two cells in the chat area.
- the area selected for displaying the image may overlap all or part of the area where the text is displayed.
- the image displayed in the area selected through the user interface 206 may cover all or part of the area where text is displayed.
- The image display unit 205 may apply an emphasis effect to previously displayed text. For example, when the selected area is within a predetermined distance from text previously displayed in the chat area, or when the selected area overlaps all or part of that text, the image display unit 205 may periodically flash the previously displayed text or its speech bubble, change the letter color of the previously displayed text, or change the color of its speech bubble.
- The location detector 204 detects location information of the area 441 selected through the user interface 206.
- the user may select an area by placing his finger 442 in a predetermined point area within the chat area through the touch screen interface, which is the user interface 206.
- the location information may be location information of a point area specified by the user, or location information of any one cell area among the at least two cells divided by the division line 433.
- the image display unit 205 displays the retrieved image data in the area corresponding to the detected position information.
- the detected location information refers to location information of the area selected through the user interface 206
- the area corresponding to the location information refers to an area in which image data is to be displayed.
- in general, the area selected through the user interface 206 is distinguished from the area corresponding to the location information.
- the area selected via the user interface 206 may be specified as a predetermined circular area within the chat area, while the area corresponding to the location information may be specified as a predetermined rectangular region defined by the center of the circular area, the x-axis extent of the image data, and the y-axis extent of the image data.
- the area selected through the user interface 206 and the area corresponding to the location information may be the same.
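One way to read the mapping above (a sketch; the specific geometry is an illustrative assumption): the selected circle contributes only its center point, and the display rectangle is sized by the image data itself:

```python
# Hypothetical sketch: derive the rectangular display area from the center
# of the circular selected area plus the x/y extents of the image data.

def display_rect_from_selection(center, image_w, image_h):
    """Return (left, top, right, bottom) of the rectangle in which the
    image data would be displayed, centered on the selected point."""
    cx, cy = center
    return (cx - image_w / 2, cy - image_h / 2,
            cx + image_w / 2, cy + image_h / 2)
```

When the selected area and the display area are the same (the case the text also allows), this conversion is simply the identity.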
- the area in which the image data is displayed may overlap all or part of the area in which the text is displayed.
- text is displayed in the chat area by the text display unit 202
- image data is displayed in the chat area by the image display unit 205, but is not limited thereto.
- the displayed image data is also moved within the chat area.
- the first to seventh text is displayed in the chat area
- the first image is displayed in the chat area
- when the eighth text is input from the user interface 206 or the external device 30 and displayed, the first text is no longer displayed in the chat area, the second to seventh texts are moved upward in the chat area, and the first image is also moved upward in the chat area.
- the area in which the first to eighth texts are displayed is determined by a default value, while the area in which the first image is displayed may be arbitrarily determined by the user interface 206.
- the first to eighth texts may be displayed, not displayed or moved by the text display unit 202, and the image data may be displayed, not displayed or moved by the image display unit 205.
- a control signal may be transmitted and received between the text display unit 202 and the image display unit 205.
- the first image refers to an image of a sticker type that is positioned by a user.
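The scrolling behavior described above can be sketched as follows (a hedged illustration; the coordinate values, line height, and data structures are assumptions): text entries occupy default slots, the sticker keeps its user-chosen position, and both shift by the same vertical offset when a new text pushes the chat upward.

```python
class ChatArea:
    """Minimal sketch: texts occupy default positions; sticker images keep
    user-chosen positions, but both move together when the chat scrolls."""

    def __init__(self, line_height=30):
        self.line_height = line_height
        self.texts = []      # list of (text, y)
        self.stickers = []   # list of (image_id, x, y)

    def add_sticker(self, image_id, x, y):
        self.stickers.append((image_id, x, y))

    def add_text(self, text, bottom_y=300):
        # New text appears at the bottom; everything already shown moves up,
        # including any sticker-type images positioned by the user.
        self.texts = [(t, y - self.line_height) for t, y in self.texts]
        self.stickers = [(i, x, y - self.line_height)
                         for i, x, y in self.stickers]
        self.texts.append((text, bottom_y))
```

In a real implementation the text display unit and image display unit would exchange a control signal to coordinate this shared offset, as the source notes.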
- the image display unit 205 displays the retrieved image data in an area 451 corresponding to the detected position information.
- the region 451 corresponding to the location information may overlap, and thus cover, all of the region where the text 415 is displayed.
- the text 'No' 413, which is input from the user interface 206, is displayed in an area determined by a default value on the right side of the chat area.
- the text 'no okay' 413 input from the user interface 206 is, as the chat progresses, moved from the first area determined by the default value on the right side of the chat area to a second area (e.g., above the first area).
- the text received from the external device 30, 'Busy?' 414 or 'Blood' 415, is likewise displayed in an area determined by a default value on the left side of the chat area and, as the chat progresses, is moved from the third area determined by the default value to a fourth area (e.g., above the third area).
- image data may be displayed in any area 451 determined by the user.
- the image display unit 205 may move the displayed image data. Specifically, when the position detector 204 receives a movement command for the displayed image data through the user interface 206 and detects the location information of the destination area to which the displayed image data is to be moved based on the input movement command, the image display unit 205 may move the displayed image data to the area corresponding to the detected location information of the destination area.
- An example of the move command is as follows: the displayed image data is activated into a state in which the user can move it, and the activated image data is then moved to the destination area by the user. The displayed image data may be activated, for example, by the user touching a finger within the area of the displayed image data for several seconds.
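The activate-then-drag interaction above can be sketched like this (a hypothetical illustration; the hold duration, which the source gives only as "several seconds", and the class names are assumptions):

```python
class StickerMover:
    """Sketch of the move command: a sticker is activated by holding a
    touch within its area past a hold threshold, then dragged to the
    destination area detected by the position detector."""

    HOLD_SECONDS = 2.0  # "several seconds" in the source; exact value assumed

    def __init__(self, position):
        self.position = position
        self.active = False

    def touch(self, held_for):
        """Activate the sticker if the touch was held long enough."""
        if held_for >= self.HOLD_SECONDS:
            self.active = True
        return self.active

    def drag_to(self, destination):
        """Move the sticker only while it is activated; a drop deactivates."""
        if self.active:
            self.position = destination
            self.active = False
        return self.position
```

A short tap therefore leaves the sticker in place, which keeps the move gesture from colliding with the tap-to-delete gesture described later.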
- the data communication unit 201 transmits image data and location information to the external device 30.
- when the image retrieval unit 203 retrieves the image data of the image selected through the user interface 206, and the position detection unit 204 detects the location information of the area selected through the user interface 206 within the chat area, the retrieved image data and the detected location information may be transmitted to the external device 30.
- while the image display unit 205 displays the retrieved image data in the area corresponding to the detected location information, the data communication unit 201 may also transmit the retrieved image data and the detected location information to the external device 30.
- the external device 30 displays the image data received in the area corresponding to the detected location information in the chat area of the external device 30.
- the image display unit 205 displays external image data received from the external device 30.
- the data communication unit 201 may receive external image data and external location information from the external device 30, and the image display unit 205 may display the external image data in an area corresponding to the external location information in the chat area.
- the external location information refers to location information of the area selected through the user interface of the external device 30.
- an external user may select an area by placing a finger on a predetermined point area within a chat area through the touch screen interface, which is the user interface of the external device 30, or may select an area by moving a finger across the touch screen interface of the external device 30 to specify a predetermined surface area in the shape of a circle, a triangle, or a square.
- the external location information may be an identification number specifying each of a plurality of pixels as a single numerical value, 2D coordinates including x- and y-axis positions, or 3D coordinates including x-, y-, and z-axis positions.
- the image display unit 205 may display, on the display of the device 20, a notification message indicating that the external image data or the external location information has been received.
- FIG. 4 is a diagram illustrating an example of displaying a notification message by the image display unit 205 of FIG. 2.
- when the data communication unit 201 receives a sticker image, which is external image data, from the external device 30 of user A, a notification message 'A has a sticker' 46, indicating that the sticker image has been received from user A, may be displayed on the display of the device 20.
- the notification message may be displayed in the form of a push message on the lock screen on which the chat area is not activated.
- the user interface 206 refers to a tool or device that receives a control command from a user.
- the user interface 206 may be a physical input device such as a keyboard, a mouse, a touch screen, or the like, or may be a graphical user interface (GUI) expressed in an image output device.
- the user interface 206 can provide a tool for modifying image data displayed in the chat area.
- the user interface 206 can provide a graphical editing tool for modifying image data selected by the user interface 206 or image data received from the external device 30.
- the image data selected by the user interface 206 or the image data received from the external device 30 may be image data of a sticker image type.
- the user interface 206 may provide a tool for generating image data.
- the user interface 206 may provide a graphical editing tool for generating image data from an original image received from an external source, or for generating image data from a captured image captured on the display of the device 20.
- the image data generated from the original image or the image data generated from the captured image may be image data of a sticker image type.
- the device 50 refers to any one of the devices 21 to 23 illustrated in FIG. 1, but is not limited to the form of the devices 21 to 23 of FIG. 1.
- the device 50 may include a data communication unit 501, a text display unit 502, an image search unit 503, a position detector 504, an image display unit 505, a user interface 506, an image deleting unit 507, and a payment processor 508.
- the data communication unit 501, the text display unit 502, the image search unit 503, the position detector 504, the image display unit 505, and the user interface 506 correspond respectively to the data communication unit 201, the text display unit 202, the image search unit 203, the position detector 204, the image display unit 205, and the user interface 206 of the device 20 of FIG. 2.
- the description given above of each of the data communication unit 201 to the user interface 206 of the device 20 of FIG. 2 applies mutatis mutandis to each of the data communication unit 501 to the user interface 506 of FIG. 5.
- matters not described with respect to the external device 60 may be understood from the content described with respect to the device 30, and the content described with respect to the external device 60 may likewise apply to the device 30.
- the device 50 shown in FIG. 5 is only one implementation example of the present invention, and various modifications are possible based on the components shown in FIG. 5.
- the device 50 may further include a voice output device for outputting voice and a display for outputting an image.
- the data communication unit 501 receives chat data of the external device 60 through a network.
- the chat data includes at least one of text, image data, and location information associated with the image data.
- the text display unit 502 displays the text in the chat area.
- the text is text received from the external device 60 or text input from the user interface 506.
- the chat area is an area displayed on the display of the device 50 and means a graphic area displaying text input from the user interface 506 or received from the external device 60.
- the image retrieval unit 503 retrieves image data of the selected image through the user interface 506.
- when the image display unit 505 receives a command for calling images from the user interface 506 and displays a plurality of images in a predetermined area of the display of the device 50, the image search unit 503 may retrieve image data of an image selected through the user interface 506 from among the plurality of images.
- the location detector 504 detects location information of the area selected through the user interface 506 within the chat area. The user can select an area via the user interface 506.
- the image display unit 505 displays the retrieved image data in an area corresponding to the detected position information.
- the detected location information refers to location information of the area selected through the user interface 506, and an area corresponding to the location information refers to an area in which image data is to be displayed.
- the area selected through the user interface 506 is distinguished from the area corresponding to the location information.
- the area selected through the user interface 506 and the area corresponding to the location information may be the same.
- the user interface 506 refers to a tool or device that receives a control command from a user.
- the user interface 506 may be a physical input device such as a keyboard, a mouse, or a touch screen, or may be a graphical user interface (GUI) presented on an image output device.
- the image deleting unit 507 may non-display the displayed image data. For example, when the user positions (or clicks or touches) a finger on the displayed image data through the touch screen, which is the user interface 506, the image deleting unit 507 may non-display the displayed image data.
- the selection of the area in which the image data is displayed is one example of a user command instructing non-display of the displayed image data. According to various embodiments of the present disclosure, the user command instructing non-display of the image data may instead be the user selecting an icon displayed on the display or selecting a hardware button of the device 50.
- the operation of the image deleting unit 507 is performed by the image display unit 505.
- FIG. 6 is a diagram for describing an example of an operation of the device 50 of FIG. 5.
- FIG. 6 illustrates an operation of the image deleting unit 507 through reference numerals 46 and 47. When the user's finger 461 is positioned on the displayed image data 451, the touch screen, which is the user interface 506, may recognize that the user's finger 461 is positioned on the displayed image data, and the displayed image data 451 may then be non-displayed.
- Reference numeral 47 denotes that the displayed image data 451 is not displayed.
- the image display unit 505 may re-display the non-displayed image data 451 when the user deactivates the chat area and then activates it again, that is, when the user exits the chat area and enters it again.
- the image deleting unit 507 may non-display at least one or more image data previously displayed in the chat area.
- the image non-display command refers to a user command instructing non-display of all of the at least one piece of image data previously displayed in the chat area.
- Examples of such an image non-display command include the user selecting any one or more pieces of the previously displayed image data, the user selecting a non-display icon displayed on the display, or the user selecting a hardware non-display button of the device 50.
- the operation of the image deleting unit 507 may be performed by the image display unit 505.
- the data communication unit 501 may transmit a signal indicating that the image data is not displayed to the external device 60.
- the signal notifying the non-display may include identification information for identifying the non-displayed image data; such identification information may be the location information of the non-displayed image data.
- the external device 60 may not display at least one or more image data displayed on the external device 60 with reference to a signal indicating that the display is not displayed.
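The propagation of a non-display event can be sketched as follows (a hypothetical illustration; the message fields are assumptions, but the choice of location information as the identification information follows the source):

```python
def build_non_display_signal(location):
    """Signal notifying non-display; the non-displayed image data is
    identified by its location information, per the source."""
    return {"type": "non_display", "location": location}

def apply_non_display_signal(displayed, signal):
    """On the external device, drop the image whose location matches the
    identification information carried by the signal."""
    return [img for img in displayed if img["location"] != signal["location"]]
```

Using location information as the identifier works because, in this scheme, each displayed sticker occupies a user-selected area that distinguishes it from the others.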
- the payment processor 508 performs at least one payment process corresponding to the image data.
- An example of the payment process is a payment process for purchasing image data and paying for the purchased image data.
- the payment process may be performed as follows: when the user inputs a purchase request for the image data through the user interface 506, a payment request message for the image data is transmitted from the device 50 to a payment server (not shown) or the chat data management server 10, and a payment completion message for the image data is received from the payment server (not shown) or the chat data management server 10.
- another example of the payment process is a presenting process for presenting image data to another user. In the presenting process, when the user inputs a request to present image data to the external device 60 through the user interface 506, a payment request message for the image data is transmitted from the device 50 to the payment server (not shown) or the chat data management server 10, a payment completion message for the image data is received from the payment server (not shown) or the chat data management server 10, and the image data is transmitted to the external device 60.
- examples of the payment process may be variously determined.
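The two variants of the payment process above can be sketched in one place (a hedged illustration; the message names, `send`/`receive` callables, and control flow are assumptions, not a definitive protocol):

```python
def purchase_flow(send, receive, image_id):
    """Purchase variant: send a payment request for the image data and
    wait for a payment completion message from the payment server or the
    chat data management server (message names are assumed)."""
    send({"type": "payment_request", "image": image_id})
    reply = receive()
    return reply.get("type") == "payment_complete"

def gift_flow(send, receive, image_id, recipient):
    """Presenting (gift) variant: only after payment completes is the
    image data transmitted to the external device of the recipient."""
    if purchase_flow(send, receive, image_id):
        send({"type": "image_data", "image": image_id, "to": recipient})
        return True
    return False
```

Keeping the gift variant layered on the purchase variant mirrors the source's description: the two flows differ only in the final transmission of the image data to the external device.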
- FIG. 7 is a block diagram of a device 70 according to another embodiment of the present invention.
- the device 70 may correspond to the external device 60 of FIG. 5, and the external device 80 may correspond to the device 50 of FIG. 5. Therefore, matters not described with respect to the device 70 in FIG. 7 may be understood from the content described with respect to the device 30 of FIG. 2 or the external device 60 of FIG. 5.
- the device 70 includes a data communication unit 701, a text display unit 702, an image display unit 703, a user interface 704, and an image delete unit 705.
- the device 70 shown in FIG. 7 is only one implementation example of the present invention, and various modifications are possible based on the elements shown in FIG. 7.
- the device 70 may further include components that perform the operation of the image retrieval unit 503 of the device 50, the operation of the position detector 504 of the device 50, and the operation of the payment processor 508 of the device 50.
- the device 70 may perform the operations of the device 20 or the device 50 in addition to the operations of the external device 60. Accordingly, the data communication unit 701, the text display unit 702, the image display unit 703, the user interface 704, and the image deleting unit 705 of the device 70 may perform the operations of the data communication unit 201, the text display unit 202, the image display unit 205, the user interface 206, and the image deleting unit 207 of the device 20, or the operations of the data communication unit 501, the text display unit 502, the image display unit 505, the user interface 506, and the image deleting unit 507 of the device 50.
- the device 70 may further include components that perform the operation of the image search unit 203 or the image search unit 503, the operation of the position detector 204 or the position detector 504, and the operation of the payment processor 508.
- the data communication unit 701 receives chat data of the external device 80 through a network.
- the chat data includes at least one of text, image data, and location information associated with the image data.
- the data communication unit 701 may transmit chat data to the external device 80 through the network.
- the data communication unit 701 may receive chat data from the external device 80 via the chat data management server 10, or may receive chat data directly from the external device 80 without passing through the chat data management server 10. In addition, the data communication unit 701 may transmit and receive information about the chat data (for example, identification information, selection information, or an identification code) with the chat data management server 10.
- the text display unit 702 displays the text in the chat area.
- the text is text received from the external device 80 or text input from the user interface 704.
- the chat area is an area displayed on the display of the device 70 and refers to a graphic area displaying text input from the user interface 704 or received from the external device 80.
- the image display unit 703 displays the received image data in an area corresponding to the received position information.
- the received location information refers to location information of an area selected through the user interface of the external device 80, and an area corresponding to the location information refers to an area in which the received image data is to be displayed.
- the user interface 704 refers to a tool or device that receives a control command from a user.
- the user interface 704 may be a physical input device such as a keyboard, a mouse, a touch screen, or the like, or may be a graphical user interface (GUI) expressed in an image output device.
- the image deleting unit 705 may not display the displayed image.
- the operation of the image deleting unit 705 may be performed by the image display unit 703.
- the chat data management server 10 includes a text managing unit 101, an image data managing unit 102, and a location information managing unit 103.
- the chat data management server 10 shown in FIG. 8 is just one implementation example of the present invention, and various modifications are possible based on the elements shown in FIG. 8.
- the chat data management server 10 may further include an administrator interface for controlling each configuration inside the chat data management server 10.
- the text manager 101 receives text from the first device 81 of a plurality of devices connected through a network, and transmits the received text to the second device 82.
- the image data manager 102 receives image data from the first device 81 and transmits the received image data to the second device 82.
- the image data manager 102 may receive information about image data (for example, identification information or selection information) from the first device 81 and use the received information to transmit the image data to the second device 82, or may transmit the received information itself to the second device 82 so that the image data is delivered to the second device 82.
- the location information manager 103 receives location information associated with the image data from the first device 81 and transmits the received location information to the second device 82.
- the image data may be displayed in an area corresponding to the location information in the chat area of the first device.
- the image data may be displayed in an area corresponding to the location information in the chat area of the second device.
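The three managing units of the chat data management server 10 described above can be sketched as a minimal relay (a hedged illustration; the class and method names and the outbox data structure are assumptions):

```python
class ChatDataManagementServer:
    """Minimal relay sketch: text, image data (or information about it),
    and the associated location information are each received from the
    first device and forwarded to the second device."""

    def __init__(self):
        self.outbox = {}  # device_id -> list of forwarded messages

    def _forward(self, to_device, kind, payload):
        self.outbox.setdefault(to_device, []).append({kind: payload})

    def on_text(self, from_device, to_device, text):          # text manager 101
        self._forward(to_device, "text", text)

    def on_image_data(self, from_device, to_device, info):    # image data manager 102
        self._forward(to_device, "image", info)

    def on_location(self, from_device, to_device, location):  # location info manager 103
        self._forward(to_device, "location", location)
```

The second device can then display the forwarded image data in the area of its chat area corresponding to the forwarded location information, as the source states.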
- FIG. 9 is a diagram for explaining an example of the operation of the chat data management server 10 of FIG. 8. Since the operation of the chat data management server 10 in FIG. 9 represents only one of the various embodiments of the present invention, the embodiments of the present invention are not limited to FIG. 9 and the description thereof.
- the chat data management server 10 may transmit and receive a signal for activating the chat area with the first device 81 and the second device 82.
- the chat data management server 10 receives a chat request from the first device 81 or the second device 82 and, in response to the received chat request, transmits a chat area activation response to the first device 81 and the second device 82.
- the chat data management server 10 transmits the text received from the second device 82 to the first device 81 (S904).
- the first device 81 displays the text received from the chat data management server 10 in the chat area of the first device 81 (S905), retrieves image data of the image selected through the user interface of the first device 81 (S906), detects the location information of the area selected through the user interface of the first device 81 (S907), displays the retrieved image data in the area corresponding to the location information (S908), and transmits the retrieved image data and the detected location information to the chat data management server 10 (S909).
- the chat data management server 10 receives the image data (or information about the image data) and the location information associated with the image data from the first device 81 (S909), and transmits the received image data (or information about the image data) and the received location information to the second device 82 (S910).
- the second device 82 displays the received image data in an area corresponding to the received location information (S910).
- when the first device 81 receives a selection of displayed image data through the user interface of the first device 81 (S912), the first device 81 non-displays the image data in the chat area of the first device 81 (S913) and transmits, to the chat data management server 10, a notification signal informing that the image data is not displayed (S914).
- the chat data management server 10 receives a notification signal informing that the image data is not displayed from the first device 81 (S914), and transmits the received notification signal to the second device 82 (S915).
- the second device 82 does not display an image displayed in the chat area of the second device 82 based on the received notification signal.
- each of the first device 81 and the second device 82 of FIG. 8 may be any one of the devices 21 to 24, the device 20, the device 30, the device 50, the external device 60, the device 70, or the external device 80 described with reference to FIGS. 1 to 7.
- FIG. 10 is a flowchart illustrating an image display method according to an exemplary embodiment.
- the image display method according to the embodiment shown in FIG. 10 includes steps processed in time series in the device 20 according to the embodiment shown in FIG. 2 or the device 50 according to the embodiment shown in FIG. 5. Therefore, even if omitted below, the contents described above with respect to the device 20 according to the embodiment shown in FIG. 2 or the device 50 according to the embodiment shown in FIG. 5 also apply to the image display method according to the embodiment shown in FIG. 10.
- the following steps are described as being performed by each component of the device 50 of FIG. 5, but may also be performed by each component of the device 20 of FIG. 2.
- the data communication unit 501 receives a text of an external device through a network.
- the text display unit 502 displays the received text in the chat area.
- the image retrieval unit 503 retrieves image data of the selected image through the user interface 506.
- the location detector 504 detects location information of the area selected through the user interface 506 within the chat area.
- the image display unit 505 displays the retrieved image data in an area corresponding to the detected position information.
- when the displayed text is moved in the chat area, the image display unit 505 also moves the displayed image data in the chat area.
- the image display method may further include: non-displaying the displayed image data when a selection of the area in which the image data is displayed is received through the user interface 506; displaying on the display of the device a notification message indicating that external image data has been received; and performing at least one payment process corresponding to the image data.
- FIG. 11 is a flowchart illustrating an image display method according to another exemplary embodiment of the present invention.
- the image display method according to the embodiment shown in FIG. 11 includes steps processed in time series in the device 70 according to the embodiment shown in FIG. 7. Therefore, although omitted below, the above descriptions of the device 70 according to the exemplary embodiment shown in FIG. 7 also apply to the image display method according to the exemplary embodiment illustrated in FIG. 11.
- the data communication unit 701 receives text, image data, and location information of the external device 80 through a network.
- the text display unit 702 displays the received text in the chat area.
- the image display unit 703 displays the received image data in an area corresponding to the received position information.
- when the displayed text is moved in the chat area, the image display unit 703 also moves the displayed image data in the chat area.
- the image display method may further include non-displaying the displayed image data when the data communication unit 701 receives, from the external device 80, a signal indicating that the image data is not displayed.
- the chat data management method according to the embodiment shown in FIG. 12 includes steps that are processed in time series in the chat data management server 10 according to the embodiment shown in FIG. 8. Therefore, even if omitted below, the above descriptions of the chat data management server 10 according to the exemplary embodiment shown in FIG. 8 may also be applied to the chat data managing method according to the exemplary embodiment illustrated in FIG. 12.
- the text manager 101 receives text from the first device 81 of the plurality of devices connected through the network, and transmits the received text to the second device 82.
- the image data manager 102 receives image data (or information about the image data) from the first device 81 and transmits the received image data (or information about the image data) to the second device 82.
- the location information manager 103 receives location information associated with the image data from the first device 81 and transmits the received location information to the second device 82 (S1203).
- the image data may be displayed in an area corresponding to the location information in the chat area of the second device 82.
- the image data may be displayed in an area corresponding to the location information in the chat area of the first device 81.
- Each of the image display method or the chat data management method described with reference to FIGS. 10, 11 or 12 may also be implemented in the form of a recording medium including instructions executable by a computer, such as a program module executed by the computer.
- Computer readable media can be any available media that can be accessed by a computer and includes both volatile and nonvolatile media, removable and non-removable media.
- computer readable media may include both computer storage media and communication media.
- Computer storage media includes both volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer readable instructions, data structures, program modules or other data.
- Communication media typically includes computer readable instructions, data structures, program modules, or other data in a modulated data signal such as a carrier wave, or other transmission mechanism, and includes any information delivery media.
Claims (16)
- A device for displaying an image in a chat area in which text is displayed, the device comprising: a data communication unit configured to receive text of an external device via a network; a text display unit configured to display the received text in the chat area; an image search unit configured to retrieve image data of an image selected through a user interface; a location detection unit configured to detect location information of an area selected through the user interface within the chat area; and an image display unit configured to display the retrieved image data in an area corresponding to the detected location information, wherein, when the displayed text is moved within the chat area, the displayed image data is also moved within the chat area.
- The device of claim 1, wherein the area in which the image data is displayed overlaps all or part of the area in which the text is displayed.
- The device of claim 1, further comprising an image deletion unit configured to hide the displayed image data when a selection of the area in which the image data is displayed is received through the user interface.
- The device of claim 3, wherein, when the image deletion unit hides the displayed image data, the data communication unit transmits, to the external device, a signal indicating that the image data has been hidden.
- The device of claim 1, wherein the text display unit applies a highlighting effect to the displayed text when the area selected through the user interface is within a predetermined distance from the displayed text.
- The device of claim 1, wherein, when a movement command for the displayed image data is received through the user interface, the location detection unit detects location information of a destination area to which the displayed image data is to be moved, based on the received movement command, and the image display unit moves the displayed image data to an area corresponding to the detected location information of the destination area and displays it there.
- The device of claim 1, further comprising an image deletion unit configured to hide at least one piece of image data previously displayed in the chat area when an image hide command is received from the user interface.
- The device of claim 1, wherein the chat area is divided into at least two cell areas by a dividing line, and the selected area is one cell area selected through the user interface from among the at least two cell areas divided by the dividing line.
- The device of claim 1, wherein the user interface provides a user with a list of a plurality of images sorted sequentially according to the times at which the images were used.
- The device of claim 1, wherein the user interface provides a tool for modifying the image data.
- The device of claim 1, wherein, when the data communication unit receives external image data and external location information from the external device, the image display unit displays the external image data in an area of the chat area corresponding to the external location information.
- The device of claim 11, wherein, when the data communication unit receives the external image data, the image display unit displays, on a display of the device, a notification message indicating that the external image data has been received.
- The device of claim 1, further comprising a payment processing unit configured to perform at least one payment process related to the image data.
- A method of displaying an image in a chat area in which text is displayed, the method comprising: receiving text of an external device via a network; displaying the received text in the chat area; retrieving image data of an image selected through a user interface; detecting location information of an area selected through the user interface within the chat area; displaying the retrieved image data in an area corresponding to the detected location information; and, when the displayed text is moved within the chat area, also moving the displayed image data within the chat area.
- The method of claim 14, further comprising: hiding the displayed image data when a selection of the area in which the image data is displayed is received through the user interface; displaying, on a display of the device, a notification message indicating that external image data has been received; and performing at least one payment process related to the image data.
- A server for managing chat data, the server comprising: a text manager configured to receive text from a first device among a plurality of devices connected via a network and transmit the received text to a second device; an image data manager configured to receive information about image data from the first device and transmit the received information about the image data to the second device; and a location information manager configured to receive location information associated with the image data from the first device and transmit the received location information to the second device, wherein the image data is displayed in an area of a chat area of the second device corresponding to the location information.
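The device behavior recited in claim 1 — image data anchored at a selected location in the chat area that moves together with the displayed text — can be sketched as below. The class and attribute names are illustrative assumptions for this sketch, not from the disclosure.

```python
# Hypothetical sketch of the claim-1 behavior: an image is displayed at a
# selected position in the chat area, and when the displayed text moves
# (e.g. the chat scrolls), the image data moves by the same amount.

class ChatArea:
    def __init__(self):
        self.text_items = []   # list of [y_position, text]
        self.image_items = []  # list of [y_position, image_data]

    def display_text(self, y, text):
        self.text_items.append([y, text])

    def display_image(self, y, image_data):
        # Display image data at the area selected through the user interface.
        self.image_items.append([y, image_data])

    def scroll(self, dy):
        # When the displayed text moves within the chat area,
        # the displayed image data moves with it.
        for item in self.text_items:
            item[0] += dy
        for item in self.image_items:
            item[0] += dy


chat = ChatArea()
chat.display_text(100, "hello")
chat.display_image(100, "heart.png")  # may overlap the text area (cf. claim 2)
chat.scroll(-30)                      # new text arrives; contents shift up
assert chat.text_items[0][0] == 70
assert chat.image_items[0][0] == 70   # image moved together with the text
```

The design point is that the image's position is stored in the same coordinate space as the text, so one scroll operation keeps the two aligned.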
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2015539499A JP6095790B2 (ja) | 2012-10-22 | 2013-10-21 | チャット領域にイメージを表示するデバイス及び方法、そしてチャットデータを管理するサーバ |
US14/437,005 US9847955B2 (en) | 2012-10-22 | 2013-10-21 | Device and method for displaying image in chatting area and server for managing chatting data |
US15/811,833 US10666586B2 (en) | 2012-10-22 | 2017-11-14 | Device and method for displaying image in chatting area and server for managing chatting data |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR10-2012-0117286 | 2012-10-22 | ||
KR1020120117286A KR101390228B1 (ko) | 2012-10-22 | 2012-10-22 | 채팅 영역에 이미지를 표시하는 디바이스 및 방법, 그리고 채팅 데이터를 관리하는 서버 |
Related Child Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/437,005 A-371-Of-International US9847955B2 (en) | 2012-10-22 | 2013-10-21 | Device and method for displaying image in chatting area and server for managing chatting data |
US15/811,833 Continuation US10666586B2 (en) | 2012-10-22 | 2017-11-14 | Device and method for displaying image in chatting area and server for managing chatting data |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2014065551A1 true WO2014065551A1 (ko) | 2014-05-01 |
Family
ID=50544880
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/KR2013/009392 WO2014065551A1 (ko) | 2012-10-22 | 2013-10-21 | 채팅 영역에 이미지를 표시하는 디바이스 및 방법, 그리고 채팅 데이터를 관리하는 서버 |
Country Status (4)
Country | Link |
---|---|
US (2) | US9847955B2 (ko) |
JP (1) | JP6095790B2 (ko) |
KR (1) | KR101390228B1 (ko) |
WO (1) | WO2014065551A1 (ko) |
Families Citing this family (27)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
TWI439960B (zh) | 2010-04-07 | 2014-06-01 | Apple Inc | 虛擬使用者編輯環境 |
KR101390228B1 (ko) * | 2012-10-22 | 2014-05-07 | (주)카카오 | 채팅 영역에 이미지를 표시하는 디바이스 및 방법, 그리고 채팅 데이터를 관리하는 서버 |
KR102057944B1 (ko) * | 2013-09-17 | 2019-12-23 | 삼성전자주식회사 | 단말 장치 및 그의 공유 방법 |
KR102138515B1 (ko) * | 2013-10-01 | 2020-07-28 | 엘지전자 주식회사 | 이동단말기 및 그 제어방법 |
JP6672588B2 (ja) * | 2014-02-19 | 2020-03-25 | 株式会社リコー | 伝送システム、方法、プログラム及びシステム |
KR102387268B1 (ko) * | 2014-09-12 | 2022-04-15 | 에스케이플래닛 주식회사 | 말풍선 효과를 제공하는 메시징 서비스 제공 방법, 이를 위한 프로그램을 기록한 기록 매체 및 단말 |
TWI525462B (zh) * | 2014-12-25 | 2016-03-11 | Yu-Wei Huang | Message graphic display method |
CN105955808B (zh) * | 2016-04-21 | 2018-12-11 | 北京小米移动软件有限公司 | 即时通信应用的任务管理方法及装置 |
CN111176509B (zh) * | 2016-05-18 | 2022-01-25 | 苹果公司 | 在图形消息传送用户界面中应用确认选项 |
US11320982B2 (en) | 2016-05-18 | 2022-05-03 | Apple Inc. | Devices, methods, and graphical user interfaces for messaging |
US10368208B2 (en) | 2016-06-12 | 2019-07-30 | Apple Inc. | Layers in messaging applications |
US10595169B2 (en) * | 2016-06-12 | 2020-03-17 | Apple Inc. | Message extension app store |
AU2017100670C4 (en) | 2016-06-12 | 2019-11-21 | Apple Inc. | User interfaces for retrieving contextually relevant media content |
WO2018032271A1 (zh) * | 2016-08-15 | 2018-02-22 | 北京小米移动软件有限公司 | 信息搜索方法、装置、电子设备及服务器 |
WO2018038277A1 (ko) * | 2016-08-22 | 2018-03-01 | 스노우 주식회사 | 대화방을 통해 각 사용자의 상태를 반영한 화상 데이터를 공유하는 메시지 공유 방법 및 상기 방법을 실행시키기 위한 컴퓨터 프로그램 |
CN117193618A (zh) * | 2016-09-23 | 2023-12-08 | 苹果公司 | 头像创建和编辑 |
US10404640B2 (en) | 2017-07-14 | 2019-09-03 | Casey Golden | Systems and methods for providing online chat-messages with configurable, interactive imagery |
US11722764B2 (en) | 2018-05-07 | 2023-08-08 | Apple Inc. | Creative camera |
DK180212B1 (en) | 2018-05-07 | 2020-08-19 | Apple Inc | USER INTERFACE FOR CREATING AVATAR |
US10375313B1 (en) | 2018-05-07 | 2019-08-06 | Apple Inc. | Creative camera |
US11107261B2 (en) | 2019-01-18 | 2021-08-31 | Apple Inc. | Virtual avatar animation based on facial feature movement |
DK202070625A1 (en) | 2020-05-11 | 2022-01-04 | Apple Inc | User interfaces related to time |
US11921998B2 (en) | 2020-05-11 | 2024-03-05 | Apple Inc. | Editing features of an avatar |
US11657058B2 (en) * | 2020-07-15 | 2023-05-23 | Citrix Systems, Inc. | Systems and methods of enhancing mental health and performance |
CN112163398A (zh) * | 2020-09-30 | 2021-01-01 | 金蝶软件(中国)有限公司 | 一种图表分享方法及其相关设备 |
US11714536B2 (en) | 2021-05-21 | 2023-08-01 | Apple Inc. | Avatar sticker editor user interfaces |
US11776190B2 (en) | 2021-06-04 | 2023-10-03 | Apple Inc. | Techniques for managing an avatar on a lock screen |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20040046272A (ko) * | 2002-11-26 | 2004-06-05 | 엔에이치엔(주) | 사용자 정의 이모티콘 이미지를 이용한 컴퓨터 네트워크상에서의 데이터 전송 서비스 제공방법 및 그를 구현하기위한 응용 프로그램을 기록한 컴퓨터가 읽을 수 있는기록매체 |
KR20070112515A (ko) * | 2006-05-22 | 2007-11-27 | (주)케이티에프테크놀로지스 | 휴대용 단말기의 특수문자 자동 완성 방법 |
KR20110037040A (ko) * | 2009-10-05 | 2011-04-13 | 삼성전자주식회사 | 휴대 단말기 및 그의 화면 표시 방법 |
Family Cites Families (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2001117849A (ja) | 1999-10-18 | 2001-04-27 | Sons Ltd | チャットシステム、チャット支援方法およびそれを記録した媒体 |
JP3930489B2 (ja) * | 2004-03-31 | 2007-06-13 | 株式会社コナミデジタルエンタテインメント | チャットシステム、通信装置、その制御方法及びプログラム |
GB0428049D0 (en) * | 2004-12-22 | 2005-01-26 | Carnall Murat | Improvements to call management in a telecommunications system |
WO2008059594A1 (fr) | 2006-11-17 | 2008-05-22 | Osaka Electro-Communication University | Dispositif de support de composition musicale, et système de support de composition musicale |
US7756536B2 (en) * | 2007-01-31 | 2010-07-13 | Sony Ericsson Mobile Communications Ab | Device and method for providing and displaying animated SMS messages |
US7912289B2 (en) * | 2007-05-01 | 2011-03-22 | Microsoft Corporation | Image text replacement |
US8930463B2 (en) * | 2007-07-09 | 2015-01-06 | Yahoo! Inc. | Super-emoticons |
US20090094555A1 (en) * | 2007-10-05 | 2009-04-09 | Nokia Corporation | Adaptive user interface elements on display devices |
KR101505688B1 (ko) * | 2008-10-23 | 2015-03-24 | 엘지전자 주식회사 | 이동 단말기 및 그 정보 처리 방법 |
US8584031B2 (en) | 2008-11-19 | 2013-11-12 | Apple Inc. | Portable touch screen device, method, and graphical user interface for using emoji characters |
US20100185949A1 (en) * | 2008-12-09 | 2010-07-22 | Denny Jaeger | Method for using gesture objects for computer control |
US9130779B2 (en) * | 2009-06-02 | 2015-09-08 | Qualcomm Incorporated | Method and apparatus for providing enhanced SMS/EMS/MMS |
WO2011085248A1 (en) * | 2010-01-07 | 2011-07-14 | Swakker, Llc | Methods and apparatus for modifying a multimedia object within an instant messaging session at a mobile communication device |
KR20130009392A (ko) | 2011-07-15 | 2013-01-23 | 삼성전자주식회사 | 컨트롤러의 동작 방법과 상기 컨트롤러를 포함하는 메모리 시스템 |
US9495661B2 (en) * | 2011-09-26 | 2016-11-15 | Sparxo, Inc | Embeddable context sensitive chat system |
US9152219B2 (en) * | 2012-06-18 | 2015-10-06 | Microsoft Technology Licensing, Llc | Creation and context-aware presentation of customized emoticon item sets |
KR101390228B1 (ko) * | 2012-10-22 | 2014-05-07 | (주)카카오 | 채팅 영역에 이미지를 표시하는 디바이스 및 방법, 그리고 채팅 데이터를 관리하는 서버 |
2012
- 2012-10-22 KR KR1020120117286A patent/KR101390228B1/ko active IP Right Grant

2013
- 2013-10-21 JP JP2015539499A patent/JP6095790B2/ja active Active
- 2013-10-21 US US14/437,005 patent/US9847955B2/en active Active
- 2013-10-21 WO PCT/KR2013/009392 patent/WO2014065551A1/ko active Application Filing

2017
- 2017-11-14 US US15/811,833 patent/US10666586B2/en active Active
Non-Patent Citations (1)
Title |
---|
"[Android / messenger / SNS] line (Line) - Naver created a new smart phone messenger", NAVER BLOG, 22 January 2012 (2012-01-22) * |
Also Published As
Publication number | Publication date |
---|---|
US9847955B2 (en) | 2017-12-19 |
JP2016502174A (ja) | 2016-01-21 |
JP6095790B2 (ja) | 2017-03-15 |
US20150281145A1 (en) | 2015-10-01 |
US10666586B2 (en) | 2020-05-26 |
US20180069814A1 (en) | 2018-03-08 |
KR101390228B1 (ko) | 2014-05-07 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2014065551A1 (ko) | 채팅 영역에 이미지를 표시하는 디바이스 및 방법, 그리고 채팅 데이터를 관리하는 서버 | |
WO2020067585A1 (ko) | 메신저 어플리케이션과 연관된 대화방을 디스플레이하는 방법 및 장치 | |
CN109725975B (zh) | 消息被读状态的提示方法、装置和电子设备 | |
US10153995B2 (en) | Method and apparatus for effecting web page access in a plurality of media applications | |
WO2017135797A2 (en) | Method and electronic device for managing operation of applications | |
WO2013168885A1 (ko) | 락 스크린의 제공 방법 및 그 제공 방법이 구현된 단말 장치 | |
WO2013032236A2 (en) | Method and apparatus for managing schedules in a portable terminal | |
WO2016148525A1 (ko) | 이모티콘 탐색 방법 및 단말 | |
WO2019117642A1 (ko) | 사회적 네트워크 관리를 지원하는 서버 및 사용자 단말 | |
WO2015183045A1 (ko) | 채팅 메시지를 태깅하는 방법 및 디바이스 | |
WO2016129811A1 (ko) | 인스턴트 메시징 서비스에서 공식계정의 리치 메뉴를 제공하는 방법과 시스템 및 기록 매체 | |
CN113591439B (zh) | 一种信息交互方法、装置、电子设备及存储介质 | |
WO2015133777A1 (ko) | 소셜 네트워크 서비스 제공 방법 및 장치 | |
CN114491349B (zh) | 页面显示方法、装置、电子设备、存储介质和程序产品 | |
CN111183441A (zh) | 信息处理方法、信息处理装置及信息处理程序 | |
CN111651110A (zh) | 群聊消息显示方法、装置、电子设备及存储介质 | |
CN113918055A (zh) | 消息处理方法、装置和电子设备 | |
US20170351650A1 (en) | Digital conversation annotation | |
CN111897475A (zh) | 消息查看方法及装置 | |
WO2017175950A1 (ko) | 사회적 네트워크 관리를 지원하는 서버 및 사용자 단말 | |
CN115022272A (zh) | 信息处理方法、装置、电子设备和存储介质 | |
WO2019107799A1 (ko) | 입력 필드의 이동 방법 및 장치 | |
CN113885746A (zh) | 消息发送方法、装置及电子设备 | |
WO2014171613A1 (ko) | 메시징 서비스 제공 방법, 이를 위한 프로그램을 기록한 기록 매체 및 단말 | |
CN111368329A (zh) | 消息展示方法、装置、电子设备及存储介质 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 13849932 Country of ref document: EP Kind code of ref document: A1 |
WWE | Wipo information: entry into national phase |
Ref document number: 14437005 Country of ref document: US |
ENP | Entry into the national phase |
Ref document number: 2015539499 Country of ref document: JP Kind code of ref document: A |
NENP | Non-entry into the national phase |
Ref country code: DE |
122 | Ep: pct application non-entry in european phase |
Ref document number: 13849932 Country of ref document: EP Kind code of ref document: A1 |