WO2008001961A1 - Mobile animation message service method and system and terminal - Google Patents

Mobile animation message service method and system and terminal Download PDF

Info

Publication number
WO2008001961A1
Authority
WO
WIPO (PCT)
Prior art keywords
animation
files
file
lip
message service
Prior art date
Application number
PCT/KR2006/002470
Other languages
French (fr)
Inventor
Tae-Sik Kim
Original Assignee
Keimyung University Industry-Academic Cooperation Foundation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Keimyung University Industry-Academic Cooperation Foundation
Publication of WO2008001961A1 publication Critical patent/WO2008001961A1/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F 16/40 Information retrieval; Database structures therefor; File system structures therefor of multimedia data, e.g. slideshows comprising image and additional audio data
    • G06F 16/43 Querying
    • G06F 16/438 Presentation of query results
    • G06F 16/4387 Presentation of query results by the use of playlists
    • G06F 16/4393 Multimedia presentations, e.g. slide shows, multimedia albums
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M 1/00 Substation equipment, e.g. for use by subscribers
    • H04M 1/72 Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M 1/724 User interfaces specially adapted for cordless or mobile telephones
    • H04M 1/72403 User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
    • H04M 1/72427 User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality for supporting games or graphical animations
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M 1/00 Substation equipment, e.g. for use by subscribers
    • H04M 1/72 Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M 1/724 User interfaces specially adapted for cordless or mobile telephones
    • H04M 1/72403 User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
    • H04M 1/7243 User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality with interactive means for internal management of messages
    • G PHYSICS
    • G10 MUSICAL INSTRUMENTS; ACOUSTICS
    • G10L SPEECH ANALYSIS TECHNIQUES OR SPEECH SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING TECHNIQUES; SPEECH OR AUDIO CODING OR DECODING
    • G10L 21/00 Speech or voice signal processing techniques to produce another audible or non-audible signal, e.g. visual or tactile, in order to modify its quality or its intelligibility
    • G10L 21/06 Transformation of speech into a non-audible representation, e.g. speech visualisation or speech processing for tactile aids
    • G10L 21/10 Transforming into visible information
    • G10L 2021/105 Synthesis of the lips movements from speech, e.g. for talking heads
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M 2250/00 Details of telephonic subscriber devices
    • H04M 2250/64 Details of telephonic subscriber devices file transfer between terminals

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Theoretical Computer Science (AREA)
  • General Business, Economics & Management (AREA)
  • Business, Economics & Management (AREA)
  • Data Mining & Analysis (AREA)
  • Databases & Information Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Processing Or Creating Images (AREA)

Abstract

An object of the invention is to provide an animation message service method, system and terminal which is capable of greatly reducing the amount of data for an animation message service by converting a 3D animation file into a flash file. The animation message service system includes: a database that stores a plurality of animation files; and a server that converts a plurality of 3D animation files into flash files as the plurality of animation files, stores the plurality of animation files in the database, and transmits one selected from the plurality of animation files to a specified client terminal. The server stores lip position information of a character included in the animation file in the database, stores a plurality of lip-sync files constituted by voice data corresponding to a plurality of announcements and lip motion data in the database, and transmits one selected from the plurality of lip-sync files, along with the animation file, to the client terminal.

Description

DESCRIPTION
MOBILE ANIMATION MESSAGE SERVICE METHOD AND SYSTEM AND TERMINAL
Technical Field
The present invention relates to a message service method, and more particularly, to an animation message service method, system and terminal.
Background Art
With the development of technologies, a variety of messaging services have been proposed for personal portable terminals such as mobile communication terminals.
Such messaging services include SMS (Short Message Service), MMS (Multimedia Messaging Service), an animation message service, and other services known in the art.
In particular, the animation message service, which is of a 2D image card type, has a narrow market range in the field of mobile contents services due to its limited representation.
To overcome this problem, a message service using 3D animation has been proposed. However, the 3D animation message service has not come into wide use because of the enormous amount of data involved and, consequently, the high service fees.
Thus, there is a keen need for technology that can greatly reduce the amount of data for the 3D animation message service.
In addition, in the conventional animation message service, a service provider transmits a created animation file to client terminals one-sidedly, so clients cannot receive animation message services that suit their diverse tastes. Accordingly, there is also a keen need for technology that can edit an animation file to suit clients' diverse tastes at their request.
Disclosure of Invention
Technical Problem
The present invention has been designed to overcome the above and other problems, and it is an object of the invention to provide an animation message service method, system and terminal which is capable of greatly reducing the amount of data for an animation message service by converting a 3D animation file into a flash file.
It is another object of the invention to provide an animation message service method, system and terminal which is capable of modifying part of the animation, or of synchronizing an audio file or a voice file with the animation.
Technical Solution
To achieve the above and other objects and address the above and other problems occurring in the prior art, the present invention provides an animation message service system including: a database that stores a plurality of animation files; and a server that converts a plurality of 3D animation files into flash files as the plurality of animation files, stores the plurality of animation files in the database, and transmits one selected from the plurality of animation files to a specified client terminal.
Preferably, the server stores lip position information of a character included in the animation file in the database, stores a plurality of lip-sync files constituted by voice data corresponding to a plurality of announcements and lip motion data in the database, and transmits one selected from the plurality of lip-sync files, along with the animation file, to the client terminal. Preferably, the server stores a plurality of audio files in the database, and transmits one selected from the plurality of audio files, along with the animation file, to the client terminal.
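For illustration only, the following Python sketch outlines the server-side store-and-transmit role described above; the class and method names and the data layout are assumptions of this sketch and are not an API defined in the patent.

    # Hypothetical sketch of the server-side storage and transmission roles described
    # above; names and data layout are illustrative assumptions, not the patent's API.
    from dataclasses import dataclass, field

    @dataclass
    class AnimationStore:
        animations: dict = field(default_factory=dict)       # id -> flash file + lip position info
        lip_sync_files: dict = field(default_factory=dict)   # id -> voice data + lip motion data
        audio_files: dict = field(default_factory=dict)      # id -> audio data

        def store_animation(self, anim_id, flash_bytes, lip_positions):
            # The 3D source is assumed to have been converted to a flash file beforehand.
            self.animations[anim_id] = {"flash": flash_bytes, "lip_positions": lip_positions}

        def select_for_client(self, anim_id, lip_sync_id=None, audio_id=None):
            # Gather the files that would be transmitted to the specified client terminal.
            selection = {"animation": self.animations[anim_id]}
            if lip_sync_id is not None:
                selection["lip_sync"] = self.lip_sync_files[lip_sync_id]
            if audio_id is not None:
                selection["audio"] = self.audio_files[audio_id]
            return selection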
According to an animation message service method, system and terminal of the present invention, the amount of data for an animation message service can be greatly reduced by converting a 3D animation file into a flash file.
In addition, lip-sync is made possible by modifying part of the animation, namely the lip region, at a user's request and synchronizing voice data with the animation.
Furthermore, a variety of demands of users can be satisfied by synchronizing an audio file with the animation at users' requests.
Advantageous Effects
The present invention provides an animation message service method, system and terminal, which is capable of greatly reducing the amount of data for an animation message service by converting a 3D animation file into a flash file.
In addition, the present invention provides an animation message service method, system and terminal capable of making lip-sync possible by modifying part of the animation, namely the lip region, at a user's request and synchronizing voice data with the animation, and of satisfying a variety of user demands by synchronizing an audio file with the animation at users' requests.
Brief Description of the Drawings
FIG. 1 is a view illustrating a configuration of an animation message service system according to a preferred embodiment of the present invention.
FIG. 2 is a view illustrating structures of an animation file, a lip-sync file, and a final animation file according to a preferred embodiment of the present invention.
FIG. 3 is a procedural diagram illustrating an animation message service method according to a preferred embodiment of the present invention.
FIG. 4 is a view illustrating a configuration of a terminal according to a preferred embodiment of the present invention.
FIG. 5 is a flow chart illustrating a method of reproducing a final animation file according to a preferred embodiment of the present invention.
Mode for Carrying Out the Invention
Hereinafter, a configuration of an animation message service system according to a preferred embodiment of the present invention will be described with reference to FIG. 1.
A Web server 100 provides a Web site for selection or editing of an animation message of the invention to various kinds of terminals connected through a wired/wireless network. In addition, the Web server 100 downloads the animation message to a particular terminal.
A database 102 stores various files for the animation message service according to the preferred embodiment of the present invention. The files include an animation file, an audio file, a lip-sync file, an animation message, etc., as shown in (a) to (d) of FIG. 2. The animation file is constituted by an animation file generated by converting a 3D animation file into a flash file and lip position information of a character belonging to the animation, as shown in (a) of FIG. 2. Here, the lip position information refers to the position information to be modified for lip-sync. The lip-sync file is constituted by an announcement of a text data type, voice data corresponding to the announcement, and lip motion data corresponding to the announcement, as shown in (c) of FIG. 2. In addition, the animation message combined by a client is constituted by some or all of an animation file, lip position information, an audio file and a lip-sync file, depending on the client's selection.
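As a rough illustration of the file structures of FIG. 2 described above, they could be modeled as follows; the field names, and the assumption that (b) of FIG. 2 is the audio file, belong to this sketch and are not specified in the patent.

    # Illustrative models of the FIG. 2 structures; field names are assumed.
    from dataclasses import dataclass
    from typing import List, Optional, Tuple

    @dataclass
    class AnimationFile:                 # FIG. 2(a): flash file plus lip position information
        flash_data: bytes
        lip_positions: List[Tuple[int, int]]   # image positions to be modified for lip-sync

    @dataclass
    class LipSyncFile:                   # FIG. 2(c): announcement text, voice data, lip motion data
        announcement: str
        voice_data: bytes
        lip_motion_data: bytes

    @dataclass
    class AnimationMessage:              # FIG. 2(d): combination chosen by the client
        animation: AnimationFile
        lip_sync: Optional[LipSyncFile] = None
        audio: Optional[bytes] = None    # presumably FIG. 2(b): background or effect sound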
A manager terminal 104 generates 3D animation using software such as 3D MAX or Maya at a manager's request, converts the 3D animation into a flash file as an animation file, and stores the animation file in the database 102 through the Web server 100.
In addition, the manager terminal 104 adds the lip position information of the character belonging to the animation file to the animation file, in order to perform the lip-sync function of the present invention.
In addition, the manager terminal 104 generates an audio file at a client's request and stores the audio file in the database 102 through the Web server 100.
Various types of client terminals T1 to T3, such as mobile communication terminals or personal computers, access a Web site of the Web server 100 through a wired/wireless network, organize an animation message using files selected from the various animation files, audio files and lip-sync files stored in the database 102, and request transmission of the animation message. The transmission of the animation message may be requested to the requesting terminal itself or to other terminals.
Now, an operation of the animation message service system as constructed above will be described with reference to the procedural diagram of FIG. 3.
The manager terminal 104 generates a 3D animation file using software such as 3D MAX or Maya at a service provider's request (Step 200) and converts the 3D animation file into a flash file (Step 202). Thereafter, the manager terminal 104 adds lip position information of a character included in the animation file to the animation file converted into the flash file and transmits the animation file, along with the lip position information, to the Web server 100 (Step 204).
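A minimal sketch of the manager-side pipeline of Steps 200 to 204 follows; every function name here is a placeholder of this sketch, since the patent only states that software such as 3D MAX or Maya produces the 3D animation and that it is then converted into a flash file.

    # Rough sketch of Steps 200-204; all function names are placeholders.
    def prepare_animation(source_3d_path, lip_positions, server_url):
        flash_bytes = convert_to_flash(source_3d_path)   # Step 202: 3D animation file -> flash file
        payload = {
            "flash": flash_bytes,
            "lip_positions": lip_positions,              # Step 204: add lip position information
        }
        upload(server_url, payload)                      # Step 204: transmit to the Web server 100

    def convert_to_flash(source_3d_path):
        raise NotImplementedError    # placeholder for an export/conversion tool

    def upload(server_url, payload):
        raise NotImplementedError    # placeholder for the actual network transfer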
The Web server 100 that received the animation file to which the lip position information is added stores the animation file in the database 102 (Step 206).
The Steps 200 to 206 are repeated to store a plurality of animation files corresponding to various events in the database 102.
In addition, the manager terminal 104 generates an audio file, which can be used as a background sound or an effect sound, at a service provider's request and provides the generated audio file to the Web server 100 (Step 208).
The Web server 100 that received the audio file stores the audio file in the database 102 (Step 210).
The Steps 208 and 210 are repeated to store a plurality of audio files corresponding to various events in the database 102.
In addition, the manager terminal 104 generates a lip-sync file, which is constituted by an announcement, voice data corresponding to the announcement, and lip motion data according to the voice data, at a service provider's request, and provides the generated lip-sync file to the Web server 100 (Step 212).
The Web server 100 that received the lip-sync file stores the lip-sync file in the database 102 (Step 214).
The Steps 212 and 214 are repeated to store a plurality of lip-sync files corresponding to various announcements in the database 102.
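Purely as an assumption-laden sketch, a lip-sync file record of Steps 212 and 214 might be assembled as below; the voice synthesis and lip motion extraction steps are this sketch's assumptions, since the patent only states that the voice data and lip motion data correspond to the announcement.

    # Hypothetical assembly of a lip-sync file (announcement, voice data, lip motion data).
    def build_lip_sync_file(announcement: str) -> dict:
        voice_data = synthesize_voice(announcement)      # assumed text-to-speech step
        lip_motion = derive_lip_motion(voice_data)       # assumed analysis step
        return {
            "announcement": announcement,
            "voice_data": voice_data,
            "lip_motion_data": lip_motion,
        }

    def synthesize_voice(text: str) -> bytes:
        raise NotImplementedError    # placeholder

    def derive_lip_motion(voice: bytes) -> list:
        raise NotImplementedError    # placeholder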
When one (for example, T1) of the plurality of client terminals T1 to T3 accesses the Web server 100 through the wired/wireless network at a client's request, the Web server 100 shows the animation file list, the audio file list and the lip-sync file list stored in the database 102. The client then selects one or more of an animation file, an audio file and a lip-sync file from the shown lists and requests the Web server 100 to organize an animation message (Step 218).
According to the request of the client, the Web server 100 reads one or more of the animation file, the audio file and the lip-sync file, which are selected by the client, from the database 102 and organizes the animation message based on the read file(s). Thereafter, the Web server 100 transmits the organized animation message to the client terminal T1 or a client terminal requested by the client. The animation message is transmitted to the terminal through the wired/wireless network or other mobile communication systems (not shown).
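Continuing in the same illustrative vein, the organization and transmission of the animation message after Step 218 could look roughly like this; the selection dictionary and send_to_terminal are placeholders assumed by the sketch, not elements named in the patent.

    # Rough sketch of organizing and transmitting the animation message; names are assumed.
    def organize_and_send(animations, lip_sync_files, audio_files, selection, terminal_address):
        # 'selection' is assumed to hold the identifiers the client picked from the shown lists.
        message = {"animation": animations[selection["animation_id"]]}
        if "lip_sync_id" in selection:
            message["lip_sync"] = lip_sync_files[selection["lip_sync_id"]]
        if "audio_id" in selection:
            message["audio"] = audio_files[selection["audio_id"]]
        send_to_terminal(terminal_address, message)      # delivery over the wired/wireless network

    def send_to_terminal(address, message):
        raise NotImplementedError    # placeholder for the actual transfer to the client terminal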
The client terminal T1 that received the animation message synchronizes and reproduces the animation file, the audio file and the lip-sync file included in the animation message.
Now, a method of reproducing the animation message will be described in more detail.
Before describing the reproducing method, a general configuration of the client terminal T1 will first be described with reference to FIG. 4.
The client terminal T1 includes a controller 300, a memory unit 302, a communication module 304, an operating panel 306, an audio signal processor 308, a speaker 310, a video signal processor 312 and a display device 314. The controller 300 controls the entire operation of the client terminal T1, including reproducing an animation message according to a preferred embodiment of the present invention.
The memory unit 302 stores a variety of information including process programs of the controller 300 and, particularly, an animation message received from the Web server 100.
The communication module 304 communicates with the Web server 100 through a wireless network, such as WiBro, or other mobile communication systems (not shown).
The operating panel 306 interfaces between a user and the controller 300 and provides various kinds of information related to selection by the user to the controller 300. The audio signal processor 308 processes an audio signal under control of the controller 300 and outputs the processed audio signal through the speaker 310.
The video signal processor 312 processes a video signal under control of the controller 300 and outputs the processed video signal through the display device 314.
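Purely as an illustration of the component layout of FIG. 4, the terminal could be modeled as the following composition; the class and attribute names are assumptions of this sketch.

    # Illustrative composition of the FIG. 4 client terminal; names are assumed.
    class ClientTerminal:
        def __init__(self, controller, memory_unit, communication_module,
                     operating_panel, audio_processor, speaker,
                     video_processor, display):
            self.controller = controller                        # 300: overall control and playback
            self.memory_unit = memory_unit                      # 302: programs and received messages
            self.communication_module = communication_module    # 304: link to the Web server 100
            self.operating_panel = operating_panel              # 306: user input to the controller
            self.audio_processor = audio_processor              # 308: audio output via speaker 310
            self.speaker = speaker
            self.video_processor = video_processor              # 312: video output via display 314
            self.display = display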
Now, the method of reproducing the animation message in the client terminal T1 as constructed above will be described with reference to FIG. 5.
The controller 300 checks whether an instruction to reproduce the animation message stored in the memory unit 302 has been input by a user through the operating panel 306, or whether a receipt event of an animation message has occurred through the communication module 304 (Step 400).
If a reproduction instruction from the user or a receipt event of the animation message occurs, the controller 300 reads the animation message from the memory unit 302 and checks whether a lip-sync file and an audio file are included in the read animation message (Step 404).
If both the lip-sync file and the audio file are included in the animation message, the controller 300 synchronizes and reproduces the animation file, the voice data of the lip-sync file, and the audio file included in the animation message, and modifies the image portion corresponding to the lip position information added to the animation file, based on the lip motion information included in the lip-sync file (Step 406). As a result, the lip of a character included in the animation message moves in correspondence with the voice data. Thus, 3D animation is reproduced, the background sound or effect sound and the voice data are output in synchronization with the 3D animation, and the character's lip moves, according to the voice data, as if the character were really speaking.
On the other hand, if only the lip-sync file is included in the animation message (Step 408), the controller 300 synchronizes and reproduces the animation file and the voice data of the lip-sync file included in the animation message, and modifies the image portion corresponding to the lip position information added to the animation file, based on the lip motion information included in the lip-sync file (Step 410). Thus, 3D animation is reproduced, the voice data are output in synchronization with the 3D animation, and the character's lip moves, according to the voice data, as if the character were really speaking.
In addition, if only the audio file is included in the animation message, the controller 300 synchronizes and reproduces only the animation file and the audio file (Step 412).
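The playback decision of Steps 400 to 412 could be summarized by the following sketch; the message layout and helper functions are assumptions of this sketch rather than the patent's implementation, and cases not described above are simply not handled.

    # Illustrative playback decision logic for Steps 400-412; names are assumed.
    def reproduce_animation_message(message: dict):
        has_lip_sync = "lip_sync" in message
        has_audio = "audio" in message

        if has_lip_sync and has_audio:                               # Step 406
            play_synchronized(message["animation"],
                              message["lip_sync"]["voice_data"], message["audio"])
            apply_lip_motion(message["animation"], message["lip_sync"])
        elif has_lip_sync:                                           # Steps 408-410
            play_synchronized(message["animation"],
                              message["lip_sync"]["voice_data"], None)
            apply_lip_motion(message["animation"], message["lip_sync"])
        elif has_audio:                                              # Step 412
            play_synchronized(message["animation"], None, message["audio"])

    def apply_lip_motion(animation, lip_sync):
        # For each lip motion frame, modify the image region given by the animation's
        # lip position information so that the character's lip tracks the voice data.
        for motion_frame in lip_sync["lip_motion_data"]:
            region = animation["lip_positions"]     # where in the image to modify
            _ = (region, motion_frame)              # placeholder for the actual image modification

    def play_synchronized(animation, voice_data, audio_data):
        raise NotImplementedError    # placeholder for synchronized audio/video reproduction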
Industrial Applicability
While the invention has been shown and described with reference to certain embodiments and drawings, it will be understood by those skilled in the art that various changes in form and details may be made. Thus, it is intended that the present invention covers the modifications and variations of this invention provided they come within the scope of the appended claims and their equivalents.

Claims

1. An animation message service method comprising the steps of: converting a plurality of 3D animation files into flash files, respectively; storing the flash files as animation files in a database; and transmitting one selected from the animation files stored in the database to a specified client terminal.
2. The animation message service method of claim 1, wherein lip position information of a character is included in the animation files, and wherein a plurality of lip-sync files, each being constituted by voice data corresponding to a plurality of announcements and lip motion data, are stored in the database, further comprising the step of transmitting one selected from the plurality of lip-sync files, along with the animation files, to the client terminal.
3. The animation message service method of claim 1, wherein a plurality of audio files are stored in the database, further comprising the step of transmitting one selected from the plurality of audio files, along with the animation files, to the client terminal.
4. An animation message reproducing method comprising the steps of: checking whether it is requested to reproduce an animation message organized by an animation file and lip position information of a character, voice data and lip motion data, which are included in the animation file; and synchronizing and reproducing the animation file and the voice file and modifying an image corresponding to the lip position information based on the lip motion data.
5. The animation message reproducing method of claim 4, wherein an audio file is included in the animation message, and wherein the audio file is reproduced in synchronization with the animation file.
6. An animation message service system comprising: a database that stores a plurality of animation files; and a server that converts a plurality of 3D animation files into flash files as the plurality of animation files, stores the plurality of animation files in the database, and transmits one selected from the plurality of animation files to a specified client terminal.
7. The animation message service system of claim 6, wherein the server stores lip position information of a character included in the animation file in the database, stores a plurality of lip-sync files constituted by voice data corresponding to a plurality of announcements and lip motion data in the database, and transmits one selected from the plurality of lip-sync files, along with the animation file, to the client terminal.
8. The animation message service system of claim 6, wherein the server stores a plurality of audio files in the database, and transmits one selected from the plurality of audio files, along with the animation file, to the client terminal.
9. A client terminal for an animation message service, comprising: a communication module that communicates with an animation message service system which stores a plurality of animation files and a plurality of lip-sync files constituted by lip position information of a character, voice data and lip motion data, which are included in the animation files; an operating panel that provides an interface with a user; and a controller that requests the animation message service system to transmit one of the animation files, lip position information included in the animation file, and selection information of one of the lip-sync files to a specified client terminal.
10. The client terminal of claim 9, wherein the animation message service system further stores a plurality of audio files, and wherein the controller requests the animation message service system to transmit one of the audio files, along with the animation file, to the specified client terminal.
11. A client terminal for an animation message service, comprising: a memory unit that stores an animation message organized by an animation file and lip position information of a character, voice data and lip motion data, which are included in the animation file; and a controller that synchronizes and reproduces the animation file and the voice file and modifies an image corresponding to the lip position information based on the lip motion data.
12. The client terminal of claim 11, wherein an audio file is further included in the animation message, and wherein the controller reproduces the audio file in synchronization with the animation file.
PCT/KR2006/002470 2006-06-26 2006-06-26 Mobile animation message service method and system and terminal WO2008001961A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR10-2006-0057332 2006-06-26
KR1020060057332A KR100795357B1 (en) 2006-06-26 2006-06-26 Mobile animation message service method and system and terminal

Publications (1)

Publication Number Publication Date
WO2008001961A1 true WO2008001961A1 (en) 2008-01-03

Family

ID=38845710

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2006/002470 WO2008001961A1 (en) 2006-06-26 2006-06-26 Mobile animation message service method and system and terminal

Country Status (3)

Country Link
KR (1) KR100795357B1 (en)
CN (1) CN101341767A (en)
WO (1) WO2008001961A1 (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2010000175A1 (en) * 2008-06-30 2010-01-07 腾讯科技(深圳)有限公司 Method and system for dynamic controlling flash component
CN102063272A (en) * 2010-12-28 2011-05-18 东莞宇龙通信科技有限公司 Data processing method and mobile terminal
CN103516793A (en) * 2013-09-18 2014-01-15 广东欧珀移动通信有限公司 Reifying service providing system based on mobile communication and reifying service providing device for mobile terminal
WO2014118498A1 (en) * 2013-02-04 2014-08-07 Headcastlab Limited Conveying audio messages to mobile display devices

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110211206B (en) * 2019-06-05 2022-12-13 深圳市元人动画有限公司 Computer animation manufacturing system
KR102136793B1 (en) * 2019-07-29 2020-07-22 에스케이텔레콤 주식회사 Method for communicating using image in messenger, apparatus and system for the same
KR102563348B1 (en) * 2021-07-22 2023-08-04 주식회사 마음에이아이 Apparatus, method and computer program for providing lip-sync images and apparatus, method and computer program for displaying lip-sync images

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040061667A1 (en) * 2002-09-27 2004-04-01 Tetsuya Sawano Image sending apparatus
KR20050077907A (en) * 2004-01-29 2005-08-04 주식회사 이둘 Apparatus for providing multimedia data by using flash and method thereof
US20060092154A1 (en) * 2004-11-01 2006-05-04 Samsung Electronics Co., Ltd. Apparatus and method for providing a 3D animation file reflecting a user's personality in a mobile communication terminal

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100676270B1 (en) * 2004-06-17 2007-02-02 (주) 엘지텔레콤 Terminal equipment for implementing the motion of a three dimensional character and the method for servicing the same
KR100733772B1 (en) * 2005-08-12 2007-07-02 주식회사 인프라밸리 Method and system for providing lip-sync service for mobile communication subscriber

Also Published As

Publication number Publication date
KR20080000073A (en) 2008-01-02
KR100795357B1 (en) 2008-01-17
CN101341767A (en) 2009-01-07

Similar Documents

Publication Publication Date Title
US10057731B2 (en) Image and message integration system and method
EP1143679B1 (en) A conversational portal for providing conversational browsing and multimedia broadcast on demand
US20050266884A1 (en) Methods and systems for conducting remote communications
JP4439920B2 (en) System and method for simultaneous multimodal communication session persistence
US8660038B1 (en) Previewing voicemails using mobile devices
JP5467031B2 (en) Method and system for producing and transmitting multimedia content
US20040243688A1 (en) Inbox caching of messages on a mobile terminal
US20080090553A1 (en) Dynamic video messaging
KR100803580B1 (en) Electronic music distribution service system and method using synchronous multimedia integration language format
JP2005519363A (en) Simultaneous multimodal communication system and method
WO2008001961A1 (en) Mobile animation message service method and system and terminal
JP2010530572A (en) Host that directly controls PDA applications connected to the interface
JP2001211443A (en) Information distribution system
JP2005527020A (en) Simultaneous multimodal communication system and method using simultaneous multimodal tags
US20100151888A1 (en) Method and system for transmitting and receiving multimedia message
WO2015050966A1 (en) Image and message integration system and method
CN102065340B (en) System and method for implementing multimedia synchronous interaction
CN110418181B (en) Service processing method and device for smart television, smart device and storage medium
US20040110492A1 (en) Method and mobile communication system for transmitting and receiving multimedia messages
CN102724639A (en) Multimedia message manufacture method, device thereof and system thereof
WO2008001962A1 (en) Music video service method and system and terminal
CN101500134A (en) Method and system for accessing applications
KR100833291B1 (en) System for service instant messing and thereof Method
KR100702386B1 (en) System for providing personalized multimedia mail and method thereof
CN113159752A (en) Method and device for generating account transfer transaction certificate

Legal Events

Date Code Title Description
WWE Wipo information: entry into national phase

Ref document number: 200680001614.5

Country of ref document: CN

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 06769047

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

NENP Non-entry into the national phase

Ref country code: RU

122 Ep: pct application non-entry in european phase

Ref document number: 06769047

Country of ref document: EP

Kind code of ref document: A1