US20120289262A1 - Method for providing visual effect messages and associated communication system and transmitting end - Google Patents
- Publication number
- US20120289262A1 (U.S. application Ser. No. 13/450,621)
- Authority
- US
- United States
- Prior art keywords
- visual effect
- information
- positions
- transmitting end
- communication system
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M1/00—Substation equipment, e.g. for use by subscribers
- H04M1/72—Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
- H04M1/724—User interfaces specially adapted for cordless or mobile telephones
- H04M1/72403—User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
- H04M1/7243—User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality with interactive means for internal management of messages
- H04M1/72439—User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality with interactive means for internal management of messages for image or video messaging
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M1/00—Substation equipment, e.g. for use by subscribers
- H04M1/72—Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
- H04M1/724—User interfaces specially adapted for cordless or mobile telephones
- H04M1/72403—User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
- H04M1/72427—User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality for supporting games or graphical animations
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W4/00—Services specially adapted for wireless communication networks; Facilities therefor
- H04W4/12—Messaging; Mailboxes; Announcements
Definitions
- the terminal 12 b may automatically adjust the angle of the visual effect message to dynamically display the visual effect message.
- the modules in the terminals 12a and 12b may be realized by software, firmware, hardware, or any combination thereof.
- the display module 28 is a hardware accelerating circuit with 3D graphic processing capabilities to quickly process the particle system visual effect in real time.
- the terminal 12a may also be provided with the decoding module 26 and the display module 28
- the terminal 12 b may also be provided with the input module 14 , the visual effect editing module 16 and the encoding module 18 , so that the terminal 12 b may also transmit visual effect messages to the terminal 12 a.
- the terminals 12a and 12b, both supporting the same communication protocol, are mobile phones, portable computers, PDAs, digital cameras, digital camcorders, or digital frames.
- a touch screen serving as a message input interface comprises a plurality of pixels, and may have a 640×480, 1024×768, or 1280×1024 pixel arrangement according to the resolution of the touch screen, although any resolution and type of display and input device is likewise suitable.
- a predetermined pixel may be determined as a point on a coordinate axis to define a 2D coordinate plane.
- the coordinate corresponding to the touched position is stored as an initial position of the particle system.
- the coordinates may be defined with different spawning rates, initial velocity vectors, particle graphic patterns and lifespans, so as to present visual effects including glittering fireworks, smoke, snowflakes, stardust, clouds and fogs, fireflies, and bubbles according to the aforementioned parameters.
- the aforementioned parameters may be defined as particle parameters of the particles.
- the particle system sequentially updates particle conditions displayed by each frame according to a frame update rate.
- a corresponding display step may be broadly divided into a simulation stage and a display stage.
- in the simulation stage, a coordinate of at least one particle is defined, to which the corresponding type, shape, color, size, density, brightness, distribution range, lifespan, and fuzzy parameter are assigned according to the parameters of the coordinate.
- all existing particles are checked to determine whether their lifespans are fulfilled or exceeded. For example, when a particle's lifespan is fulfilled or exceeded, the particle and its related parameters are removed, so that the particle is no longer displayed on the screen during the display stage.
- collisions and interactions between the particles, and between the particles and the background, are also processed in the simulation stage. The operations for the particles to be displayed in the next frame are therefore completed and stored in the simulation stage, so that the particles can be drawn on the screen in the display stage.
- in the display stage, the colors and graphic textures corresponding to the particles with completed condition simulation are displayed on the screen via a graphic function database (e.g., an OpenGL function database).
- the text or graphics written by the user via the touch screen are recorded and converted to particle system parameters, and are presented on the screen as a particle system visual effect.
- the particle parameters corresponding to the touch position coordinate are also recorded during the process to realize the foregoing visual effect positions and visual effect information.
- a pixel is regarded as an initial position of a particle in this embodiment; however, since a user's finger generally covers a larger range than a single pixel on the touch screen, a plurality of pixels touched by the same finger (i.e., pixels touched by the user within a predetermined time period) are regarded as the same initial position. More specifically, a plurality of coordinates are defined as one emitter, realizing a similar visual effect while reducing system resource consumption. Further, although a 2D coordinate plane is implemented for display, the operations of the simulation stage are performed on a 3D coordinate plane for the particle system and then projected onto the 2D coordinate plane. To further reduce system resource consumption, the particle system may also perform the particle operations directly on a 2D coordinate plane.
- the present disclosure provides visual effects and interactivity to messages, so that messages offer better flexibility, vividness, and diversity to add amusement to the information world.
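The two-stage per-frame update described above (a simulation stage that spawns particles at the emitters, integrates their motion, and culls expired ones, followed by a display stage that draws the survivors) can be sketched as follows. This is an illustrative approximation, not the patented implementation: the dictionary layout, the parameter names, and the way the fuzzy parameter jitters the lifespan are assumptions, and a real display stage would draw textured points via a graphics library rather than return coordinates.

```python
import random

# Minimal per-frame particle update split into the simulation stage and
# the display stage, as described in the text. All names are illustrative.

def spawn(emitter, rate, lifespan, fuzz, velocity):
    """Spawn `rate` particles at an emitter; the fuzzy parameter widens
    each lifespan into the range lifespan*(1-fuzz)..lifespan*(1+fuzz),
    e.g. a 50-frame lifespan with fuzz=0.2 varies between 40 and 60."""
    return [{
        "pos": list(emitter),
        "vel": list(velocity),
        "life": lifespan * random.uniform(1 - fuzz, 1 + fuzz),
    } for _ in range(rate)]

def simulate(particles, emitters, rate, lifespan, fuzz, velocity):
    """Simulation stage: spawn, integrate motion, age, and cull."""
    for e in emitters:                          # spawning at each emitter
        particles += spawn(e, rate, lifespan, fuzz, velocity)
    for p in particles:                         # motion and aging
        p["pos"][0] += p["vel"][0]
        p["pos"][1] += p["vel"][1]
        p["life"] -= 1
    return [p for p in particles if p["life"] > 0]   # remove expired particles

def display(particles):
    """Display stage placeholder: a real renderer would draw textured
    points here (e.g., via an OpenGL function database)."""
    return [tuple(p["pos"]) for p in particles]
```

With `fuzz=0.0` the sketch is deterministic: one simulation step spawns `rate` particles at each emitter, moves them by one velocity step, and leaves them with `lifespan - 1` frames to live.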
Abstract
Description
- This application claims the benefit of Taiwan Patent Application Serial No. 100116932, filed May 13, 2011, the subject matter of which is incorporated herein by reference.
- 1. Field of the Invention
- The present disclosure relates in general to a method for providing visual effect messages in an associated communication system and transmitting end, and more particularly to a method for providing messages of diversified visual effects by utilizing a small message transmission mechanism in an associated communication system and transmitting end.
- 2. Description of Related Art
- Apart from audio communications, communication systems also provide non-audio message services such as text and/or graphics short messages. For example, the text of a message is inputted at a transmitting end by a user, and is transmitted in packets of predetermined message communication protocols to a receiving end via the communication services of a communication system, so as to allow a user at the receiving end to read the text transmitted from the transmitting end. However, in typical conventional techniques, text messages are only monotonous text or graphics lacking visual variation and interactivity.
- The present disclosure is directed to a technique for providing messages with visual effects to increase diversity and interactivity of the message communication utilized in a communication system.
- According to an aspect of the present disclosure, a method for providing a visual effect message in a communication system is provided. The method may comprise: receiving an input message at a transmitting end, determining a plurality of visual effect positions according to the input message and providing visual information corresponding to a visual effect (e.g., a particle system visual effect), encoding the visual effect positions and visual information into encoded information and transmitting the encoded information to a receiving end, decoding the encoded information to retrieve the visual effect positions and visual effect information at the receiving end, and displaying the visual effect at the visual effect positions according to the visual effect information to display the visual effect message.
- According to another aspect of the present disclosure, a communication system for providing a visual effect message is provided. The communication system may comprise: a transmitting end and a receiving end. The transmitting end comprises an input module, a visual effect editing module, an encoding module, and a first communication module. The receiving end comprises a second communication module, a decoding module, and a display module.
- At the transmitting end, the input module receives an input message, and the visual effect editing module provides a plurality of visual effect positions according to the input message and provides corresponding visual effect information. The encoding module encodes the visual effect positions and the visual effect information into encoded information, which is then transmitted by the first communication module.
- At the receiving end, the second communication module receives the encoded information, and the decoding module decodes the encoded information to retrieve the visual effect positions and the visual effect information from the encoded information. The display module performs visual effect rendering at the visual effect positions according to the visual effect information to display a corresponding visual effect.
- According to yet another aspect of the present disclosure, a transmitting end of a communication system for providing a visual effect message is provided. The transmitting end may comprise: a touch screen, a visual effect editing module, a first communication module, and an encoding module. The touch screen receives an input message. The visual effect editing module, coupled to the touch screen, provides a plurality of visual effect positions according to the input message, and provides visual information corresponding to the visual effect positions. The touch screen displays a corresponding visual effect at the visual effect positions according to the visual effect information. The first communication module transmits the visual effect positions and the visual effect information according to a corresponding communication protocol to a receiving end. The encoding module encodes the visual effect positions and the visual effect information into encoded information, such that the visual effect positions and the visual effect information transmitted by the first communication module are the encoded information.
- In an embodiment, the visual effect information is particle system mode visual effect information, and the visual effect positions correspond to a plurality of coordinates on a coordinate plane. In an embodiment, a plurality of coordinates obtained by same-finger touch control are defined as a visual effect position. In an embodiment, the transmitting end and/or the receiving end displays the corresponding visual effect by calculating particle parameters of a plurality of particles generated by the visual effect positions on a page and drawing the particles via a graphic database according to the particle parameters to display the particles on the page. In an embodiment, the communication system is at least one of a GSM system, a CDMA/WCDMA system, an LTE system, a WiMAX system, and an Internet system, but such communication frameworks should not be limited to these specific types.
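The same-finger grouping mentioned in the embodiment above can be illustrated with a short sketch: touch coordinates sampled within a predetermined time period are merged into a single emitter position. This is a hypothetical illustration; the 0.05-second window and the averaging of coordinates are assumptions, not values taken from the disclosure.

```python
# Collapse touch samples from the same finger stroke (i.e., samples
# within a predetermined time window) into one visual effect position,
# reducing the number of particle emitters. Window length is assumed.

def group_touches(samples, window=0.05):
    """samples: list of (t, x, y) tuples sorted by time t in seconds.
    Returns one averaged (x, y) emitter per time window."""
    emitters = []
    group = []
    for t, x, y in samples:
        # Flush the current group once the window since its first sample elapses.
        if group and t - group[0][0] > window:
            xs = [p[1] for p in group]
            ys = [p[2] for p in group]
            emitters.append((sum(xs) / len(xs), sum(ys) / len(ys)))
            group = []
        group.append((t, x, y))
    if group:  # flush the final group
        xs = [p[1] for p in group]
        ys = [p[2] for p in group]
        emitters.append((sum(xs) / len(xs), sum(ys) / len(ys)))
    return emitters
```

Three samples 10 ms apart thus become one emitter at their mean coordinate, while a touch half a second later starts a new emitter.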
- The above and other aspects of the invention will become better understood with regard to the following detailed description of the preferred but non-limiting embodiments. The following description is made with reference to the accompanying drawings.
- FIG. 1 is a schematic diagram of a flow according to an embodiment of the present disclosure.
- FIG. 2 is a schematic diagram of a communication system according to an embodiment of the present disclosure.
- FIG. 3 shows the formation of a visual effect message according to an embodiment of the present disclosure.
- FIG. 4 is a schematic diagram of providing an interactive visual effect message according to an embodiment of the present disclosure.
FIG. 1 shows a schematic diagram of aflow 100 for providing a visual effect message according to an embodiment of the present disclosure.FIG. 2 shows a schematic diagram of acommunication system 10 according to an embodiment of the present disclosure; thecommunication system 10 may realize theflow 100.FIG. 3 shows a schematic diagram of a formation of a visual effect message according to an embodiment of the present disclosure. Thecommunication system 10 inFIG. 2 comprises twoterminals communication system 10 is a wireless audio mobile communication system, and provides system services including routing, exchange, and transmission between the twoterminals exchange system 22. To realize communication, theterminals communication modules terminal 12 a further comprises aninput module 14, a visualeffect editing module 16, and anencoding module 18. Correspondingly, theterminal 12 b is a receiving end comprising adecoding module 26 and adisplay module 28. - The
flow 100, performed by thecommunication system 10, comprises steps to be described below in further detail. - In
Step 102, an input message is received to provide a message content, and a corresponding visual effect is determined. At theterminal 12 a, theinput module 14 receives the input message from a user via a keyboard, a cursor device, and/or a touch sensor. For example, a content of a text message is inputted by a user at theterminal 12 a via a keyboard; alternatively, text and/or graphics are written by a user comprising message content via touch control or other means. For example, theinput module 14 may comprise a touch screen. Alternatively, one or multiple input messages comprising predetermined contents are built-in (i.e., stored in) theterminal 12 a for a user to select from. The input message may be static or dynamic. For example, each input message may comprise a plurality of frames respectively comprising different contents to composite a dynamic message. - Upon obtaining the input message the visual
effect editing module 16 provides a plurality of visual effect positions and visual information according to the input message. With reference toFIG. 3 , to provide a visual effect message, thevisual effect module 16 defines a plurality of visual effect positions (e.g., P(i) and P(i+1)) according to text and/or graphics of the input message. It is the spirit of the present disclosure to display visual effects at the visual effect positions to present a visual effect message. Therefore, the visualeffect editing module 16 provides the corresponding visual effect information of the visual effect to be presented at the visual effect positions. For example, the visual effect is a particle system visual effect. The particle system is a system that presents a visual effect by simulating micro particles in a three-dimensional graphics space, and such visual effect may include glittering fireworks, smoke, snowflakes, stardust, clouds and fogs, fireflies, and/or bubbles. The visual effect information is for defining parameters of the particle system visual effect, e.g., parameters of initial positions, spawning rate, initial velocity vector, types, shapes, colors, sizes, density, brightness, distribution range, lifespan, and fuzzy parameters of the particles. For example, the lifespan of the particle system is a period between a start time to an end time for presenting a special effect or a fade-in time to a fade-out time of the particles, and/or a lasting period of the particles. - In an embodiment, the spawning rate represents the number of particles generated within a unit time, the initial velocity vector represents an initial velocity at the time when the particles are generated, and the fuzzy parameter represents a tolerable variation range for the visual effects produced by the parameters when the above values of parameters are regarded as a center. 
For example, for a particle having a lifespan of 50 frames, 20% as the fuzzy parameter means that the lifespan of the particle varies between 40 and 60 frames. The initial position is an initial position at the time when the particle is generated, and is also referred to as an emitter of the particle.
- The
visual effect module 16 may automatically define coordinates of the visual effect positions along geometric coordinates of the input message according to a predetermined algorithm, and provide a visual effect editing interface that allows a user to select parameters of visual effects as well as providing a preview of the visual effect message. Alternatively, the visualeffect editing module 16 allows a user to tune the visual effect position, or edit/select a background such as color, effect, graphics, texture, and/or patterns of the message to incorporate associated information of the background to the visual effect information. - In
Step 104, theencoding module 18 consolidates and encodes the visual effect positions and the visual effect information to corresponding encoded information. For example, an encoding approach similar to the run length coding is implemented to compress and encode the visual effect positions and the visual effect information to reduce the message transmission amount of the visual effect positions and the visual effect information. The encoded information may be regarded as a draft of the visual effect message and is stored in the terminal 12 a. - In
Step 106, the visual effect message to be sent out is selected by the user. For example, the user selects the visual effect message to be sent out from the previously stored visual effect message draft. - In
Step 108, the communication module 20 in the terminal 12a transmits the encoded information. For example, the communication module 20 packages the encoded information into communication signals according to a predetermined message communication protocol, and transmits the communication signals to the exchange system 22 in the communication system 10. For example, the exchange system 22 is at least one of a GSM system, a CDMA/WCDMA system, an LTE system, a WiMAX system, and an Internet system. - In
Step 110, the exchange system 22 in the communication system 10 provides communication transmission services to transmit the communication signals from the terminal 12a to the terminal 12b. For example, the exchange system 22 is provided with a base station and an exchange server (not shown) to realize routing, exchange, and transmission of the communication signals. - In
Step 112, the communication module 24 in the terminal 12b receives and decodes the communication signals from the terminal 12a to retrieve the encoded information. The communication module 24 complies with the same message communication protocol as the communication module 20, so as to correctly retrieve the encoded information of the visual effect message from the communication signals. - In
Step 114, the decoding module 26 decodes the encoded information to retrieve the visual effect positions and the visual effect information. The encoding module 18 and the decoding module 26 comply with the same encoding and decoding protocol, so as to allow the decoding module 26 to correctly retrieve the visual effect positions and the visual effect information from the encoded information. - In
Step 116, the display module 28 performs rendering and displays the visual effects at the visual effect positions according to the visual effect information, so as to present the visual effect message to the user at the terminal 12b and accomplish the visual effect message transmission. - The visual effect positions and the visual effects of the visual effect information may be 2D or 3D, and the user at the terminal 12b may adjust an angle and a position of the visual effect message. For example, as shown in
FIG. 4, when the visual effect message is displayed by the terminal 12b on a monitor 30, the user is capable of assigning a display angle of the visual effect message to interact with the visual effect message. Alternatively, the terminal 12b may automatically adjust the angle of the visual effect message to dynamically display the visual effect message. - In the embodiment shown in
FIG. 2, among the modules in the terminals, the display module 28 is a hardware accelerating circuit with 3D graphics processing capabilities to quickly process the particle system visual effect in real time. The terminal 12a may also be provided with the decoding module 26 and the display module 28, and the terminal 12b may also be provided with the input module 14, the visual effect editing module 16 and the encoding module 18, so that the terminal 12b may also transmit visual effect messages to the terminal 12a. - With reference to
FIG. 3, a process of converting text or line patterns to a particle system shall be described. In FIG. 3, a touch screen serving as a message input interface comprises a plurality of pixels, and may have a 640×480, 1024×768, or 1280×1024 pixel distribution according to a resolution of the touch screen, although any resolution and type of display and input device is likewise suitable. According to the pixel distribution, a predetermined pixel may be determined as an origin point to define a 2D coordinate plane. When the touch screen is touched by a user, the coordinate corresponding to the touched position is recorded, and the corresponding pixel presents a texture or color level different from the background. - The coordinate corresponding to the touched position is stored as an initial position of the particle system. The coordinates may be defined with different spawning rates, initial velocity vectors, particle graphic patterns and lifespans, so as to present visual effects including glittering fireworks, smoke, snowflakes, stardust, clouds and fog, fireflies, and bubbles according to the aforementioned parameters. The aforementioned parameters may be defined as particle parameters of the particles.
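The storing of touched coordinates as emitters can be sketched as follows (a hypothetical illustration; the names, default parameter values, and dictionary layout are assumptions, not fixed by the disclosure):

```python
# Illustrative particle parameters attached to each stored coordinate.
DEFAULT_PARAMS = {
    "spawn_rate": 5,            # particles generated per frame
    "velocity": (0.0, -1.0),    # initial velocity vector
    "pattern": "firework",      # particle graphic pattern
    "lifespan": 50,             # frames
}

def record_touch(emitters, x, y, width=640, height=480):
    """Store an on-screen touched pixel coordinate as a particle
    emitter (the initial position of the particle system), together
    with the particle parameters to apply at that coordinate."""
    if 0 <= x < width and 0 <= y < height:
        emitters.append({"pos": (x, y), **DEFAULT_PARAMS})
    return emitters
```

The list of emitter records built this way corresponds to the visual effect positions and visual effect information that are later consolidated and encoded.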
- The particle system according to the embodiment of the present disclosure sequentially updates the particle conditions displayed by each frame according to a frame update rate. The corresponding display process may be largely divided into a simulation stage and a display stage. In the simulation stage, according to the different spawning rates, a coordinate of at least one particle is defined, to which a corresponding type, shape, color, size, density, brightness, distribution range, lifespan, and fuzzy parameter are assigned according to the parameters of the coordinate. Each time the frame is updated, all existing particles are checked to determine whether their lifespans are fulfilled or exceeded. For example, when a predetermined particle's lifespan is fulfilled or exceeded, the particle and its related parameters are removed, so that it is no longer displayed on the screen during the display stage. Further, collisions between the particles, and between the particles and the background, are also simulated in the simulation stage. Therefore, the operations for the particles to be displayed in a next frame are completed and stored in the simulation stage, so that the particles can be displayed on the screen in the display stage.
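The per-frame simulation stage described above can be sketched as follows (a minimal sketch under the assumption of the emitter/particle record layout used earlier; collision handling is omitted):

```python
def simulate_frame(particles, emitters):
    """One simulation-stage pass: spawn new particles at each emitter
    according to its spawning rate, age every existing particle, and
    remove those whose lifespan is exceeded so that they are skipped
    in the display stage."""
    for e in emitters:
        for _ in range(e["spawn_rate"]):
            particles.append({"pos": e["pos"], "age": 0,
                              "lifespan": e["lifespan"]})
    survivors = []
    for p in particles:
        p["age"] += 1
        if p["age"] <= p["lifespan"]:
            survivors.append(p)
    return survivors
```

With a fixed spawning rate and lifespan, the particle count reaches a steady state of roughly `spawn_rate × lifespan` particles per emitter, which is why the lifespan and spawning-rate parameters directly bound the rendering workload.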
- In the display stage, the colors and graphic textures of the particles whose condition simulation is completed are displayed on the screen via a graphics function library (e.g., an OpenGL function library). Thus, the text or graphics entered by the user on the touch screen are recorded, converted to particle system parameters, and presented on the screen as a particle system visual effect. The particle parameters corresponding to the touched position coordinates are also recorded during the process to realize the foregoing visual effect positions and visual effect information.
- It is to be noted that a pixel is regarded as an initial position of a particle in this embodiment; however, since a user's finger generally covers a range larger than a single pixel on the touch screen, a plurality of pixels touched by the same finger touch of the user (i.e., pixels touched by the user within a predetermined time period) are regarded as the same initial position, so as to realize a similar visual effect while reducing system resource consumption. More specifically, in the present disclosure, a plurality of coordinates may be defined as one emitter to realize similar visual effects while reducing system resource consumption. Further, although a 2D coordinate plane is implemented for display, operations in the simulation stage may be performed in a 3D coordinate space for the particle system and then projected onto the 2D coordinate plane. To further reduce system resource consumption, the particle system may also perform the particle operations on a 2D coordinate plane.
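The grouping of pixels touched within a predetermined time period into one emitter might look like the following sketch (the centroid placement and the window value are assumptions; the disclosure only requires that the grouped pixels share one initial position):

```python
def group_touches(touches, window=0.1):
    """Group touch samples (t, x, y) whose timestamps fall within the
    same predetermined time window into a single emitter, placed here
    at the centroid of the grouped pixels."""
    emitters, current = [], []
    for t, x, y in sorted(touches):
        if current and t - current[0][0] > window:
            xs = [p[1] for p in current]
            ys = [p[2] for p in current]
            emitters.append((sum(xs) / len(xs), sum(ys) / len(ys)))
            current = []
        current.append((t, x, y))
    if current:
        xs = [p[1] for p in current]
        ys = [p[2] for p in current]
        emitters.append((sum(xs) / len(xs), sum(ys) / len(ys)))
    return emitters
```

Collapsing a finger-sized cluster of pixels into one emitter trades a negligible loss in precision for a large reduction in the number of particle systems that must be simulated per frame.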
- With the embodiments above, it is illustrated that the present disclosure provides visual effects and interactivity to messages, so that messages offer greater flexibility, vividness, and diversity to add amusement to the information world.
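As a concrete illustration of the run-length-style encoding mentioned in Step 104, a minimal sketch follows (the `(value, count)` pairing is an assumption; the disclosure does not fix the exact encoded format):

```python
def run_length_encode(values):
    """Collapse runs of identical values into (value, count) pairs."""
    encoded = []
    for v in values:
        if encoded and encoded[-1][0] == v:
            encoded[-1] = (v, encoded[-1][1] + 1)
        else:
            encoded.append((v, 1))
    return encoded

def run_length_decode(pairs):
    """Expand (value, count) pairs back into the original sequence."""
    return [v for v, n in pairs for _ in range(n)]

# Repeated particle parameters (e.g., the same pattern over many
# visual effect positions) compress well under this scheme:
data = ["firework"] * 4 + ["smoke"] * 2
packed = run_length_encode(data)
```

Because neighboring visual effect positions typically share the same particle parameters, such a scheme reduces the transmission amount while remaining trivially reversible at the receiving end.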
- While the invention has been described by way of example and in terms of the preferred embodiments, it is to be understood that the invention is not limited thereto. On the contrary, it is intended to cover various modifications and similar arrangements and procedures, and the scope of the appended claims therefore should be accorded the broadest interpretation so as to encompass all such modifications and similar arrangements and procedures.
Claims (20)
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
TW100116932 | 2011-05-13 | ||
TW100116932A TWI517728B (en) | 2011-05-13 | 2011-05-13 | Method and transmission terminal of associated communication system providing messages of visual effects |
TW100116932A | 2011-05-13 |
Publications (2)
Publication Number | Publication Date |
---|---|
US20120289262A1 true US20120289262A1 (en) | 2012-11-15 |
US8897821B2 US8897821B2 (en) | 2014-11-25 |
Family
ID=47142203
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/450,621 Active US8897821B2 (en) | 2011-05-13 | 2012-04-19 | Method for providing visual effect messages and associated communication system and transmitting end |
Country Status (2)
Country | Link |
---|---|
US (1) | US8897821B2 (en) |
TW (1) | TWI517728B (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112700517A (en) * | 2020-12-28 | 2021-04-23 | 北京字跳网络技术有限公司 | Method for generating visual effect of fireworks, electronic equipment and storage medium |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20080049025A1 (en) * | 1997-07-10 | 2008-02-28 | Paceworks, Inc. | Methods and apparatus for supporting and implementing computer based animation |
WO2008054062A1 (en) * | 2006-11-01 | 2008-05-08 | Polidigm Co., Ltd | Icon combining method for sms message |
US20080204362A1 (en) * | 2007-02-23 | 2008-08-28 | Sony Ericsson Mobile Communications Ab | Extension of display lifetime |
US20080220797A1 (en) * | 2007-03-09 | 2008-09-11 | Sony Ericsson Mobile Communications Ab | Portable communication device and method for media-enhanced messaging |
US20080280633A1 (en) * | 2005-10-31 | 2008-11-13 | My-Font Ltd. | Sending and Receiving Text Messages Using a Variety of Fonts |
US20110047476A1 (en) * | 2008-03-24 | 2011-02-24 | Hochmuth Roland M | Image-based remote access system |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CA2193764A1 (en) | 1995-12-25 | 1997-06-25 | Yasuyuki Mochizuki | Selective call receiver |
JP4288449B2 (en) | 1999-02-16 | 2009-07-01 | 株式会社セガ | Image display device, image processing device, and image display system |
- 2011-05-13: TW TW100116932A patent/TWI517728B/en not_active IP Right Cessation
- 2012-04-19: US US13/450,621 patent/US8897821B2/en active Active
Also Published As
Publication number | Publication date |
---|---|
US8897821B2 (en) | 2014-11-25 |
TWI517728B (en) | 2016-01-11 |
TW201246974A (en) | 2012-11-16 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN111279705B (en) | Method, apparatus and stream for encoding and decoding volumetric video | |
CN112235626B (en) | Video rendering method and device, electronic equipment and storage medium | |
US20180189980A1 (en) | Method and System for Providing Virtual Reality (VR) Video Transcoding and Broadcasting | |
US9717988B2 (en) | Rendering system, rendering server, control method thereof, program, and recording medium | |
CN107911708B (en) | Barrage display method, live broadcast method and related devices | |
US20090002368A1 (en) | Method, apparatus and a computer program product for utilizing a graphical processing unit to provide depth information for autostereoscopic display | |
CN107770618B (en) | Image processing method, device and storage medium | |
CN109725956B (en) | Scene rendering method and related device | |
CN108668168B (en) | Android VR video player based on Unity3D and design method thereof | |
CN108604389B (en) | Continuous depth-ordered image synthesis | |
US20170186243A1 (en) | Video Image Processing Method and Electronic Device Based on the Virtual Reality | |
CN113946402A (en) | Cloud mobile phone acceleration method, system, equipment and storage medium based on rendering separation | |
CN111464828A (en) | Virtual special effect display method, device, terminal and storage medium | |
CN113141537A (en) | Video frame insertion method, device, storage medium and terminal | |
CN111064863A (en) | Image data processing method and related device | |
CN109993817B (en) | Animation realization method and terminal | |
CN112218132B (en) | Panoramic video image display method and display equipment | |
CN113411537A (en) | Video call method, device, terminal and storage medium | |
US8897821B2 (en) | Method for providing visual effect messages and associated communication system and transmitting end | |
WO2014171066A1 (en) | Three-dimensional image display system, server for three-dimensional image display system, and three-dimensional image display method | |
CN111930233A (en) | Panoramic video image display method and display equipment | |
CN112019906A (en) | Live broadcast method, computer equipment and readable storage medium | |
CN110941413B (en) | Display screen generation method and related device | |
CN114693885A (en) | Three-dimensional virtual object generation method, apparatus, device, medium, and program product | |
CN116173496A (en) | Image frame rendering method and related device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: MSTAR SEMICONDUCTOR, INC., TAIWAN Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:HUANG, CHIH-HSIEN;YU, SHENG-CHI;SIGNING DATES FROM 20120318 TO 20120320;REEL/FRAME:028072/0700 |
|
FEPP | Fee payment procedure |
Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
|
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |
|
MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551) Year of fee payment: 4 |
|
AS | Assignment |
Owner name: MEDIATEK INC., TAIWAN Free format text: MERGER;ASSIGNOR:MSTAR SEMICONDUCTOR, INC.;REEL/FRAME:052931/0468 Effective date: 20190115 |
|
AS | Assignment |
Owner name: XUESHAN TECHNOLOGIES INC., CANADA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:MEDIATEK INC.;REEL/FRAME:055486/0870 Effective date: 20201223 |
|
FEPP | Fee payment procedure |
Free format text: MAINTENANCE FEE REMINDER MAILED (ORIGINAL EVENT CODE: REM.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
|
FEPP | Fee payment procedure |
Free format text: 7.5 YR SURCHARGE - LATE PMT W/IN 6 MO, LARGE ENTITY (ORIGINAL EVENT CODE: M1555); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
|
MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1552); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY Year of fee payment: 8 |