MX2007012813A - Multimedia communication system and method - Google Patents

Multimedia communication system and method

Info

Publication number
MX2007012813A
MX2007012813A (application MX/A/2007/012813A)
Authority
MX
Mexico
Prior art keywords
communication
project
communication system
media
user
Prior art date
Application number
MX/A/2007/012813A
Other languages
Spanish (es)
Inventor
Neil Greer
Bennett Blank
Bryan Depew
Original Assignee
Bennett Blank
Bryan Depew
Neil Greer
Impact Engine Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Bennett Blank, Bryan Depew, Neil Greer, Impact Engine Inc
Publication of MX2007012813A

Abstract

Systems and methods are disclosed for creating, editing, sharing and distributing high-quality, media-rich web-based communications. The communications are created in a layered fashion that integrates user-selected text, colors, background patterns, images, sound, music, video, or other media. The systems and methods are used to generate, edit, broadcast, and track electronic presentations, brochures, advertisements (such as banner advertisements on highly trafficked media websites), announcements, and interactive web pages, without the need for the user to understand complex programming languages.

Description

MULTIMEDIA COMMUNICATION SYSTEM AND METHOD

BACKGROUND
The present application claims priority under 35 U.S.C. §119 to U.S. Provisional Application Serial No. 60/671,170, filed on April 13, 2005 and entitled MULTIMEDIA COMMUNICATION SYSTEM AND METHOD, the disclosure of which is incorporated herein by reference. In the current Internet age, the development of a piece of communication such as a presentation, banner advertisement, website, or brochure, whether static or dynamic using multimedia, is usually contracted out to a professional graphic designer. Such a professional is usually part of a professional agency, such as an advertising agency, which is very expensive for small businesses (i.e., sole proprietors or small companies) and may be unnecessarily expensive for larger companies. These agents or agencies consume large amounts of resources, particularly time and/or money, to create a rich-media communication such as a website, an e-mail campaign, a banner advertisement, or other communication. Accordingly, a system and method are needed to automate the process of creating and distributing professional-quality, media-rich communications.
SUMMARY
This document describes systems and methods for creating, editing, sharing, and distributing media-rich Internet-based communications, also known as "engines" or "creative works." Communications can be created in layers through which text, colors, background patterns, images, sound, music, and/or video are integrated. Other media may also be used. The systems and methods can be used to generate, edit, transmit, and track electronic presentations, brochures, advertisements (such as banner advertisements on high-traffic websites), announcements, and interactive Web pages.

In one aspect, a method and apparatus are provided for dividing the work of creating a multimedia communication file into a step-by-step logical process in which no programming intervention is required from start to finish. In a specific exemplary embodiment, the multimedia file is based on Flash™, authoring software developed by Macromedia for vector-graphics-based animation programs with full-screen navigation interfaces, graphic figures, and simple interactivity in a resizable, antialiased format that is small enough to be transmitted over any type of Internet connection and played while downloading. Other multimedia software and/or other protocols may be used. In specific embodiments, a system and method are provided for creating and/or delivering multimedia files through a SaaS model, and for loading media resources into an advertising engine over the Internet. In other embodiments, a system and method are provided for automatically creating and hosting data-specific communications for use in Web sites, presentations, advertisements, brochures, and the like, for use with various media, systems, and networks. Data-specific communications include, but are not limited to, data related to software programs, Web services, proprietary data from third-party databases, persons, locations, keywords, companies, and combinations thereof.

In another aspect, a method and system are provided for automatically extracting and formatting multimedia code, such as Flash™ code or other ActionScript, for use as a template that can be edited through the user interface without the intervention of a programmer, and for providing editorial control of multimedia files, keyword files, and specific content or websites by a master user that controls the editorial rights of a number of secondary users within the system, ranging from 1 to N. In still other aspects, a method and apparatus are provided for the online creation and editing of multimedia files compiled from a data set; for the creation, editing, and distribution of multimedia files created from a wide variety of content including video, audio, images, text, raw data, Flash™ programs, Web services, or other media-rich content; and for automatically determining the "content" to be included in a communication based on responses to a series of prompts or questions of an interview and/or other metadata. In still other aspects, a method and apparatus are provided for automatically determining the "appearance" of a communication based on a series of interview questions and/or other metadata, and for combining data, content, and "appearance" to create unique communications. Other systems and methods are provided to convert unique communications to various formats and media, such as Web sites, multimedia files, print media, video, etc.
The details of one or more embodiments are set forth in the accompanying figures and the description below. Other features and advantages will be apparent from the description and figures, and from the claims.
BRIEF DESCRIPTION OF THE FIGURES
These and other aspects will now be described in detail with reference to the following figures.
Figure 1 illustrates a multimedia communication system.
Figure 2 illustrates a method (200) for creating templates, including the creation of one or more communication templates.
Figure 3 illustrates a method (300) for customizing templates and using media resources.
Figure 4 illustrates a method (400) for distributing and tracking communications.
Figure 5 illustrates the sharing of media resources between users.
Figures 6-16 are block diagrams illustrating a general system and method for creating, distributing, and tracking hypermedia-based multimedia communications.
Similar reference symbols in the various figures indicate similar elements.
DETAILED DESCRIPTION
The systems and methods described in this document relate to what is known as "software as a service" (SaaS), a software distribution model in which applications are hosted by a service provider and made available to users over a network such as the Internet. The systems and methods include the use of templates and a thin-client interface to create multimedia communications. Access to the low-level functionality of the multimedia communication system is provided through a set of easy-to-understand function calls and built-in components used to populate the template. In addition, an API gives the user access to the full scope of a programming language, which allows templates to be extended without requiring in-depth knowledge of a programming or authoring language to produce professional template-based communications. Additionally, the systems provide sample source files to encourage reverse engineering.
Figure 1 illustrates a multimedia communication system (100) for creating, storing, and distributing multimedia communications (hereinafter, "communications"), for example, content-rich e-mail messages, Web sites, and Web site segments. The communication system (100) includes a communication building engine (102) that interacts with a client user interface (104) through a network (106). The client user interface (104) can be a window in a browser application running on a personal computer. The network (106) is preferably the Internet, but it can be any type of network, particularly when used in a client/server configuration. The communication building engine (102) includes a project builder (108) for generating a project viewer (118) through which a user can view and assemble various media or resource components into an integrated communication. The communication building engine (102) also includes a media repository (110) for storing communication project templates, media resources, communication project metadata, and any other data resource used to create, store, and distribute completed communication projects. Completed communication projects are accessed from the media repository (110) and distributed to selected recipients through a distribution program (112). The distribution program (112) controls the formats and communication protocols used to distribute the communications. The communication building engine (102) also includes a file sharing program (114), which prompts the user to provide distribution parameters such as the type of communication (e-mail, website, etc.), the number and type of recipients, and the communication medium through which the communication is to be sent. The file sharing program (114) can also inform the sender of certain qualitative and quantitative data such as the result of the transmission, responses received from the recipients, etc.

A communication is a set of slides. The number of slides for any given communication project can vary in the range of 0 to N. The types of slides available for each particular communication project depend on the type of communication and are defined in the class XML file, which defines a template class. The template class is chosen based on various information entered by the user. For example, a template class is chosen based on the responses to an interview/consultation process completed by the system user before creating the communication. This allows the system to offer only the slide types that are relevant to the user's responses to the interview and/or consultation process. Slides are a grouping of design layers, design elements, and content containers. The design layers are predefined and remain static; however, they can accommodate any content layout that the template designer considers necessary. In an exemplary embodiment, the slide layers include the background, main, foreground, and navigation layers. There is a single core design file for each layer except the main layer; the core files are as follows: background.fla, slideTypeN.fla (main layer), foreground.fla, and nav.fla. The number of slideTypeN.fla core files depends on the number of slide types defined for the class in question. For example, a specific class has five slide types defined in its class XML file; therefore, there are five main core design files (slideType0.fla, slideType1.fla, slideType2.fla, slideType3.fla, and slideType4.fla).
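By way of illustration only, the following ActionScript 1.0 sketch shows one way a project viewer could stack the published counterparts of the core design files into depth-ordered layers. The file names follow the convention described above, but the clip names and loader logic are assumptions, not code from this document.

```actionscript
// Hypothetical layer loader (AS 1.0): stack the core files in depth order.
// Bottom to top: background, main (slideTypeN), foreground, nav.
var layerFiles = ["background.swf", "slideType0.swf", "foreground.swf", "nav.swf"];

for (var i = 0; i < layerFiles.length; i++) {
    // One clip per layer; higher depth values render on top of lower ones.
    var layerClip = _root.createEmptyMovieClip("layer" + i, i);
    layerClip.loadMovie(layerFiles[i]);   // load the published .swf for that layer
}
```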
A class is a unique set of slide type(s). The number of slide types in any particular class can vary from 1 to N. Classes are used to organize the types of communication by the amount and type of content shown on each slide in the class. For example, a template class can have five unique slide types, where each slide type contains no more than a certain number of content containers. In one embodiment, a slide type contains no more than five content containers, although more than five content containers can be used. However, instead of adding a new slide type to an existing class, a new class can be created, for example, to include the new slide type(s). A class is defined, and a sufficient number of slide types are provided, for the user to achieve their design goals, but the total number of slide types is limited so that the user is not overloaded with too many options. The system manages and controls the creation and maintenance of all classes.

A slide type is a unique set of media containers. The number of containers on any particular slide type can vary from 1 to N. Slide types are used to organize the amount and type of content that will be displayed on any particular slide. In an exemplary embodiment, a number of standard container types can be used when creating a slide type. A text container includes text components and is used to display HTML-formatted text; an image container includes image components and is used to display images and .swf files; a video container includes video components and is used to display streaming video. An audio container includes audio components and is used to provide audio streams or audio files. The user is responsible for the layout of the containers that appear on a slide. The quantities and types of containers for a given slide type are defined in the class XML file. Beyond following the naming convention defined in the class XML file for the containers, the system is flexible and allows the user to arrange the containers in any layout he or she chooses. Each content-type component, or media resource, can be represented in a palette of content types for selection by a user and incorporation into a communication.

The project viewer, such as the project viewer (118) shown in Figure 1, is an application that generates, or "serializes," the slides and content of the communication project and provides their functionality. When the project viewer runs, it is passed a structure of associated data and software called the project object. The project object contains the information necessary for the communication project to be generated and played back as configured by the end user. The slides are represented in the project object as elements in an array. Once the project object is loaded and interpreted, the project viewer determines a loading sequence for the content of the communication project. The project viewer is agnostic regarding the type of file it is generating and is therefore capable of reproducing a wide variety of communications, such as websites, dynamically created websites, Flash™ advertisements, presentations, brochures, advertising on third-party sites, and the like. The content is loaded into the specific design layer (for example, background, foreground, etc.) assigned by the end user. As each layer is loaded in the loading sequence, the project viewer loads the content into the containers in that layer.
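Purely as an illustrative sketch (the field names below are assumptions and are not defined in this document), a project object of the kind described above could be assembled in ActionScript 1.0 roughly as follows, with the slides held as elements of an array:

```actionscript
// Hypothetical project object (AS 1.0); property names are illustrative only.
var projectObject = new Object();
projectObject.autoPlay = true;                // playback state: autoplay on or off
projectObject.slides = new Array();

// One slide entry: its slide type and the content assigned to its containers.
projectObject.slides[0] = {
    slideType: 2,                             // selects the slideType2 core file
    duration: 8,                              // seconds before auto-advance (0 = wait for user)
    containers: {
        foregroundTextA: "<b>Welcome</b>",    // HTML text container
        imageContainer1: "media/logo.swf"     // image/.swf container
    }
};
```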
Once the loading sequence has finished running, the communication project begins playback. Playback of the communication project has two states: autoplay on and autoplay off. In one embodiment, if autoplay is turned on, the project viewer examines the duration property of the current slide. If the value of the property is greater than zero, the project viewer waits that number of seconds before automatically advancing to the next available slide in the communication project. If the value of that property is equal to zero, the project viewer stops on the slide until the user navigates to a different slide. If autoplay is off, users can use the slide navigation controls to view a different slide.

The project viewer also provides the conduit for the exchange of information and/or instructions between the different design layers, or between the project viewer itself and a specific layer, referred to here as the Slide Layer Interface. This interface not only activates the basic "built-in" functionality between the layers, their containers, and the project viewer, but also allows much greater programming control for advanced developers, because the Slide Layer Interface is essentially a set of pointers. In one embodiment, this interface allows the direct use of AS 1.0 as the command language. This allows the creation of highly functional and complex core files to satisfy any customization needs that fall within the scope of AS 1.0 programming, the specification of which is incorporated by reference in this document. Any content that loads into the main layer changes from one slide type to another. Any content that is loaded into the background, foreground, or navigation layers remains constant and does not change between slides. That content is known as "universal content" and usually consists of logos, communication titles, headers, etc. These mechanisms allow the slide layers to communicate with each other, as well as to load any type of content into any layer. All the complex programming necessary to control the loading, playback, and functionality of the content has been incorporated into the project viewer and container components.
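A minimal ActionScript 1.0 sketch of the auto-advance rule described above might look like the following; the property and helper names are assumptions for illustration, not identifiers defined by this document.

```actionscript
// Hypothetical auto-advance logic (AS 1.0): advance when duration > 0, otherwise wait.
var advanceTimer;

function scheduleAdvance(slide) {
    if (!projectObject.autoPlay) {
        return;                               // autoplay off: user navigates with the controls
    }
    if (slide.duration > 0) {
        // Wait 'duration' seconds, then move to the next available slide.
        advanceTimer = setInterval(advanceSlide, slide.duration * 1000);
    }
    // duration == 0: stop on this slide until the user navigates away.
}

function advanceSlide() {
    clearInterval(advanceTimer);
    gotoNextSlide();                          // assumed helper that loads the next slide
}
```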
The system includes several core design files. One such file is background.fla. This file is loaded at the lowest position in the project viewer; any content or design elements that need to appear behind other content or design elements should be placed in this core file. The background.fla file has several native functions. initTemplateObject(): this function is activated after the first frame is fully loaded, and creates the templateObject object used by the project viewer. setValues(): this function is activated after the ieController is assembled and distributed to the various layers. The color information is retrieved from the ieController object and stored in local variables (color1Value, color2Value, color3Value). These values can be used to dynamically color the shape elements (that is, the movie clips) used in the template. This function is also used to distribute any image, .swf, video, or HTML text content to the appropriate movie clips for the currently selected slide. startPlayback(): this function is activated by the project viewer after this .swf has been fully loaded and initialized.

Another core design file is foreground.fla. This file is loaded just under the top position (nav.fla) in the project viewer.
Any content or design elements that need to appear above other content or design elements (except the navigation controls) must be placed in this core file. The native functions of foreground.fla include: initTemplateObject(): activated after the first frame is fully loaded; creates the templateObject object used by the project viewer. setValues(): activated after the ieController is assembled and distributed to the various layers; retrieves the color information from the ieController object and stores it in local variables (color1Value, color2Value, color3Value), which can be used to dynamically color the shape elements (that is, the movie clips) used in the template, and distributes any image, .swf, video, or HTML text content to the appropriate movie clips for the currently selected slide. startPlayback(): activated by the project viewer after this .swf has been fully loaded and initialized.

Another core design file is intro.fla. This file is loaded before any other core file; no other core file is generated until this file has finished executing. It is located in the layer above the nav.fla file. The native functions of this file include: initTemplateObject(): activated after the first frame is fully loaded; creates the templateObject object used by the project viewer. setValues(): activated after the ieController is assembled and distributed to the various layers.
The color information is retrieved from the ieController object and stored in local variables (color1Value, color2Value, color3Value). These values can be used to dynamically color the shape elements (that is, the movie clips) used in the template. This function is also used to distribute any image, .swf, video, or HTML text content to the appropriate movie clips for the currently selected slide.

A core design file slideTypeN.fla is loaded above the background file and under the foreground file. The main slide content usually appears in this file. Its functions include: initTemplateObject(): activated after the first frame is fully loaded; creates the templateObject object used by the project viewer. setValues(): activated after the ieController is assembled and distributed to the various layers; retrieves the color information from the ieController object and stores it in local variables (color1Value, color2Value, color3Value), which can be used to dynamically color the shape elements (that is, the movie clips) used in the template, and distributes any image, .swf, video, or HTML text content to the appropriate movie clips for the currently selected slide. startPlayback(): activated by the project viewer after this .swf has been fully loaded and initialized.

A core design file nav.fla is loaded above the foreground file and includes the navigation controls. The visibility of the navigation controls is determined by the end user; setting visibility to false causes the project viewer to skip loading this file. Its native functions include: initTemplateObject(): activated after the first frame is fully loaded; creates the templateObject object used by the project viewer. setValues(): called after the ieController is assembled and distributed to the various layers; retrieves the color information from the ieController object and stores it in local variables (color1Value, color2Value, color3Value), which can be used to dynamically color the shape elements (that is, the movie clips) used in the template, and distributes any image, .swf, video, or HTML text content to the appropriate movie clips for the currently selected slide. buildNavigation(): activated by the navPane segment after it is fully loaded into the timeline and after the ieNavXML XML object has been created and placed in that timeline. The ieNavXML XML object is created within the project viewer based on the tree structure of the slides (that is, how they are organized in the tree hierarchy). The main options are represented by the parent nodes in the XML object; menu items are children of their specific parent node. changeSlide(optionNumber, itemNumber): called when an item in the navigation menu controls is clicked. The options are grouped into main options and secondary options. The first main option is indexed at zero, and the first secondary option of each main option is also indexed at zero. When a menu item is clicked, the index of the main option it belongs to is passed as the optionNumber parameter, and the value of the itemNumber parameter is the position of the menu item in the list of secondary options. For example, the third secondary option "About our company" under the second main option "About us" would activate changeSlide(1, 2).
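For illustration only, the native functions of a core layer file might be laid out in ActionScript 1.0 along the following lines. The internals of ieController and templateObject are assumptions, since this document describes only their roles, not their structure.

```actionscript
// Hypothetical skeleton of a core layer file's native functions (AS 1.0).
function initTemplateObject() {
    // Runs after the first frame has fully loaded; registers this layer with the viewer.
    templateObject = new Object();
    templateObject.layerName = "background";        // assumed field
}

function setValues() {
    // ieController is assembled by the project viewer and distributed to every layer.
    color1Value = ieController.color1;              // assumed property names
    color2Value = ieController.color2;
    color3Value = ieController.color3;
    new Color(backgroundShape).setRGB(color1Value); // tint an assumed shape movie clip
}

function startPlayback() {
    // Called by the project viewer once this .swf is fully loaded and initialized.
    gotoAndPlay(1);
}
```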
Xml" defines the class. It is provided only as a reference about how containers are declared within a type of slide, and how the types of slides are declared within the class. This file is used by the project viewer application and the project builder application to determine the types of slides available and locate the containers within the slide.
Container components
Functional examples of container components are provided in a Source.fla folder to illustrate how container components are integrated into the design of the template. These examples show a fully functional template, so a thorough knowledge of the inner workings of the components is not necessary. Once the user is comfortable with the core design files and how the components work, the system provides different ways to apply design style changes to the components.
Image component
The image component is a multimedia module used within the core design files to load and display images and/or .swf files. One such multimedia module is based on a Macromedia Flash MX® component, which in turn is based on AS 1.0. The user integrates and positions this component in the design. Once finished, the component can load and display any image or .swf content that the user assigns to it. The image component is easy to integrate into any graphic design or animation scheme and does not restrict the user from using Flash™ animation or other visual effects. The image component is used only in edit mode. From the main timeline within a core template file (for example, in a five-slide class, foreground.swf), this component can be found in the frame labeled "staticView", inside a movie clip called foregroundGraphicA. The initLayout() method is used to initialize the component and prepare it to begin loading image or .swf content. Properties include: containerWidth: sets the width of the display panel. containerHeight: sets the height of the display panel. containerPath: the Flash™ path of the component, as defined in the class XML file. slideLayer: defines the layer in which the component is located; valid values can include "foreground", "background", and/or "communication".
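As a hedged illustration (the instance path and values below are invented), configuring and initializing an image container from a core file's timeline in ActionScript 1.0 could look like this:

```actionscript
// Hypothetical configuration of an image container component (AS 1.0).
var img = foregroundGraphicA.imageContainer;   // assumed instance path
img.containerWidth = 320;                      // display panel width in pixels
img.containerHeight = 240;                     // display panel height in pixels
img.containerPath = "foregroundGraphicA";      // Flash path as declared in the class XML file
img.slideLayer = "foreground";                 // design layer hosting the component
img.initLayout();                              // prepare the component to load image/.swf content
```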
Video component
The video component is used within the core design files to load and display video in .flv format. In one embodiment, the video component is a Macromedia Flash MX® component based on AS 1.0. The template designer integrates and positions this component in the design. Once finished, the component can load and display any .flv content that the user assigns to it. The video component is also easy to integrate into any graphic design or animation scheme and does not restrict the user from using Flash™ animation or other visual effects. The video component is used only in playback mode. To use the video component, from the main timeline within a core template file (for example, in a five-slide class, foreground.swf), the video component can be found inside a movie clip called imageContainer1.videoContent. The video component includes the following methods: initLayout(), used to initialize the component and prepare it to start playing a video stream; and initVideoPane(videoURL, bufferTime, videoVolume), used to start the video stream. The properties of the video component include: containerWidth: sets the width of the video panel. containerHeight: sets the height of the video panel. controllerXPos: sets the x position of the playback controller; a value of -1 aligns the left edge of the controller with the left edge of the video panel. controllerYPos: sets the y position of the playback controller; a value of -1 aligns the top edge of the controller with the bottom edge of the video panel. controllerWidth: sets the width of the playback controller; a value of -1 causes the controller to adopt the width of the video panel. callback: a function that is activated or called when the video buffer is full. autoSizePane: forces the video panel and playback controller to resize, align, and position themselves. controlBarHeight: sets the height of the video controller.
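Purely as an assumed usage sketch of the methods and properties just listed (the stream URL, instance path, and parameter units are placeholders):

```actionscript
// Hypothetical use of the video container component (AS 1.0).
var vid = imageContainer1.videoContent;        // instance path named in the text above
vid.containerWidth = 400;
vid.containerHeight = 300;
vid.controllerXPos = -1;                       // align controller with the panel's left edge
vid.controllerYPos = -1;                       // place controller along the panel's bottom edge
vid.controllerWidth = -1;                      // controller adopts the video panel width
vid.callback = function () {
    trace("video buffer full; playback can begin");
};
vid.initLayout();                              // prepare the component for playback
vid.initVideoPane("media/intro.flv", 5, 80);   // videoURL, bufferTime (s), videoVolume (assumed units)
```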
Text component
The text component is used within the core design files to load and display HTML-formatted text. In one embodiment, the text component is a Macromedia Flash MX® component based on AS 1.0. The user integrates and positions this component in the design, and then names the component according to the class XML file. Once finished, the component can load and display any HTML text content that the user assigns to it. The text component is used only in edit mode; during playback, the specific text content is manually assigned by the user to a Flash™ text field. The text component can be found, from the main timeline within a core template file (for example, in a five-slide class, foreground.swf), in the frame labeled "staticView", within the movie clips foregroundTextA and foregroundTextB. The initLayout() function is used to initialize the component and prepare it to start displaying HTML text. The text component properties include: containerWidth: sets the width of the text panel. containerHeight: sets the height of the text panel. containerPath: the Flash™ path of the component as defined in the class XML file. headline: a Boolean property that sets the display state of the component; text components configured in the headline state can use a custom movie clip to display the text content, which allows custom fonts and text styles to be used while disabling modification of the format by the user. staticHeadline: the name of the linked file in the library to be used to display the text content. slideLayer: the layer in which the component is located; valid values can include "foreground", "background", and "communication".
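Similarly, a hedged sketch of configuring a text container in ActionScript 1.0 (instance and linkage names are invented):

```actionscript
// Hypothetical configuration of a text container component (AS 1.0).
var txt = foregroundTextA;                     // instance named in the text above
txt.containerWidth = 280;
txt.containerHeight = 120;
txt.containerPath = "foregroundTextA";         // Flash path as declared in the class XML file
txt.headline = true;                           // render through a custom headline movie clip
txt.staticHeadline = "headlineClip";           // assumed library linkage identifier
txt.slideLayer = "foreground";
txt.initLayout();                              // prepare the component to display HTML text
```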
Custom components
Custom components are designed and implemented by the user or the template designer, and can be used just like the standard components for integration into the communication project. Custom components pass a configuration object to the slide viewer, which allows the user to configure any property of the component. This object is a basic name/value structure that represents a hash of property/value pairs. This hash is then dynamically integrated into the system and assigned to the slide where the component is located. This scheme allows users and developers to create and incorporate powerful components that can handle tasks such as XML input (for example, data from Google AdWords or the Overture system, or other proprietary data streams from proprietary databases, conferencing/instant messaging, or Web services), along with many other applications. Custom components may include separately recorded narration (that is, digital voice files), personal audio files, special images and/or graphics such as logos, and videos that a user provides to the system for storage in the media repository.

View modes
The user constructs the design in the core design files. The project viewer can open and generate these files in a layered way so that the content is "stacked" according to the layer in which it is located. For example, the content in the background layer appears under the content in the foreground layer. In one embodiment, there are two project viewers. In a preferred exemplary embodiment, the two project viewers are essentially identical: one is provided for live playback of the communication project, while the other is embedded in the communication project builder and is needed to generate the core files for the end user so that the user can edit any desired content in the containers. In an alternative embodiment, another project viewer is provided to generate, or "serialize," the complete communication files in various third-party formats, such as .swf, .pdf, .xml, .html, .txt, or any other format. Accordingly, all core files can support two states: a playback state and an editing state. These states are designated within each core file by a frame label. When loaded into the builder, the project viewer immediately sends the playhead within the core files to the frame labeled "staticView". Otherwise, the playhead is positioned at the first frame and stops until the communication project is ready to play.
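As an assumed sketch of the two-state mechanism just described (the frame label "staticView" comes from this document; everything else is illustrative):

```actionscript
// Hypothetical playback/edit state switch inside a core file (AS 1.0).
function enterState(editMode) {
    if (editMode) {
        gotoAndStop("staticView");   // edit view in the builder: static frame, animations off
    } else {
        gotoAndStop(1);              // live view: hold on frame 1 until playback starts
    }
}

// The project builder would request edit mode; the live project viewer would not.
enterState(true);
```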
Live view The "Live View" describes the complete reproduction of a communication project. During the live view, all the functionality, design and animation are active and visible to the end user. It is the finished product as configured by the user.
Edit View The "Edit View" is experienced within the project builder and, in some cases, it is in the "Live View" where a user contributes the editing or comments to a communication. Although functionality and design remain intact, animations are disabled. This "sample" view offers users a context within the design so that the content can be configured and assigned to containers.
Groups "Groups" is an application that allows groups of users to create, edit, share and distribute communications created by the system according to a set of business rules. For example, a group of 25 users can use the system to communicate a uniform message, and still retain the autonomous controls to customize each piece of communications according to the rules established by the Administrator. Each group contains a defined set of functions and capabilities. These capabilities are defined by a system administrator, and then used by users in that Group.
In one embodiment, a user can gain access to a group of other users, known as a "Group account." In the group account, an administrator has the right to share communications with other users, effectively creating communications for them and granting them limited rights to edit the communication. In another embodiment, a user can acquire access to a business group of users, which can have N users and M administrators. This functionality gives the company the ability to use the same communication uniformly, but adapt it to a specific market, segment, opportunity, or the like.
File sharing "File sharing" is an application that allows administrators and users to configure a system, where administrative users can create and share communications with an N number of users, in up to N accounts or physical locations. There are several types of file sharing, each of them has a set of advantages. In one example, all three types of sharing include: Live sharing, Linked sharing, and Smart sharing. Live sharing maintains a link between the communications in use, so that an administrator can make changes to a communication and the changes to the communication are updated in real time. That is, there is no delay in time between the moment of editing and when it is published live in the communication. Linked sharing allows an administrator to make changes to a "main" communication and up to N "derived" communications, so that changes to the main communication are propagated to each derived communication in real time. Accordingly, there is no delay in time between the time of editing and when it is published live to each of the relevant communications. Smart sharing allows an administrator to make changes to several "main" communications and up to N "derived" communications, so that changes to the main communication are propagated to each derived communication in real time. Therefore, there is no delay in time between the moment of editing and when it is published live to each of the derived communications. However, in intelligent sharing, business rules are applied so that a hierarchy can be created to manage the flow of the main communication and the derived communications. The sharing business rules also apply to allow the elimination of derived communications from the system, without affecting other communications derived in the linked chain. This allows the consistent and rapid propagation of information across a wide range of users, and is particularly useful for a corporate sales force or regional advertisers in maintaining a consistent communications message.
EXAMPLE
The following describes an example of the functionality of the system and method described herein, as used by a user. A membership account includes online access to all functions for editing, distributing, and tracking communications. Various selectable options are offered based on the individual needs of a user. The number of communications in an account is based on the level of membership acquired. A user can edit the communications as often as they wish, and as many copies as they want can be stored on a storage device, such as a hard disk drive on a computer. To access the account (and associated communications), a user must first log in from a home page, for example, www.impactengine.com. The user then specifies the username and password that were used to register. To change account information, a user can select the "My Account" link from a main navigation bar, shown in the screenshots located on the left side of a page, and then select a "Make Changes" control to make a change.
EDITING PROCESS
There is no limit on how often a communication can be updated; accordingly, recipients and viewers can always see the most up-to-date information. To edit a communication, a user first enters "Edit Mode" by selecting the "Edit" button next to the name of the selected communication. The "Edit" button is located in a Communication Control Panel on the "My Home Page" page, preferably at the top of the page. Once in Edit Mode, a user will see a new navigation menu above and can click on the appropriate tab and make changes to the forms provided. When finished, the user selects the "Finish" button and the communication is updated. The communication is pre-populated with default text; however, all fields can be updated with any chosen information. Graphics can be loaded in Edit Mode by selecting the "Load" button to access and load images. Step-by-step instructions can be displayed for uploading images from the user's hard drive. Each membership includes an amount of disk space, for example, up to one gigabyte, where the images are stored.
DISTRIBUTION
Once a communication is created, a user can use it in various ways, including as a website, as a printed communication, as an e-mail, or as a communication stored on a hard drive, CD-ROM, or other media device. All functions are available from the main navigation within a user account. An e-mail function can be accessed by selecting the "Show" button next to the name of the communication to be sent; the "Show" button is located in the Communication Control Panel on the home page. The user is provided with a form to fill out, and the communication will be sent to the designated e-mail recipients. An e-mail message is sent to each recipient with a graphic "view" link at the bottom. This link launches the communication directly from a designated website; no attachments or downloads are necessary. The body, title, and name of the "From" field of the message can be customized. The e-mail interface allows a user to send a communication to one or more recipients at a time. In one embodiment, the number of recipients per message is limited to a specific number, for example, six recipients. A user can send as many e-mails as they want; the sending of unsolicited e-mail (spam) in conjunction with an account is prohibited. CD-ROM cards that include the communication can also be created. CD-ROM cards play in standard tray CD-ROM drives on Windows and Macintosh computers, and the communication launches automatically for maximum impact. A communication can also be used as a user's home page. To use this functionality, a user can click on "My Websites" from inside the account to generate a website based on the chosen communications. The Domain Name Service (DNS) is then automatically configured with the system servers, and the Web site will be available by typing the corresponding URL (for example, www.mywebsite.com). This function serves as the hub for using any communication created by the communication building engine system as a dynamically created site for use with third-party Web sites such as Google, Overture, eBay, Amazon, and the like. A communication can also be added to an existing Web page by clicking "Show" from inside the account to generate HTML code or ActionScript ("object/embed") to embed the file directly in the page. This HTML can be placed anywhere in a Web page.

In accordance with the above description, and as shown in Figures 2-5, a communication method includes a number of steps to create, store, and distribute multimedia communications. As shown in Figure 2, a method (200) for creating templates includes the creation of one or more communication templates at (202). The templates are usually created by designers and represent general structures and layouts of multimedia communications suitable for distribution to several different recipients through several different transmission mechanisms. In a preferred embodiment, the templates are created in Flash™ ActionScript using a proprietary application programming interface (API) and loaded into the media repository. At (204), media resources are provided for general use by any user. Media resources include media components such as text, font types, audio files, video files, images or graphics, Flash™ animations, etc. At (206), media resources for private use are received by the communication building engine and system. These private media resources can include proprietary logos, images, sound files, and the like.
At (208), the project template(s), general-use media resources, and private-use media resources are loaded and stored in the media repository for future access by the user. Only the user who provides them (or an agent authorized by that user) can access the private media resources. Figure 3 illustrates a method (300) for customizing templates and using media resources. At (302), the communication building engine interviews the user to determine the templates and/or media resources that will be appropriate for that user. For example, a real estate agent user may indicate a need for standard house images, as opposed to images only of people in social situations. Similarly, the type, profession, or characteristics of the user can be used to tailor the types of templates and/or media resources available for access by that user, so that the user is not overloaded with options but is intelligently provided with the most pertinent and efficient communication creation system possible. At (304), the communication building engine provides the templates and/or media resources to the user, as determined by the interview or user data entry, to be customized into a communication by the user. At (306), the customized communication project is received from the user and compiled into a format suitable for transmission. At (308), the compiled communications are stored as projects in the media repository, for access through the distribution and file sharing programs. Figure 4 illustrates a method (400) for distributing and tracking communications. At (402), the communication building engine receives a selection of completed communication projects that have been stored in the media repository. At (404), the communication building engine receives from the user a selection of distribution mechanisms by which the communications are to be transmitted. The distribution mechanisms include, but are not limited to, Web sites, e-mail systems, CD-ROMs, DVDs, or an offline copy (for example, a hard copy). At (406), the communications are distributed to the selected distribution mechanisms for transmission or sending to the selected recipients. Figure 5 illustrates the sharing of media resources by users with other users who may be affiliated by employer, by contract, or by other agreement. At (502), the communication building engine receives a selection of communication projects that can be shared among one or more users. At (504), selections of the user or other users are identified and received by the communication building engine. At (506), the file sharing program of the communication building engine processes the selections, and at (508) the processed selections and the associated communication projects and/or media resources are made available to the user or to the other users. The embodiments of the invention and all the functional operations described in this specification can be implemented in digital electronic circuitry, or in computer software, firmware, or hardware, including the structures disclosed in this specification and their structural equivalents, or in combinations thereof.
The embodiments of the invention can be implemented as one or more computer programs, that is, one or more modules of computer program instructions encoded on a computer-readable medium, for example, a machine-readable storage device, a machine-readable storage medium, or a machine-readable propagated signal, for execution by, or to control the operation of, data processing apparatus. The term "data processing apparatus" encompasses all apparatus, devices, and machines for processing data, including by way of example a programmable processor, a computer, or multiple processors or computers. The apparatus may include, in addition to hardware, code that creates an execution environment for the computer program in question, for example, code that constitutes processor firmware, a protocol stack, a database management system, an operating system, or a combination of them. A propagated signal is an artificially generated signal, for example, a machine-generated electrical, optical, or electromagnetic signal, that is generated to encode information for transmission to a suitable receiver apparatus. A computer program (also referred to as a program, software, application, application software, script, or code) can be written in any form of programming language, including compiled or interpreted languages, and can be deployed in any form, including as a stand-alone program or as a module, component, subroutine, or other unit suitable for use in a computing environment. A computer program does not necessarily correspond to a file in a file system. A program can be stored in a part of a file that contains other programs or data (for example, one or more scripts stored in a markup language document), in a single file dedicated to the program in question, or in several coordinated files (for example, files that store one or more modules, subprograms, or parts of code). A computer program can be deployed to run on one computer or on several computers located at one site or distributed across multiple sites and interconnected by a communication network. The processes and logic flows described in this specification can be performed by one or more programmable processors executing one or more computer programs to perform functions by operating on input data and generating output. The processes and logic flows can also be performed by, and the apparatus can also be implemented as, special-purpose logic circuitry, for example, a field programmable gate array (FPGA) or an application-specific integrated circuit (ASIC). Processors suitable for the execution of a computer program include, by way of example, both general-purpose and special-purpose processors, and any one or more processors of any kind of digital computer. Generally, a processor will receive instructions and data from a read-only memory, a random access memory, or both. The essential elements of a computer are a processor for executing instructions and one or more memory devices for storing instructions and data. Generally, a computer also includes, or is operatively connected to receive data from or transfer data to (or both), one or more mass storage devices for storing data, for example, magnetic, magneto-optical, or optical disks. In addition, a computer can be embedded in another device, for example, a mobile telephone, a personal digital assistant (PDA), a portable audio player, or a Global Positioning System (GPS) receiver, to name just a few.
Information carriers suitable for embodying computer program instructions and data include all forms of non-volatile memory, including by way of example semiconductor memory devices, for example, EPROM, EEPROM, and flash memory devices; magnetic disks, for example, internal hard disks or removable disks; magneto-optical disks; and CD-ROM and DVD-ROM disks. The processor and the memory can be supplemented by, or incorporated in, special-purpose logic circuitry. To provide for interaction with a user, the embodiments of the invention can be implemented on a computer having a display device, for example, a cathode ray tube (CRT) or liquid crystal display (LCD) monitor, for displaying information to the user, and a keyboard and a pointing device, for example, a mouse or trackball, by which the user can provide input to the computer.
Other types of devices can also be used to provide for interaction with the user; for example, feedback provided to the user can be any form of sensory feedback, for example, visual feedback, auditory feedback, or tactile feedback; and input from the user can be received in any form, including acoustic, speech, or tactile input. The embodiments of the invention can be implemented in a computing system that includes a back-end component, for example, a data server, or that includes a middleware component, for example, an application server, or that includes a front-end component, for example, a client computer having a graphical user interface or a Web browser through which a user can interact with an implementation of the invention, or any combination of such back-end, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication, for example, a communication network. Examples of communication networks include a local area network ("LAN") and a wide area network ("WAN"), for example, the Internet.
The computing system can include clients and servers. A client and a server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other. Certain features that, for clarity, are described in this specification in the context of separate embodiments can also be provided in combination in a single embodiment. Conversely, various features that, for brevity, are described in the context of a single embodiment can also be provided in multiple embodiments separately or in any suitable subcombination. Moreover, although features may be described above as acting in certain combinations and even initially claimed as such, one or more features of a claimed combination can, in some cases, be excised from the combination, and the claimed combination may be directed to a subcombination or a variation of a subcombination. Specific embodiments of the invention have been described. Other embodiments are within the scope of the following claims. For example, the steps recited in the claims can be performed in a different order and still achieve desirable results. In addition, the embodiments of the invention are not limited to relational database architectures; for example, the invention can be implemented to provide indexing and archiving methods and systems for databases built on models other than the relational model, for example, object-oriented databases, and for databases having records with complex attribute structures, for example, object-oriented programming objects or markup language documents. The processes described can be implemented by applications that specifically perform archiving and retrieval functions, or embedded within other applications.

Claims (20)

1. A multimedia communication system comprising: a media repository that stores communication project templates and media resources of various content types; and a project builder that provides a graphical user interface for a client computer, the graphical user interface including controls to receive user input to assemble a communication based on one of the communication project templates and having selected ones of the media resources.
2. A communication system according to claim 1, further comprising a compiler to integrate the selected media resources into the one of the communication project templates to generate the communication.
3. A multimedia communication system according to claim 2, further comprising a distribution program that formats the communication according to a selected one of a plurality of electronic distribution formats.
4. A multimedia communication system according to claim 1, further comprising a file sharing program to receive user instructions for sharing the communication with other users.
5. A multimedia communication system according to claim 1, wherein the project builder further comprises a plurality of media resource palettes, each media resource palette comprising a content type, which is displayed in the graphical user interface according to the user's preferences.
6. A multimedia communication system according to claim 1, wherein the media resources are classified according to content type, and wherein the content types include images, audio, video, and text.
7. A multimedia communication system according to claim 3, wherein the electronic distribution formats include an electronic mail, a Web page, an electronic booklet, and an animated file for viewing on a computer.
8. A multimedia communication system according to claim 1, wherein the project builder further includes an interactive interview for display in the graphical user interface, the interview being configured to receive the user's preferences for the communication project templates and media resources of the communication project.
9. A multimedia communication system according to claim 1, further comprising an application programming interface that provides a template designer with design options to design each of the communication project templates.
10. A multimedia communication system according to claim 1, further comprising a project viewer that generates an assembled communication for display in the graphical user interface.
11. A multimedia communication system comprising: a media repository that stores communication project templates and media resources of various content types; a project builder that provides a graphical user interface for a client computer, the graphical user interface comprising controls to select at least one of the media resources for integration into a communication project template to assemble a communication; and a project viewer that generates the communication in the graphical user interface.
12. A communication system according to claim 11, further comprising a compiler to integrate the selected media resources into the communication project template to generate the communication.
13. A multimedia communication system according to claim 12, further comprising a distribution program that formats the communication according to a selected one of the electronic distribution formats.
14. A multimedia communication system according to claim 11, further comprising a file sharing program to receive user instructions for sharing the communication with other users.
15. A multimedia communication system according to claim 11, wherein the project builder further comprises a plurality of media resource palettes, each media resource palette comprising a content type, which is displayed in the graphical user interface according to the user's preferences.
16. A multimedia communication system according to claim 11, wherein the media resources are classified according to content type, and wherein the content types include images, audio, video, and text.
17. A multimedia communication system according to claim 13, wherein the electronic distribution formats include an electronic mail, a Web page, an electronic booklet, and an animated file for viewing on a computer.
18. A multimedia communication method comprising: providing, from a computer acting as a server, an interactive interview to a graphical user interface of a client computer; receiving, through the interactive interview, information about a user indicating the user's preferences for a communication project template and an associated set of selectable media resources; and providing, in the graphical user interface, a palette of at least one communication project template and a palette of selectable media resources of at least one content type, for selection by the user.
19. A multimedia communication method according to claim 18, further comprising: receiving user selections of at least one communication project template and at least one media resource; and integrating the at least one media resource with the communication project template to generate a communication.
20. A multimedia communication method according to claim 19, further comprising distributing the communication in selected ones of a plurality of electronic distribution formats.
MX/A/2007/012813A 2005-04-13 2007-10-15 Multimedia communication system and method MX2007012813A (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US60/671,170 2005-04-13

Publications (1)

Publication Number Publication Date
MX2007012813A (en) 2008-10-03

Similar Documents

Publication Publication Date Title
US11669863B1 (en) Multimedia communication system and method
US8527604B2 (en) Managed rich media system and method
US9807162B2 (en) Method and system for communication between a server and a client device
US7478163B2 (en) Method and apparatus for presenting multimedia content and for facilitating third party representation of an object
US20090099919A1 (en) Method, system and computer program product for formatting and delivery of playlist presentation content
US7711722B1 (en) Webcast metadata extraction system and method
TW498258B (en) Online focused content generation, delivery, and tracking
Bellinaso et al. ASP.NET Website Programming: Problem-Design-Solution, Visual Basic
MX2007012813A (en) Multimedia communication system and method
Blank et al. AdvancED Flex Application Development
Krishna Making DAM work for you—An insight into successful technology deployments