TITLE: SYSTEM AND METHOD FOR GENERATING INTERACTIVE
ANIMATED INFORMATION AND ADVERTISEMENTS
BACKGROUND OF THE INVENTION
1. Field of the Invention.
The present invention relates to the modular creation of digital media.
2. Description of the Related Art.
Digital media is pervasive; anyone who surfs the web, turns on the television, or plays with a multimedia CD-ROM has experienced digital media. In its most effective form, digital media is entertaining, enlightening, and educational. Currently, there exist many different competing standards for creating and storing digital media that fall into one of several major categories:
The first category is the pixel-based image, where each pixel comprises a dot on a computer screen. Each image is stored as hundreds of thousands of such dots, and this type of format, also known as a bitmap or pixmap (for pixel map), is generally used for scanned photographs or digitally generated pictures. Typical examples of this format are JPEG, GIF, and PICT.
A second category of digital media is the vector-based image, where the digital media is stored as a series of lines and curves, also known as splines or Beziers. These lines and curves can form or define regions that may be filled with colors and gradients. A vector-based image is generally better than a pixel-based image for representing a drawing and is thus a popular format for clip art. Vector-based images are also more compact than pixel-based images, as vector-based images are based on mathematical descriptions. Typical examples of vector-based format are
PostScript and Macromedia Flash, while Adobe Illustrator and CorelDRAW are popular vector drawing programs.
A third category of digital media is the digital video format, where multiple pixel-based image frames are put together to represent video. Digital video, which is really a variation on the pixel-based image format, is often used for CD-ROM titles, including games and multimedia. In addition, streaming video formats such as RealVideo, QuickTime, and AVI essentially belong to the digital video format category. Furthermore, most computer animations are stored in digital video format.
There are many ways to create digital media, but they generally follow a similar pattern: (a) there is a canvas, source, or scene file (a binary image or vector file in which digital artwork is created or imported); (b) this canvas or scene file is modified and then imported for printing or for a Web page. Should animation be involved, a keyframer or timeline would be used to allow for modifying the scene or canvas to account for changes in the image over time. Animation software then interpolates over these changes in the image to produce the final animated result.
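The interpolation step described above can be sketched in a few lines. The class name, the choice of linear interpolation, and the example values are illustrative assumptions, not part of any particular authoring tool:

```java
// Illustrative sketch of keyframe interpolation, as performed by a
// keyframer/timeline: given a property's values at two keyframes, the
// animation software fills in the in-between frames. Linear interpolation
// is assumed here purely for simplicity.
public class KeyframeInterp {
    // Returns the interpolated value at `frame`, where `f0`/`f1` are the
    // keyframe numbers and `v0`/`v1` the property values at those frames.
    public static double lerp(int frame, int f0, double v0, int f1, double v1) {
        double t = (double) (frame - f0) / (f1 - f0); // 0.0 at f0, 1.0 at f1
        return v0 + t * (v1 - v0);
    }

    public static void main(String[] args) {
        // An object's x-position moves from 0 to 100 between frames 0 and 10.
        for (int frame = 0; frame <= 10; frame += 5) {
            System.out.println("frame " + frame + ": x = "
                + lerp(frame, 0, 0.0, 10, 100.0));
        }
    }
}
```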
Even so, digital media tends to be difficult to create. Authoring tools often use a "bottom-up" approach, where the scene file must be created from scratch. Once created, a scene or source file is often difficult to modify and, in addition, takes up large amounts of hard drive space. Media authoring tools are usually complex and require considerable investments of time and money. Moreover, authoring tools typically are disconnected from each other and may not communicate among themselves. Furthermore, clip art, which could save time in the creation process, is usually difficult to customize. Thus, creating digital media is often an expensive and time-consuming task.
SUMMARY OF THE INVENTION
Briefly described, the invention comprises a system and method for creating, storing and retrieving digital media for the purpose of generating animations. The invention comprises a digital media data object system as well as the data objects themselves, called WordChips.
Each WordChip contains fields for basic Data and high-level MetaData, as well as pipes for communicating with other WordChips (Frequency Pipes), user interface information (PMAP), identifying information (Standard Info), object parameters (States and Verbs), and a script (ActionScript) for instructing the WordChip on performing basic technique. The Data, or basic digital media data, can be formed from a variety of sources such as binary multimedia files, HTML/Javascript code, executable code or plug-ins, and plain text files. An editor (ALICE) is provided for putting these elements together.
Once formed, each WordChip is stored in both a public dictionary and a private database. The user can then create Metaphors, which are singular WordChips that are defined by or derived from other WordChips. Sentences may be formed from basic WordChips or Metaphors, and may be used to specify a sequence of images as well as background or other effects. A Story, which is a combination of WordChips and Sentences with background animation elements, may be created and saved for future customization. A story author may use a commercially available animation tool to rapidly create the Story's background animation elements, and then uses ALICE to specify which types or genres of WordChips fit into the Story. A subsequent user of the Story can then retrieve the Story and fill in particular WordChips to customize the Story for his or her particular use. A Story that has been filled in is then sent to the Media Engine for a final preview and, if satisfactory, production of the final animation in a number of different formats.
BRIEF DESCRIPTION OF THE DRAWINGS
Fig. 1 illustrates an overview of WordChip creation and storage, as well as how the WordChip is used in conjunction with the Media Engine to produce a finished animation product.
Fig. 2 illustrates all the elements of a typical WordChip data structure, including data, metadata and elements used for interacting with other WordChips and with the system.
Fig. 3 illustrates the WordChip creation process.
Fig. 4 illustrates the story editing process.
Fig. 5 illustrates the WordChip in relation to the Metaphor, the Sentence and the Story, all of which build on the basic WordChip.
Fig. 6 illustrates the Metaphor, a WordChip that is defined based on other WordChips.
Fig. 7 illustrates the Sentence, a structure for combining WordChips in a sequence or with background effects.
Fig. 8 illustrates ALICE, the Animated Language Interactive Commercial Editor.
Fig. 9 illustrates a Web-based wizard for creating a raw animation file.
Fig. 10 is a flow diagram that provides an overview of the Story creation and WordChip creation processes.
Fig. 11 illustrates a sample story file with slots for future insertion of WordChips.
Fig. 12 conceptually illustrates a Story with Open, Semi-Open and Closed Slots.
Fig. 13 illustrates the relationship between the raw animation file and an unfilled Story.
DETAILED DESCRIPTION OF THE INVENTION
During the course of this description, like numbers will be used to identify like elements according to different diagrams illustrating the invention.
The present invention 10 proposes to solve the problems of digital media creation by providing the user with a process and system by which complex, arbitrary digital images, animations, and Web pages can be described quickly and put together to form complex animations. Referring to Fig. 1, the key part of the invention is the Animation Language ("ANIMAL"), which provides a way to describe animations via digital media data objects known as WordChips 12.
WordChips 12 are data objects for naming, sorting, and referencing different types of media elements. The basic WordChip 12 is a digital media data object that contains not only raw digital media content but also additional information for interacting with other WordChips 12 to produce the desired media effects. Referring to Fig. 2, a prototypical WordChip 12 contains: (a) Frequency Pipes 34, communication pipes that enable WordChips to communicate information to each other; (b) an ActionScript 36, as a way of encapsulating technique into a WordChip via a script; (c) UI tags 38, generic user interface information for providing information on how to display the WordChip 12; (d) States & Verbs 40, for storing parameters, alternately called "conditions and actions"; (e) MetaData 44, high-level information about a WordChip; (f) Standard Info 46, which constitutes basic information about the name of the WordChip, keywords, author name and contact info, as well as a preview picture of the digital media content; and (g) Data 42, the raw digital media content itself.
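The fields enumerated above can be pictured as a simple data structure. The field names and Java types below are assumptions for illustration only; the patent does not prescribe a storage layout:

```java
import java.util.List;
import java.util.Map;

// Hypothetical sketch of the WordChip 12 fields listed in Fig. 2.
public class WordChip {
    public List<String> frequencyPipes;        // (a) Frequency Pipes 34
    public String actionScript;                // (b) ActionScript 36
    public String uiTags;                      // (c) UI tags 38
    public Map<String, String> statesAndVerbs; // (d) States & Verbs 40
    public Map<String, String> metaData;       // (e) MetaData 44
    public String name;                        // (f) Standard Info 46: name,
    public List<String> keywords;              //     keywords, author, preview
    public byte[] data;                        // (g) Data 42: raw media content

    public static void main(String[] args) {
        WordChip chip = new WordChip();
        chip.name = "target";
        chip.keywords = List.of("graphic", "target");
        System.out.println(chip.name + " " + chip.keywords);
    }
}
```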
Creation of the basic WordChip starts with creation of the raw digital media content. The content may be graphics, multimedia or an animation created from software such as Macromedia Flash, but a WordChip can be used for other types of digital media content such as sounds, music, 3D models, and vectors (e.g., clip art). Digital media could even include HTML/Javascript code 18 that will produce the desired effects, and may also include text 22. Code-based effects or plug-ins 20 can also supply the digital media content, in which case the resulting WordChip would be termed a CodeChip as opposed to a data-only WordChip. These code-based effects may be generated from a compiler or an object development tool such as Microsoft Visual Studio®. The Data 42 found in a WordChip 12 could include any combination of these types and can include multiple data elements of each type as well.
Once the digital media content has been created, it is "minted" or compiled into a WordChip 12 using a Java application named the Animation Language Interactive Commercial Editor (ALICE) 14. The digital media content, or Data 42, is combined with MetaData
information, which identifies what kind of data it is as well as key information pertaining to the digital media content itself. For example, if the Data is a bitmap image of a target, the MetaData could contain the location of the center on that bitmap, so that the image of an arrow could properly hit the target. In addition, basic information such as the name, keywords, type and author of the WordChip may be entered. Of these, the keywords are the most important because they provide information to the WordChip system regarding the WordChip's compatibility with other WordChips. Referring to Fig. 2, the resulting WordChip may be previewed and further edited, and is displayed as a standard 35mm photographic slide preview 15. Once minted, WordChips 12 may be stored in a WordChip Dictionary 26 as well as a user's own database 24 for faster retrieval or for proprietary graphics. In either storage area, the user is able to browse and select WordChips, all of which have a preview image and a title, by using ALICE 14 or a Web page interface 28.
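The arrow-and-target example above can be made concrete with a few lines of arithmetic. The method and field names are hypothetical; the point is only that the MetaData's stored center offset lets another WordChip aim correctly wherever the bitmap is placed:

```java
// Sketch of how MetaData might guide composition: the target bitmap's
// MetaData records the bullseye's offset within the bitmap, so an arrow
// animation can be aimed at the true center regardless of where the
// bitmap lands on the canvas. All names here are illustrative.
public class MetaDataDemo {
    // Given the top-left corner where the target bitmap is placed on the
    // canvas and the center offset stored in its MetaData, returns the
    // canvas coordinates {x, y} the arrow tip should hit.
    static int[] aimPoint(int placedX, int placedY,
                          int centerOffsetX, int centerOffsetY) {
        return new int[] { placedX + centerOffsetX, placedY + centerOffsetY };
    }

    public static void main(String[] args) {
        // A 64x64 target bitmap placed at (100, 40); its MetaData says the
        // bullseye sits at offset (32, 32) within the bitmap.
        int[] hit = aimPoint(100, 40, 32, 32);
        System.out.println(hit[0] + "," + hit[1]); // 132,72
    }
}
```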
In addition to minting WordChips 12 from basic digital media, WordChips 12 may be created and defined in terms of other WordChips 12; these defined WordChips are termed Metaphors 56 (Figs. 5 and 6). The user defines each Metaphor 56 in terms of slots, which are essentially parameters that match other WordChips 12. WordChips can then be inserted into the specified slots to form the Metaphor. The user specifies the level of generalization for any given slot. For example, as shown in Fig. 6, in order to create a Metaphor 56 that describes a birthday cake, one could specify either "candles" or "an incendiary object" as a slot. Finally, each Metaphor 56 is itself a singular WordChip 12, and so Metaphors 56 are recursive: one may create Metaphors 56 based on other Metaphors 56.
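The slot-and-generalization idea can be sketched as follows. The keyword-based matching rule and all class and method names are assumptions made for illustration; the recursion falls out of the fact that a Metaphor is just another chip:

```java
import java.util.*;

// Sketch of the Metaphor 56: a Metaphor is itself a WordChip whose slots
// name the kind of WordChip they accept. Names are illustrative.
public class MetaphorDemo {
    static class Chip {
        final String name;
        final Set<String> keywords;
        final Map<String, Chip> slots = new LinkedHashMap<>(); // empty for basic chips
        Chip(String name, String... keywords) {
            this.name = name;
            this.keywords = new HashSet<>(Arrays.asList(keywords));
        }
    }

    // A slot accepts a chip when the chip carries the required keyword; the
    // keyword's generality ("candles" vs. "incendiary object") is the
    // author's chosen level of generalization for that slot.
    static boolean fill(Chip metaphor, String slotName,
                        String requiredKeyword, Chip candidate) {
        if (!candidate.keywords.contains(requiredKeyword)) return false;
        metaphor.slots.put(slotName, candidate);
        return true;
    }

    public static void main(String[] args) {
        Chip cake = new Chip("birthday cake", "cake", "food");
        Chip sparkler = new Chip("sparkler", "incendiary object");
        // Generalized slot: anything "incendiary" may stand in for candles.
        System.out.println(fill(cake, "candles", "incendiary object", sparkler));
        // Because a Metaphor is itself a Chip, it can in turn fill a slot in
        // another Metaphor, which is what makes Metaphors recursive.
        Chip party = new Chip("party scene", "scene");
        System.out.println(fill(party, "centerpiece", "food", cake));
    }
}
```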
After singular WordChips 12 have been created, either from basic digital media data or as Metaphors 56, they may be put together to form a combination of WordChips known as a Sentence 58. Typically, the Sentence 58 describes animation, motion, or interactivity of some sort, and the user can specify instructions 60 for the interaction between the different WordChips, which may describe images, effects, or backgrounds. These instructions 60 may include conditional branches, such as if-then constructs or an event loop, and allow for flexibility in the final presentation. For example, an explosion effect might need to wait until a mouse-click or rollover event. Thus, as shown in Fig. 7, a somewhat abstract Sentence 58 of WordChips 12 may be used to produce an animated sequence 61 of images and effects without requiring the user to specify particular images or frames.
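The conditional-branch instructions described above can be sketched as a list of steps, each optionally gated by an event condition. The event model and all names are a hypothetical simplification, not the disclosed instruction format:

```java
import java.util.*;
import java.util.function.BooleanSupplier;

// Sketch of Sentence instructions 60 with a conditional branch: an effect
// chip is only emitted once its triggering event (e.g. a mouse click) has
// fired. The event model here is an illustrative simplification.
public class SentenceDemo {
    static class Step {
        final String chipName;
        final BooleanSupplier condition; // null means "always run"
        Step(String chipName, BooleanSupplier condition) {
            this.chipName = chipName;
            this.condition = condition;
        }
    }

    // Produces the sequence of chips to render, skipping steps whose
    // condition is not (yet) satisfied: an if-then construct per step.
    static List<String> run(List<Step> sentence) {
        List<String> sequence = new ArrayList<>();
        for (Step s : sentence) {
            if (s.condition == null || s.condition.getAsBoolean()) {
                sequence.add(s.chipName);
            }
        }
        return sequence;
    }

    public static void main(String[] args) {
        List<Step> sentence = Arrays.asList(
            new Step("rocket", null),
            new Step("explosion", () -> false)); // awaits a click that has not fired
        System.out.println(run(sentence)); // only the rocket appears
    }
}
```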
All of the basic data elements such as Sentences 58, Metaphors 56, and basic WordChips 12, however, find their major use in creating Stories 62, which are combinations of singular or multiple WordChips 12 with background scene animations. As the name suggests, a Story 62 itself is a full combination of digital media elements that is used to create and represent the final complete animation. However, the Story is not a single animation but rather a template with animation variables or Slots 70, parameters into which different WordChips 12 may be inserted. Consequently, creating an animated presentation involves a creation phase and a use phase. As shown in Fig. 10, a Story 62 is first created by an author who uses an animation tool and ALICE 14 to transform a basic idea into a Story 62 with Slots 70 capable of receiving WordChips 12. A subsequent end user of the Story 62 then inserts his or her own WordChips before sending the Story 62 to a Media Engine 30 for rendering of a final animated media presentation.
Authoring, or creating, a Story 62 requires several different steps. Referring to Fig. 10, a raw animation file 66 is created or prototyped with the use of either web-based software Wizards 64 (see Fig. 9) or a commercially available animation tool 80, such as the Macromedia Flash Authoring Kit. The author uses either tool to design a basic animated scene file 66 in which certain portions or elements are left blank so that different WordChips 12 may be filled in later. Referring to Fig. 13, these WordChip blanks 68 are marked off from other elements 84 in the animation file by drawing or placing a gray square to act as a placeholder for where a WordChip is supposed to be placed. A square was selected as the shape because one can more easily tell whether a square is undergoing stretching or rotation during animation. In addition, a gray color itself is less likely to conflict with colors in the raw animation file 66. As an example, if one wanted to author a Story 62 in which an object bounces off a floor, one would use an off-the-shelf animation software tool to draw a raw animation file 66 in which a gray square bounces off a floor. For simple animations, a Web-based Wizard 64 may be employed to similarly create the raw animation file based on user prompts, as shown in Fig. 9.
Once the raw animation file 66 has been created, the file is read into ALICE 14, where the raw file 66 is effectively turned into a Story 62 with Slots 70 capable of accepting WordChips 12. The story author uses ALICE 14 to first convert the raw file 66 into a Story file 62 in which the gray blanks 68 are converted into Slots 70. As part of this conversion process, the author may decide that only certain types of WordChips may fit into a given Slot 70. For example, the author could prevent sound WordChips 12 from being inserted into a slot for the rectangular object. In addition, the author may deem certain slots Closed Slots 72 by filling these slots with WordChips and locking them against end user editing (see Fig. 12). On the other hand, the author will likely leave certain slots as Open Slots 74, where the author permits a later Story user to fill in Slots 70 with any WordChip 12 that matches the slot type. Finally, as illustrated in Fig. 12, the author may restrict WordChip selection by making a Slot 70 a Semi-Open Slot 76, so that only WordChips 12 from an author-specified list 77 may be inserted in that Slot 70. ALICE 14 has access to the WordChip Dictionary 26 in case the user wishes to confirm that certain WordChips exist. Each Story contains many of these slots, and the author may choose to open, close, or semi-open any or all of them to his or her preference. In addition, the author may preview the animation in ALICE, which sends the Story to the Media Engine 30 to render a preview. Once again, gray squares serve as placeholders for the various Slots, and the author may make adjustments before either saving the Story in a database or further customizing the Story 62 prior to final animation.
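The three slot kinds described above can be sketched as a small acceptance rule. The enum, class, and method names are illustrative assumptions; only the open/semi-open/closed behavior comes from the description:

```java
import java.util.*;

// Sketch of Open Slots 74, Semi-Open Slots 76, and Closed Slots 72.
public class SlotDemo {
    enum Kind { OPEN, SEMI_OPEN, CLOSED }

    static class Slot {
        final Kind kind;
        final String type;             // slot's WordChip type, e.g. "graphic"
        final Set<String> allowedList; // author-specified list 77 (SEMI_OPEN)
        Slot(Kind kind, String type, String... allowed) {
            this.kind = kind;
            this.type = type;
            this.allowedList = new HashSet<>(Arrays.asList(allowed));
        }

        // CLOSED slots are locked against end user editing; OPEN slots
        // accept any chip matching the slot type; SEMI_OPEN slots also
        // require membership in the author-specified list.
        boolean accepts(String chipName, String chipType) {
            if (kind == Kind.CLOSED) return false;
            if (!type.equals(chipType)) return false;
            return kind == Kind.OPEN || allowedList.contains(chipName);
        }
    }

    public static void main(String[] args) {
        Slot open = new Slot(Kind.OPEN, "graphic");
        Slot semi = new Slot(Kind.SEMI_OPEN, "graphic", "ball", "balloon");
        Slot closed = new Slot(Kind.CLOSED, "graphic");
        System.out.println(open.accepts("ball", "graphic"));   // accepted
        System.out.println(open.accepts("jingle", "sound"));   // wrong type
        System.out.println(semi.accepts("kite", "graphic"));   // not on list 77
        System.out.println(closed.accepts("ball", "graphic")); // locked
    }
}
```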
An end user of a Story 62 subsequently customizes a previously authored Story by using an end user editor 78. The end user editor 78 reads the structure 48 of a Story 62 and, using the Story Reader 50, displays the Open Slots 74 and Semi-Open Slots 76, which are the parameters into which WordChips may be inserted. The end user may then customize the Story by selecting WordChips 12 for insertion into the Story 62. To reduce the complexity of the editing, compatibility matching is done to limit the selection to only WordChips 12 compatible with the Open Slots 74 and Semi-Open Slots 76 found in the Story 62. Whether or not a given WordChip 12 is compatible with a given slot depends on the keywords found in that WordChip 12. As noted, a user may be prevented from inserting a sound WordChip into a graphical slot. Moreover, the user can select WordChips 12 from either the public WordChip Dictionary 26 or the user's own private WordChip database 24. The end user editor 78 is usually implemented as a Web-based Java application to ensure that any Internet-capable user can customize the animation without requiring extensive software installation.
After the Story 62 has been customized or filled in by the user, it is sent to a Media Engine 30 for preview and for final production. The Media Engine 30 is a rendering engine that resides on a high-speed dedicated system for optimized rendering performance, and contains software for interpreting the stories and rendering them in a number of user-specified formats. The rendering process takes place as follows: the Story 62 is read, creating a set of frames 1 to n (a user-specified number of frames) for animation. For each frame, the Engine 30 creates a spatial transformation and a color transformation for each Slot 70, and thus each WordChip 12, for that particular story. In the same way, each component (e.g., raw digital media element) of the WordChip 12 undergoes these transformations during the rendering process. In addition, other animated elements 84 of the Story 62 are transformed within each frame to create, along with the WordChips and open slots, a set of frames capable of being assembled together into an animation.
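The per-frame loop described above can be sketched in outline. The transform math (linear motion for the spatial transformation, a linear fade standing in for the color transformation) and all names are illustrative assumptions, not the Media Engine's actual algorithms:

```java
// Sketch of the Media Engine's rendering loop: for each of the n frames,
// every slot's chip receives a spatial and a color transformation.
public class RenderLoop {
    // Spatial transformation (illustrative): the chip's x-position moves
    // linearly from x0 to x1 over the n frames.
    static double spatialX(int frame, int n, double x0, double x1) {
        return x0 + (x1 - x0) * frame / (double) n;
    }

    // Color transformation (illustrative stand-in): the chip fades from
    // fully opaque at frame 0 to fully transparent at frame n.
    static double opacity(int frame, int n) {
        return 1.0 - frame / (double) n;
    }

    public static void main(String[] args) {
        int n = 4; // the user-specified number of frames
        for (int frame = 0; frame <= n; frame++) {
            // In the real engine this is done for every Slot 70 and for
            // every component of the WordChip 12 in that slot.
            System.out.printf("frame %d: x=%.1f opacity=%.2f%n",
                frame, spatialX(frame, n, 0.0, 100.0), opacity(frame, n));
        }
    }
}
```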
The Media Engine 30 may be implemented locally but users may also choose to use a central media engine remotely located on a network, such as the Internet. In either case, the Engine 30 requires only a few seconds to render most animations (typically commercials) for preview, allowing the user to go back and make changes if necessary. If satisfied with the results, the user then instructs the Media Engine 30 to produce the animation output 32 in any number of formats, depending on whether or not extensive animation is required: GIF, Flash, HTML, or QuickTime. As noted previously, the Media Engine 30 also serves to render animated previews for Stories while they are being authored in ALICE 14 or customized in the end user editor 78.
The WordChip System thus provides an easy method for creating and editing animations from any number of different sources of digital media. The user may create WordChips from the bitmap image and sound files traditionally associated with multimedia files, but may also use digital media in the form of HTML code or a plug in. The user may further use these WordChips to rapidly produce an animated presentation by selecting and combining particular WordChips in Sentences and Stories. The user does not need to specify a complete sequence of defined images or frames but needs only specify more conceptual aspects of the final animated presentation. The WordChip system and the Media Engine take these conceptual specifications and produce the complete animation, providing the user with a modular way to create new multimedia presentations.
While the invention has been described with reference to the preferred embodiment thereof, it will be appreciated by those of ordinary skill in the art that modifications can be made to the structure and elements of the invention without departing from the spirit and scope of the invention as a whole.