WO2000070477A1 - System and method for generating interactive animated information and advertisements - Google Patents

System and method for generating interactive animated information and advertisements

Info

Publication number
WO2000070477A1
Authority
WO
WIPO (PCT)
Prior art keywords
data
digital media
data objects
story
media data
Prior art date
Application number
PCT/US2000/013055
Other languages
French (fr)
Inventor
Adam Lavine
Yu-Jen Dennis Chen
Dwight Rodgers
Original Assignee
Funyellow, Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Funyellow, Inc. filed Critical Funyellow, Inc.
Priority to AU51314/00A (AU5131400A)
Publication of WO2000070477A1

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06Q — INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00 — Commerce
    • G06Q30/02 — Marketing; Price estimation or determination; Fundraising
    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F40/00 — Handling natural language data
    • G06F40/10 — Text processing
    • G06F40/166 — Editing, e.g. inserting or deleting

Abstract

A digital animation system relies on digital media data objects (12) called WordChips (12). Each WordChip (12) contains basic digital media Data (42) that may be either a binary data file (16), HTML/Javascript code (18), executable code (20), or plain text (22), as well as MetaData high level information (44). Each WordChip (12) also contains identifying information (46) as well as elements (34, 38, 40) for interacting with other WordChips. A script (36) controlling WordChip behavior may also be added. The digital system allows the user to create WordChips (12) from basic data as well as to form Metaphors (56) from other WordChips. The WordChips (12) may be combined to form Sentences (58) that include instructions (60) for specifying interaction between WordChips. Finally, a Story (62) may be authored from a raw animation file that is modified by adding Slots (70) for receiving WordChips (12). Subsequent users of the Story (62) insert their own WordChips to complete the Story (62). An animation engine (30) then produces an animated presentation (32) based on the completed Story (62).

Description

TITLE: SYSTEM AND METHOD FOR GENERATING INTERACTIVE
ANIMATED INFORMATION AND ADVERTISEMENTS
BACKGROUND OF THE INVENTION
1. Field of the Invention.
The present invention lies in the area of modular creation of digital media.
2. Description of the Related Art.
Digital media is pervasive; anyone who surfs the web, turns on the television, or plays with a multimedia CD-ROM has experienced digital media. In its most effective form, digital media is entertaining, enlightening, and educational. Currently, there exist many different competing standards for creating and storing digital media that fall into one of several major categories:
The first category is the pixel-based image, where each pixel comprises a dot on a computer screen. Each image is stored as hundreds of thousands of such dots, and this type of format, also known as a bitmap or pixmap (for pixel map), is generally used for scanned photographs or digitally generated pictures. Typical examples of this format are JPEG, GIF, and PICT.
A second category of digital media is the vector-based image, where the digital media is stored as a series of lines and curves, also known as splines or Beziers. These lines and curves can form or define regions that may be filled with colors and gradients. A vector-based image is generally better than a pixel-based image for representing a drawing and is thus a popular format for clip art. Vector-based images are also more compact than pixel-based images, as vector-based images are based on mathematical descriptions. Typical examples of the vector-based format are PostScript and Macromedia Flash, while Adobe Illustrator and CorelDRAW are popular vector drawing programs.
A third category of digital media is the digital video format, where multiple pixel-based image frames are put together to represent video. Digital video, which is really a variation on the pixel-based image format, is often used for CD-ROM titles, including games and multimedia. In addition, streaming video formats such as RealVideo, QuickTime and AVI essentially belong to the digital video format category. Furthermore, most computer animations are stored in digital video format.
There are many ways to create digital media, but they generally follow a similar pattern: (a) there is a canvas, source or scene file (a binary image or vector file in which digital artwork is created or imported); (b) this canvas or scene file is modified and then imported for printing or for a Web page. Should animation be involved, a keyframer or timeline would be used to allow for modifying the scene or canvas to account for changes in the image over time. Animation software then interpolates over these changes in the image to produce the final animated result.
Even so, digital media tends to be difficult to create. Authoring tools often use a "bottom up" approach, where the scene file must be created from scratch. Once created, a scene or source file is often difficult to modify and, in addition, takes up large amounts of hard drive space. Media authoring tools are usually complex and require considerable investments of time and money. Moreover, authoring tools typically are disconnected from each other and may not communicate among themselves. Furthermore, clip art, which could save time in the creation process, is usually difficult to customize. Thus, creating digital media is often an expensive and time-consuming task.
SUMMARY OF THE INVENTION
Briefly described, the invention comprises a system and method for creating, storing and retrieving digital media for the purpose of generating animations. The invention comprises a digital media data object system as well as the data objects themselves, called WordChips.
Each WordChip contains fields for basic Data and high-level MetaData, as well as pipes for communicating with other WordChips (Frequency Pipes), user interface information (PMAP), identifying information (Standard Info), object parameters (States and Verbs), and a script (ActionScript) for instructing the WordChip on performing basic technique. The Data, or basic digital media data, can be formed from a variety of sources such as binary multimedia files, HTML/Javascript code, executable code or plug-ins, and plain text files. An editor (ALICE) is provided for putting these elements together.
Once formed, each WordChip is stored in both a public dictionary and a private database. The user can then create Metaphors, which are singular WordChips that are defined or derived from other WordChips. Sentences may be formed from either basic WordChips or Metaphors, and may be used to specify a sequence of images as well as background or other effects. A Story, which is a combination of WordChips and Sentences with background animation elements, may be created and saved for future customization. A story author may use a commercially available animation tool to rapidly create the Story's background animation elements, then use ALICE to specify which types or genres of WordChips fit into the Story. A subsequent user of the Story can then retrieve the Story and fill in particular WordChips to customize the Story for his or her particular use. A Story that has been filled in is then sent to the Media Engine for a final preview and, if satisfactory, production of the final animation in a number of different formats.
BRIEF DESCRIPTION OF THE DRAWINGS
Fig. 1 illustrates an overview of WordChip creation and storage, as well as how the WordChip is used in conjunction with the Media Engine to produce a finished animation product.
Fig. 2 illustrates all the elements of a typical WordChip data structure, including data, metadata and elements used for interacting with other WordChips and with the system.
Fig. 3 illustrates the WordChip creation process.
Fig. 4 illustrates the story editing process.
Fig. 5 illustrates the WordChip in relation to the Metaphor, the Sentence and the Story, all of which build on the basic WordChip.
Fig. 6 illustrates the Metaphor, a WordChip that is defined based on other WordChips.
Fig. 7 illustrates the Sentence, a structure for combining WordChips in a sequence or with background effects.
Fig. 8 illustrates the ALICE, the Animated Language Interactive Commercial Editor.
Fig. 9 illustrates a Web-based wizard for creating a raw animation file.
Fig. 10 is a flow diagram that provides an overview on the story creation and WordChip creation processes.
Fig. 11 illustrates a sample story file with slots for future insertion of WordChips.
Fig. 12 conceptually illustrates a Story with Open, Semi-Open and Closed Slots.
Fig. 13 illustrates the relationship between a raw animation file and an unfilled Story.
DETAILED DESCRIPTION OF THE INVENTION
During the course of this description, like numbers will be used to identify like elements according to different diagrams illustrating the invention.
The present invention 10 proposes to solve the problems of digital media creation by providing the user with a process and system by which complex arbitrary digital images, animations, and Web pages can be described quickly and put together to form complex animations. Referring to Fig. 1, the key part of the invention is the Animation Language ("ANIMAL"), which provides a way to describe animations via digital media data objects known as WordChips 12. WordChips 12 are data objects for naming, sorting, and referencing different types of media elements. The basic WordChip 12 is a digital media data object that contains not only raw digital media content but also additional information for interacting with other WordChips 12 to produce the desired media effects. Referring to Fig. 2, a prototypical WordChip 12 contains (a) Frequency Pipes 34, communication pipes that enable WordChips to communicate information to each other; (b) an ActionScript 36, as a way of encapsulating technique into a WordChip via a script; (c) UI tags 38, generic user interface information for providing information on how to display the WordChip 12; (d) States & Verbs 40, for storing parameters, alternately called "conditions and actions"; (e) MetaData 44, high-level information about a WordChip; (f) Standard Info 46, which constitutes basic information about the name of the WordChip, keywords, author name and contact info, as well as a preview picture of the digital media content; and (g) Data 42, the raw digital media content itself.
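To make this layout concrete, the following is a minimal sketch in Python of the fields enumerated above. The field names follow the patent's vocabulary, but the concrete types and structure are assumptions for illustration, not the actual WordChip format.

```python
from dataclasses import dataclass, field

# Illustrative sketch only: field names mirror the patent's terms (a)-(g),
# but the concrete types are assumptions.
@dataclass
class WordChip:
    standard_info: dict                  # (f) name, keywords, author/contact, preview
    data: list                           # (g) raw digital media content elements
    metadata: dict                       # (e) high-level facts about the Data
    frequency_pipes: list = field(default_factory=list)   # (a) links to other chips
    action_script: str = ""              # (b) script encapsulating technique
    ui_tags: dict = field(default_factory=dict)           # (c) display hints
    states_and_verbs: dict = field(default_factory=dict)  # (d) conditions and actions
```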
Creation of the basic WordChip starts with creation of the raw digital media content. The content may be graphics, multimedia or an animation created from software such as Macromedia Flash, but a WordChip can be used for other types of digital media content such as sounds, music, 3D models, and vectors (e.g., clip art). Digital media could even include HTML/Javascript code 18 that will produce the desired effects, and may also include text 22. Code-based effects or plug-ins 20 can also supply the digital media content, in which case the resulting WordChip would be termed a CodeChip as opposed to a data-only WordChip. These code-based effects may be generated from a compiler or an object development tool such as Microsoft Visual Studio®. The Data 42 found in a WordChip 12 could include any combination of these types and can include multiple data elements of each type as well.
Once the digital media content has been created, it is "minted" or compiled into a WordChip 12 using a Java application named the Animation Language Interactive Commercial Editor (ALICE) 14. The digital media content, or Data 42, is combined with MetaData information, which identifies what kind of data it is as well as key information pertaining to the digital media content itself. For example, if the Data is a bitmap image of a target, the MetaData could contain the location of the center on that bitmap, so that the image of an arrow could properly hit the target. In addition, basic information such as the name, keywords, type and author of the WordChip may be entered. Of these, the keywords are the most important because they provide information to the WordChip system regarding the WordChip's compatibility with other WordChips. Referring to Fig. 2, the resulting WordChip may be previewed and further edited, and is displayed as a standard 35mm photographic slide preview 15. Once minted, WordChips 12 may be stored in a WordChip Dictionary 26 as well as a user's own database 24 for faster retrieval or for proprietary graphics. In either storage area the user is able to browse and select WordChips, all of which have a preview image and a title, by using ALICE 14 or a Web page interface 28.
In addition to minting WordChips 12 from basic digital media, WordChips 12 may be created and defined in terms of other WordChips 12; these defined WordChips are termed Metaphors 56 (Figs. 5 and 6). The user defines each Metaphor 56 in terms of slots, which are essentially parameters that match other WordChips 12. WordChips can then be inserted into the specified slots to form the Metaphor. The user specifies the level of generalization for any given slot. For example, as shown in Fig. 6, in order to create a Metaphor 56 that describes a birthday cake, one could specify either "candles" or "an incendiary object" as a slot. Finally, each Metaphor 56 is itself a singular WordChip 12, and so Metaphors 56 are recursive: one may create Metaphors 56 based on other Metaphors 56.
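Building on the WordChip sketch above, a Metaphor can be pictured as a WordChip carrying named slots; since it subclasses WordChip, Metaphors nest recursively. The keyword-based matching rule is an assumption drawn from the compatibility discussion later in this description.

```python
from dataclasses import dataclass, field

# Hypothetical sketch: a Metaphor is itself a WordChip (so Metaphors can be
# built from other Metaphors), with slots filled by keyword-matching chips.
@dataclass
class Metaphor(WordChip):
    slots: dict = field(default_factory=dict)       # slot name -> required keyword
    fillings: dict = field(default_factory=dict)    # slot name -> inserted WordChip

    def fill(self, slot_name: str, chip: WordChip) -> None:
        required = self.slots[slot_name]
        if required not in chip.standard_info.get("keywords", []):
            raise ValueError(f"slot {slot_name!r} requires a {required!r} WordChip")
        self.fillings[slot_name] = chip

# e.g. the birthday-cake Metaphor of Fig. 6, whose slot may be specified
# narrowly as "candles" or generalized to "incendiary object"
cake = Metaphor(standard_info={"name": "birthday cake", "keywords": ["cake"]},
                data=[], metadata={}, slots={"topper": "incendiary object"})
```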
After singular WordChips 12 have been created, either from basic digital media data or as Metaphors 56, they may be put together to form a combination of WordChips known as a Sentence 58. Typically, the Sentence 58 describes animation, motion, or interactivity of some sort, and the user can specify instructions 60 for the interaction between the different WordChips, which may describe images, effects, or backgrounds. These instructions 60 may include conditional branches, such as if-then constructs or an event loop, and allow for flexibility in the final presentation. For example, an explosion effect might need to wait until a mouse-click or rollover event. Thus, as shown in Fig. 7, a somewhat abstract Sentence 58 of WordChips 12 may be used to produce an animated sequence 61 of images and effects without requiring the user to specify particular images or frames.
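As a sketch of how a Sentence's instructions 60 might gate its WordChips on events, consider the following; the instruction tuple format is invented for illustration and assumes the WordChip sketch above.

```python
from dataclasses import dataclass

# Hypothetical instruction format: ("play", chip_index) emits a chip into the
# timeline; ("wait_for", event) pauses the sequence until that event occurs.
@dataclass
class Sentence:
    chips: list          # WordChips or Metaphors, in order
    instructions: list   # e.g. [("play", 0), ("wait_for", "mouse_click"), ("play", 1)]

    def advance(self, events: set) -> list:
        timeline = []
        for op, arg in self.instructions:
            if op == "wait_for" and arg not in events:
                break                      # e.g. an explosion waits for a rollover
            if op == "play":
                timeline.append(self.chips[arg])
        return timeline
```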
All of the basic data elements such as Sentences 58, Metaphors 56, and basic WordChips 12, however, find their major use in creating Stories 62, which are combinations of singular or multiple WordChips 12 with background scene animations. As the name suggests, a Story 62 itself is a full combination of digital media elements that is used to create and represent the final complete animation. However, the Story is not a single animation but rather a template with animation variables or Slots 70, parameters into which different WordChips 12 may be inserted. Consequently, creating an animated presentation involves a creation phase and a use phase. As shown in Fig. 10, a Story 62 is first created by an author who uses an animation tool and ALICE 14 to transform a basic idea into a Story 62 with Slots 70 capable of receiving WordChips 12. A subsequent end user of the Story 62 then inserts his or her own WordChips before sending the Story 62 to a Media Engine 30 for rendering of a final animated media presentation.
Authoring, or creating, a Story 62 requires several different steps. Referring to Fig. 10, a raw animation file 66 is created or prototyped with the use of either web-based software Wizards 64 (see Fig. 9) or a commercially available animation tool 80, such as the Macromedia Flash Authoring Kit. The author uses either tool to design a basic animated scene file 66 in which certain portions or elements are left blank so that different WordChips 12 may be filled in later. Referring to Fig. 13, these WordChip blanks 68 are marked off from other elements 84 in the animation file by drawing or placing a gray square to act as a placeholder for where a WordChip is supposed to be placed. A square was selected as the shape because one can more easily tell whether a square is undergoing stretching or rotation during animation. In addition, a gray color is less likely to conflict with colors in the raw animation file 66 itself. As an example, if one wanted to author a Story 62 in which an object were bounced off a floor, one would use an off-the-shelf animation software tool to draw a raw animation file 66 in which a gray square bounces off a floor. For simple animations, a Web-based Wizard 64 may be employed to similarly create the raw animation file based on user prompts, as shown in Fig. 9.
Once the raw animation file 66 has been created, the file is read into ALICE 14, where the raw file 66 is effectively turned into a Story 62 with Slots 70 capable of accepting WordChips 12. The story author uses ALICE 14 to first convert the raw file 66 into a Story file 62 in which the gray blanks 68 are converted into Slots 70. As part of this conversion process, the author may decide that only certain types of WordChips may fit into a given Slot 70. For example, the author could prevent sound WordChips 12 from being inserted into a slot for the rectangular object. In addition, the author may deem certain slots Closed Slots 72 by filling these slots with WordChips and locking them against end user editing (see Fig. 12). On the other hand, the author will likely leave certain slots as Open Slots 74, where the author permits a later Story user to fill in Slots 70 with any WordChip 12 that matches the slot type. Finally, as illustrated in Fig. 12, the author may restrict WordChip selection by making a Slot 70 a Semi-Open Slot 76, so that only WordChips 12 from an author-specified list 77 may be inserted in that Slot 70. ALICE 14 has access to the WordChip Dictionary 26 in case the user wishes to confirm that certain WordChips exist. Each Story contains many of these slots, and the author may choose to open, close or semi-open any or all of them to his or her preference. In addition, the author may preview the animation in ALICE, which sends the Story to the Media Engine 30 to render a preview. Once again, gray squares serve as placeholders for the various Slots, and the author may make adjustments before either saving the Story in a database or further customizing the Story 62 prior to final animation.

An end user of a Story 62 subsequently customizes a previously authored Story by using an end user editor 78. The end user editor 78 reads the structure 48 of a Story 62 and, using the Story Reader 50, displays the open slots 70 and semi-open slots 76, which are the parameters into which WordChips may be inserted. The end user may then customize the Story by selecting WordChips 12 for insertion into the Story 62. To reduce the complexity of the editing, compatibility matching is done to limit the selection to only WordChips 12 compatible with the open slots 74 and semi-open slots 76 found in the Story 62. Whether or not a given WordChip 12 is compatible with a given slot depends on the keywords found in that WordChip 12. As noted, a user may be prevented from inserting a sound WordChip into a graphical slot. Moreover, the user can select WordChips 12 from either the public WordChip Dictionary 26 or the user's own private WordChip database 24. The end user editor 78 is usually implemented as a Web-based Java application to ensure that any Internet-capable user can customize the animation without requiring extensive software installation.
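The three slot policies can be summarized in a short sketch; the acceptance logic below is an assumption pieced together from the keyword-compatibility and author-list behavior described above, not the actual slot implementation.

```python
from dataclasses import dataclass, field

# Sketch of Open, Semi-Open, and Closed Slots; the matching rules are assumed.
@dataclass
class Slot:
    kind: str                                          # "open", "semi_open", "closed"
    slot_type: str                                     # e.g. "graphic" (rejects "sound")
    allowed_names: list = field(default_factory=list)  # author's list for semi-open
    chip: object = None                                # WordChip placed by the author

    def accepts(self, chip) -> bool:
        if self.kind == "closed":
            return False                               # locked against end-user editing
        if self.slot_type not in chip.standard_info.get("keywords", []):
            return False                               # keyword compatibility match
        if self.kind == "semi_open":
            return chip.standard_info["name"] in self.allowed_names
        return True                                    # open: any type-compatible chip
```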
After the Story 62 has been customized or filled in by the user, it is sent to a Media Engine 30 for preview and for final production. The Media Engine 30 is a rendering engine that resides on a high-speed dedicated system for optimized rendering performance, and contains software for interpreting the stories and rendering them in a number of user-specified formats. The rendering process takes place as follows: the Story 62 is read, creating a set of frames 1 to n (a user-specified number of frames) for animation. For each frame, the Engine 30 creates a spatial transformation and a color transformation for each Slot 70, and thus each WordChip 12, for that particular story. In the same way, each component (e.g., raw digital media element) of the WordChip 12 undergoes these transformations during the rendering process. In addition, other animated elements 84 of the Story 62 are transformed within each frame to create, along with the WordChips and open slots, a set of frames capable of being assembled together into an animation. The Media Engine 30 may be implemented locally, but users may also choose to use a central media engine remotely located on a network, such as the Internet. In either case, the Engine 30 requires only a few seconds to render most animations (typically commercials) for preview, allowing the user to go back and make changes if necessary. If satisfied with the results, the user then instructs the Media Engine 30 to produce the animation output 32 in any number of formats, depending on whether or not extensive animation is required: GIF, Flash, HTML, or QuickTime. As noted previously, the Media Engine 30 also serves to render animated previews for Stories while they are being authored in ALICE 14 or customized in the end user editor 78.
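A minimal sketch of that per-frame pass follows, assuming the Slot sketch above; the transform functions are placeholders for the spatial and color transformations the Engine computes per Slot per frame, and the story attributes are hypothetical names.

```python
# Placeholder transforms: in the Engine these would compute per-frame
# position/scale/rotation and color mappings for each Slot.
def spatial_transform(element, slot, t):
    return element

def color_transform(element, slot, t):
    return element

def render_story(story, n_frames: int) -> list:
    """Assemble frames 1..n by transforming every filled Slot's media
    components, then the Story's other animated elements (sketch only)."""
    frames = []
    for t in range(n_frames):
        frame = []
        for slot in story.slots:
            if slot.chip is None:              # open slots render as placeholders
                continue
            for component in slot.chip.data:   # each raw media component
                component = spatial_transform(component, slot, t)
                component = color_transform(component, slot, t)
                frame.append(component)
        for element in story.background_elements:   # non-slot animated elements
            frame.append(spatial_transform(element, None, t))
        frames.append(frame)
    return frames
```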
The WordChip System thus provides an easy method for creating and editing animations from any number of different sources of digital media. The user may create WordChips from the bitmap image and sound files traditionally associated with multimedia, but may also use digital media in the form of HTML code or a plug-in. The user may further use these WordChips to rapidly produce an animated presentation by selecting and combining particular WordChips in Sentences and Stories. The user does not need to specify a complete sequence of defined images or frames but need only specify more conceptual aspects of the final animated presentation. The WordChip system and the Media Engine take these conceptual specifications and produce the complete animation, providing the user with a modular way to create new multimedia presentations.
While the invention has been described with reference to the preferred embodiment thereof, it will be appreciated by those of ordinary skill in the art that modifications can be made to the structure and elements of the invention without departing from the spirit and scope of the invention as a whole.

Claims

We claim:
1. A media data assembly system (10) comprising the following: creating means (14) for creating digital media data objects (12), where said media data objects comprise digital media information, metadata information, and data for interacting between said media data objects; storage means (24, 26) for storing said data objects (12); retrieval means (24, 26) for retrieving said data objects (12); display means (28) for displaying said data objects (12) for user preview; selection means (14) for selecting data objects (12) for a given presentation; and, rendering means (30) for rendering a digital media presentation (32) based on said objects, wherein the resulting digital media presentation may be easily assembled and edited.
2. The system of claim 1, wherein said selection means includes querying means for querying said user; and, means for selecting said data objects based on user responses to said queries.
3. The system of claim 2, wherein said display, selection, and querying means are displayed using an HTML web page.
4. The system of claim 1, wherein said storage means (24, 26) include a data dictionary (26) for public storage.
5. The system of claim 1, wherein said storage means (24, 26) include a user database (24) for personal storage.
6. The system of claim 1, wherein said creation means (14) permit the user to create said data object from binary data sources.
7. The system of claim 1, wherein said creation means (14) include means (56) to form said data objects (12) from other already-created data objects (12).
8. The system of claim 1, wherein said rendering means (30) reside on a different computer than said creation (14), storage (24, 26), retrieval (24, 26), display (28), and selection (14) means and communicate via a network with said creation, storage, retrieval, display and selection means.
9. A data object (12) for storing and manipulating binary media data comprising the following: raw digital media data (42), wherein said data corresponds to basic content; high-level information (44) that interacts with similar information found in other similar data objects; basic identification data (46) with said data object (12); communication means (34) for communicating between data objects (12); a field for storing a script (36) of instructions for manipulating the digital media data objects (12); generic user interface information (38); and, a field for storing temporary parameter conditions and actions (40), wherein said data objects (12) may be used to describe and store different types of digital media (16, 18, 20, 22).
10. The data object (12) of claim 9 wherein the raw digital media data comprises a static binary image (16).
11. The data object (12) of claim 9 wherein the raw digital media data comprises a series of binary images in animation (16).
12. The data object of claim 9 wherein the raw digital media data comprises an HTML script (18).
13. The data object of claim 9 wherein the raw digital media data comprises a sound file (16).
14. The data object of claim 9 wherein the raw digital media data comprises a plug-in (20).
15. The data object of claim 9 wherein the raw digital media data comprises a text file (22).
16. The data object of claim 9 wherein the raw digital media data comprises a Flash file (16).
17. The data object of claim 9 wherein the raw digital media data comprises a plurality of multimedia data types.
18. The data object of claim 9 wherein the data object (56) is formed as a combination of a plurality of other similar data objects (12).
19. A data sentence (58) for grouping multiple data objects (12) comprising the following: a first data object (12) as recited in claim 9; and, at least one other data object (12) as recited in claim 9, such that said data objects (12) interact with each other in a user-supplied sequence.
20. The data sentence of claim 19, wherein said data objects (12) are sequentially controlled by instructions (60) provided by a user.
21. The data sentence of claim 19, wherein at least one of said data objects (12) represents a multimedia effect.
22. A digital story (62) comprising the following: a plurality of data object receiving means (70) capable of receiving digital media data objects (12); and, a plurality of additional digital media elements, such that an end user may insert digital media data objects (12) into said data object receiving means to provide specifications for producing a complete animated presentation.
23. The digital story (62) of claim 22 wherein said data object receiving means (70) can accept any digital media data objects (12).
24. The digital story (62) of claim 22 wherein said data object receiving means (70) can accept any digital media data objects (12) that match specified keywords.
25. The digital story (62) of claim 22 wherein said data object receiving means (76) can accept only digital media data objects (12) on an enumerated list (77) of particular data objects (12).
26. The digital story (62) of claim 22 wherein said data object receiving means (72) are closed to end user editing, such that only a creator of the Story may insert or change the data objects found in said data object receiving means.
27. The digital story (62) of claim 22 wherein several of the data objects (12) are grouped into a data sentence (58).
28. A method for producing a rendered animated presentation (32) from a plurality of digital media data objects (12) and a raw animated data file (66) comprising the steps of:
(a) creating a raw animated data file (66) with portions (68) designated as blank,
(b) adding receiving means (70) to said data file (66) for receiving a plurality of digital media data objects so that said receiving means (70) replace said blank portions (68), thereby creating a story file (62),
(c) selecting a plurality of digital media data objects (12) from a repository of said digital media data objects (12), where each said digital media data object (12) comprises basic digital media data (42) as well as
(1) linking means (34) for communicating with other data objects (12),
(2) code (36) that controls the behavior of that data object (12),
(3) identification information (46) for said data object (12), and,
(4) parameter settings (40) for said data object (12),
(d) inserting said selected digital media data objects (12) into said receiving means (70) found within said story file (62), and,
(e) rendering an animated presentation (32) from the story file (62) of selected digital media data objects (12).
29. The method of claim 28, further comprising the steps of:
(f) editing the rendered presentation (32) by allowing for substitution of other digital media data objects for those in the original story (62); and,
(g) presenting a preview 15 of the rendered presentation (32).
30. The method of claim 28 wherein said creation step (a) is accomplished via commercially available animation software (80).
31. The method of claim 28 wherein said creation step (a) is accomplished using a web-based application.
32. The method of claim 28 wherein said adding receiving means (70) includes specifying limits on which digital media data objects (12) may be inserted into said receiving means (70).
33. The method of claim 28 wherein said specifying step comprises limiting digital media data objects (12) based on keyword information found in said data object.
34. The method of claim 28 wherein said specifying step comprises limiting digital media data objects (12) by specifying an enumerated list (77) of particular data objects that can be inserted into said receiving means (70).
35. The method of claim 28 wherein said specifying step comprises inserting particular digital media data objects (12) into particular receiving means (70) and preventing further editing of said receiving means (70).
36. The method of claim 28 wherein said selecting step (c) includes selecting from a public dictionary (26) of previously created data objects (12).
37. The method of claim 28 wherein said selecting step (c) includes selecting from a private database (24) of previously created data objects (12).
38. The method of claim 28 wherein said rendering step (e) is comprised of the following steps: (a) creating a sequence of rendering frames based on said raw animation file (66);
(b) spatially transforming all of the story's digital media data objects (12) for each frame in said sequence; and,
(c) color transforming all of the story's digital media data objects (12) for each frame in said sequence, such that the resulting sequence of frames contains the digital media data object animated within the frame.
39. The method of claim 28 wherein said rendering step (e) is achieved on a remotely located server accessible via a network.
40. The method of claim 28 wherein said steps (c)-(d) are displayed and executed via an Internet web page (64).
41. The method of claim 28 wherein steps (a)-(b) are completed and the story file (62) is stored in a database at a time prior to completing steps (c)-(g).
42. A method for creating and storing a digital media data object (12) that contains basic content data along with identification (46) and metadata (44) information comprising the following steps: a) providing the basic content data (42) for the data object (12); b) compiling said basic content data (42) with high level information (44) for use in communicating and interacting with other data objects; c) previewing said compilation before final editing; and, d) storing said compilation as a data object (12) in a database (24, 26) for further retrieval, in order to allow for use of the data object (12) at a future time.
43. The method of claim 42 wherein basic content data comprises binary multimedia data (16) and providing said basic content data comprises creating the binary multimedia data from binary multimedia software.
44. The method of claim 42 wherein basic content data comprises browser-readable code (18) that produces a desired multimedia effect and said providing step (a) comprises generating said code.
45. The method of claim 42 wherein basic content data comprises executable code (20) for producing a desired multimedia effect and said providing step comprises generating said code using a software development tool.
46. The method of claim 42 wherein basic content data comprises executable code (20) for producing a desired multimedia effect and said providing step comprises generating said code using a compiler.
47. The method of claim 42 wherein said storing step (d) comprises storing said data object in a public dictionary (26) for all users.
48. The method of claim 42 wherein said storing step (d) comprises storing said data object in a user's own personal database (24).
49. The method of claim 42 wherein said compiling step (b) includes adding keyword, name, type and author information for said digital media data object (12).
PCT/US2000/013055 1999-05-14 2000-05-12 System and method for generating interactive animated information and advertisements WO2000070477A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
AU51314/00A AU5131400A (en) 1999-05-14 2000-05-12 System and method for generating interactive animated information and advertisements

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US13423199P 1999-05-14 1999-05-14
US60/134,231 1999-05-14

Publications (1)

Publication Number Publication Date
WO2000070477A1 true WO2000070477A1 (en) 2000-11-23

Family

ID=22462366

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2000/013055 WO2000070477A1 (en) 1999-05-14 2000-05-12 System and method for generating interactive animated information and advertisements

Country Status (2)

Country Link
AU (1) AU5131400A (en)
WO (1) WO2000070477A1 (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080040227A1 (en) * 2000-11-03 2008-02-14 At&T Corp. System and method of marketing using a multi-media communication system
WO2009042330A3 (en) * 2007-09-21 2009-05-14 Microsoft Corp Animating objects using a declarative animation scheme
CN103136353A (en) * 2013-02-28 2013-06-05 武汉刻度科技发展有限公司 Method and system for processing man-machine interaction events of enterprise information management system
US9230561B2 (en) 2000-11-03 2016-01-05 At&T Intellectual Property Ii, L.P. Method for sending multi-media messages with customized audio

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5818512A (en) * 1995-01-26 1998-10-06 Spectravision, Inc. Video distribution system
US5903892A (en) * 1996-05-24 1999-05-11 Magnifi, Inc. Indexing of media content on a network

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5818512A (en) * 1995-01-26 1998-10-06 Spectravision, Inc. Video distribution system
US5903892A (en) * 1996-05-24 1999-05-11 Magnifi, Inc. Indexing of media content on a network

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
ABOWD ET AL.: "Teaching and learning as multimedia authoring: the classroom 2000 project", ACM MULTIMEDIA 96, pages 187 - 198, XP002930455 *
JU ET AL.: "Analysis of gesture and action in technical talks for video indexing", IEEE, 1997, pages 595 - 601, XP002930456 *

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080040227A1 (en) * 2000-11-03 2008-02-14 At&T Corp. System and method of marketing using a multi-media communication system
US9230561B2 (en) 2000-11-03 2016-01-05 At&T Intellectual Property Ii, L.P. Method for sending multi-media messages with customized audio
US9536544B2 (en) 2000-11-03 2017-01-03 At&T Intellectual Property Ii, L.P. Method for sending multi-media messages with customized audio
US10346878B1 (en) 2000-11-03 2019-07-09 At&T Intellectual Property Ii, L.P. System and method of marketing using a multi-media communication system
WO2009042330A3 (en) * 2007-09-21 2009-05-14 Microsoft Corp Animating objects using a declarative animation scheme
CN103136353A (en) * 2013-02-28 2013-06-05 武汉刻度科技发展有限公司 Method and system for processing man-machine interaction events of enterprise information management system
CN103136353B (en) * 2013-02-28 2016-04-13 武汉刻度信息科技股份有限公司 Enterprise information management system man-machine interaction event-handling method and system

Also Published As

Publication number Publication date
AU5131400A (en) 2000-12-05

Similar Documents

Publication Publication Date Title
US10755745B2 (en) Automatic generation of video from structured content
US10600445B2 (en) Methods and apparatus for remote motion graphics authoring
CN101512553B (en) A method and a system for arranging virtual content
CN102752639B (en) Metadata is used to process the method and apparatus of multiple video flowing
JP2007521588A (en) Automatic multimedia object model
WO2000039997A2 (en) Creating and editing digital video movies
JP7177175B2 (en) Creating rich content from text content
WO2000070477A1 (en) System and method for generating interactive animated information and advertisements
Mateevitsi et al. A game-engine based virtual museum authoring and presentation system
Terrazas et al. Java Media APIs: Cross-platform Imaging, Media, and Visualization
KR20000024334A (en) The method of a three dimensional virtual operating simulation
Powers Painting the Web: Catching the User's Eyes-and Keeping Them on Your Site
Perkins Flash Professional CS5 Bible
Knowles The Temporal Image Mosaic and its Artistic Applications in Filmmaking
Kerman Macromedia Flash 8@ work: Projects and Techniques to Get the Job Done (with CD)(SAMS)
Johansson Investigation and Integration of a Scalable Vector Graphics Engine on a Set-Top Box
Wild Presentations with Macromedia Flash MX
Renow-Clarke et al. Learn programming with Flash MX
Beckhaus et al. Methods and Tools for Storytelling in Virtual Environments

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AL AM AT AU AZ BA BB BG BR BY CA CH CN CU CZ DE DK EE ES FI GB GE GH GM HU ID IL IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MD MG MK MN MW MX NO NZ PL PT RO RU SD SE SG SI SK SL TJ TM TR TT UA UG US UZ VN YU ZW

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): GH GM KE LS MW SD SL SZ TZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE BF BJ CF CG CI CM GA GN GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)
WWE Wipo information: entry into national phase

Ref document number: 10009664

Country of ref document: US

REG Reference to national code

Ref country code: DE

Ref legal event code: 8642

122 Ep: pct application non-entry in european phase
NENP Non-entry into the national phase

Ref country code: JP