WO2006050198A2 - Client-server animation software, systems and methods

Client-server animation software, systems and methods

Info

Publication number
WO2006050198A2
Authority
WO
WIPO (PCT)
Prior art keywords
animation
processor
instructions executable
client computer
computer
Prior art date
Application number
PCT/US2005/039140
Other languages
English (en)
Other versions
WO2006050198A3 (fr)
Inventor
Donald Alvarez
Mark Parry
Original Assignee
Accelerated Pictures, Llc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Accelerated Pictures, Llc filed Critical Accelerated Pictures, Llc
Publication of WO2006050198A2 publication Critical patent/WO2006050198A2/fr
Publication of WO2006050198A3 publication Critical patent/WO2006050198A3/fr


Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T13/00 - Animation
    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05B - CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B19/00 - Programme-control systems
    • G05B19/02 - Programme-control systems electric
    • G05B19/18 - Numerical control [NC], i.e. automatically operating machines, in particular machine tools, e.g. in a manufacturing environment, so as to execute positioning, movement or co-ordinated operations by means of programme data in numerical form
    • G05B19/409 - Numerical control [NC], i.e. automatically operating machines, in particular machine tools, e.g. in a manufacturing environment, so as to execute positioning, movement or co-ordinated operations by means of programme data in numerical form, characterised by using manual data input [MDI] or by using control panel, e.g. controlling functions with the panel; characterised by control panel details or by setting parameters
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 - Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04815 - Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 - Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 - Control of cameras or camera modules
    • H04N23/695 - Control of camera direction for changing a field of view, e.g. pan, tilt or based on tracking of objects
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00 - Details of television systems
    • H04N5/222 - Studio circuitry; Studio devices; Studio equipment
    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05B - CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00 - Program-control systems
    • G05B2219/30 - Nc systems
    • G05B2219/35 - Nc in input of data, input till input file format
    • G05B2219/35438 - Joystick
    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05B - CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00 - Program-control systems
    • G05B2219/30 - Nc systems
    • G05B2219/35 - Nc in input of data, input till input file format
    • G05B2219/35448 - Datasuit, arm sleeve, actor, operator wears datasuit and generates motion
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2200/00 - Indexing scheme for image data processing or generation, in general
    • G06T2200/16 - Indexing scheme for image data processing or generation, in general, involving adaptation to the client's capabilities
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2213/00 - Indexing scheme for animation
    • G06T2213/08 - Animation software package

Definitions

  • The present invention relates to the field of animation and filmmaking in general and, in particular, to software, systems, and methods for creating and/or editing animations and/or films, including any type of film-based and/or digital still and/or video image production.
  • Animated films have long been favorites of both children and adults. More recently, advances in computer animation have facilitated the process of making animated films (and of storyboarding and/or adding computer effects to live-action films).
  • Animation software has traditionally been used on a PC with a display, keyboard, mouse, animation software, and rendering software. Each PC is a standalone unit that contains the animation data to be worked on and has the animation software that will provide to the animation data the added contributions and movements imparted by the programmer or artist at the PC.
  • A typical network might comprise a central server system S with version tracking software 100, which stores the animation data files in bulk storage 101.
  • A user accesses the server, checks out the relevant data files, and alters the animation data files with animation software 111 and actuators 115 (here represented as a keyboard and mouse) resident in the PC.
  • The artist replays the altered animation data locally through rendering software 112 resident on the PC, viewing the animation data movements at the PC's local display 114.
  • The user often will check in the altered data files as a new version added to bulk storage 101 and tracked by version tracking software 100.
  • Client-based animation systems make the management of data (including version control, security of intellectual property, etc.) quite cumbersome.
  • Incompatible variations can be introduced. These incompatible variations are a direct result of the local, temporary storage of the modified data.
  • The presence of incompatible variations can present severe complications.
  • Both the animation software and the work product of the animators are subject to a high risk of piracy.
  • By providing the suite of animation software at the local PC 110 and/or allowing a user to obtain all relevant files related to an animation, the producer of a movie exposes these assets to unauthorized copying.
  • Model: A three-dimensional shape, usually described in terms of coordinates and mathematical data, describing the shape of any character or object. Examples of characters include actors, animals, or other beings whose animation can tell or portray the story.
  • The model is typically provided in a neutral pose (known in the art as a "da Vinci pose"), in which the model is shown standing with limbs spread apart and head looking forward. It is understood in the art that, in many situations, the generation of the model can be extraordinarily expensive. In some cases, the model is generated, scanned, or otherwise digitized, with the spatial coordinates of numerous points on its surface recorded. A virtual representation of the model can then be reconstructed from that data.
  • The model may include connectivity data, such that the collection of points defining the model can be treated as the vertices of polygonal approximations of the surface shape of the model.
  • The model may include various mathematical smoothing and/or interpolation algorithms. Such models can include collections of spatial points ranging from hundreds of points to hundreds of thousands or more points.
  • Render: To make a model viewable as an image, such as by applying textures to a model and/or imaging the model using a real or virtual camera, or by photographing a real object.
  • Rig: A deformation engine that specifies how movement of the model should translate into animation of a character based on the model. This is the software and data used to deform or transform the "neutral pose" of the model into a specific "active pose" variation of the model. Taking the example of the human figure, the rig would impart to the model skeletal joint movement, including shoulder, elbow, hand, finger, neck, head, hip, knee, and foot movement and the like. By having animation software manipulate a rig incorporated into a model, animated movement of the model is achieved.
  • Virtual Character: The model as deformed by the rig and presented with its texture in animation.
  • Virtual Set: The vicinity, or fiducial reference point and coordinate system, with respect to which the location of any element may be specified.
  • Prop: An object on the virtual set, usually comprising a model without a rig.
  • Scene: A virtual set, one or more props, and one or more virtual characters.
  • Action: Animation associated with a scene. It should be noted that, upon editing of the final animation story, portions of an action may be distributed without regard to time, for example at the beginning, middle, and end of the animation story.
  • Editing: The process by which portions of actions are assembled to construct a story, narrative, or other product.
  • Actuator: A device, such as a mouse or keyboard on a personal computer, enabling input to the animation software. This term includes our novel adaptation of a "game controller" for imparting animation to characters.
  • A client-server system provides the ability to control various aspects of a live-action and/or an animated scene, including cameras and/or light sources (either real and/or virtual), animated characters, and other objects.
  • This can include, merely by way of example, moving cameras, lights and/or the like, as well as rendering animated objects (e.g., based on movements of the objects themselves and/or based on movements of cameras, lights, etc.).
  • An exemplary system includes an animation client computer, which may comprise a first processor, a display device, at least one input device, and/or animation client software.
  • The system may further include an animation server computer comprising a second processor and animation server software.
  • The animation client software may comprise instructions executable by the first processor to accept a set of input data from the at least one input device.
  • The set of input data may indicate a desired position for an animated object, which might comprise a set of one or more polygons and/or a set of one or more textures to be applied to the set of one or more polygons.
  • The animation client software might further comprise instructions executable by the first processor to transmit the set of input data for reception by the animation server computer.
  • The animation server software can comprise instructions executable by the second processor to receive the set of input data from the animation client computer and/or to process the input data to determine the desired position of the animated object.
  • The animation server software may also comprise additional instructions executable by the second processor to calculate a set of joint rotations defining the desired position of the animated object and/or to transmit the set of joint rotations for reception by the animation client computer.
  • The animation client software may comprise further instructions executable by the first processor to receive the set of joint rotations defining the position of the animated object and/or to calculate (perhaps based on the set of joint rotations) a set of positions for the set of one or more polygons.
  • The rendered animated object then may be displayed by the animation client, and/or the set of joint rotations may be stored at a data store associated with the animation server computer.
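  • Merely by way of illustration, the following Python sketch traces the exchange just described: the client forwards input, the server computes joint rotations and stores them, and the client renders locally. All names (InputSample, JointRotation, AnimationServer, AnimationClient) are hypothetical, and the one-line "inverse kinematics" step is a placeholder, not the disclosed method.

```python
# Hypothetical sketch of the client/server round trip described above.
from dataclasses import dataclass
from typing import Dict, List

@dataclass
class InputSample:
    """Raw input from a device (e.g., joystick axes), plus the target character."""
    character_id: str
    axes: Dict[str, float]          # e.g., {"x": 0.3, "y": -0.1}

@dataclass
class JointRotation:
    joint: str                      # e.g., "left_elbow"
    angle_deg: float

class AnimationServer:
    """Server side: turns input into joint rotations and archives them."""
    def __init__(self) -> None:
        self.history: List[List[JointRotation]] = []   # stored at the server's data store

    def handle_input(self, sample: InputSample) -> List[JointRotation]:
        # Placeholder step: a real server would consult the character's rig
        # and skeletal database here to compute the full set of rotations.
        rotations = [JointRotation("left_elbow", 30.0 * sample.axes.get("x", 0.0))]
        self.history.append(rotations)                 # version-tracked storage
        return rotations

class AnimationClient:
    """Client side: sends input, receives joint rotations, renders locally."""
    def __init__(self, server: AnimationServer) -> None:
        self.server = server

    def frame(self, sample: InputSample) -> None:
        rotations = self.server.handle_input(sample)   # a network round trip in practice
        # Rendering (posing polygons, applying textures) stays on the client.
        for r in rotations:
            print(f"pose {sample.character_id}: {r.joint} -> {r.angle_deg:.1f} deg")

client = AnimationClient(AnimationServer())
client.frame(InputSample("hero", {"x": 0.5}))
```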
  • In some embodiments, the animation client computer may be a plurality of animation client computers, including a first animation client computer and a second animation client computer.
  • The first animation client computer might comprise the input device(s), while the second animation client computer might comprise the display device(s).
  • The animation server computer, then, might receive the set of input data from the first animation client computer and/or transmit the set of joint rotations for reception by the second animation client computer, which might be configured to receive the set of joint rotations, calculate a set of positions for the set of one or more polygons based on the set of joint rotations, apply to the set of one or more polygons at least one of the textures from the set of one or more textures to render the animated object in the desired position, and/or display on the display device the rendered animated object.
  • The animation client software may comprise instructions executable by the first processor to accept a set of input data (which might indicate a desired position for an object) from the at least one input device and/or instructions executable by the first processor to transmit the set of input data for reception by the animation server computer.
  • The animation server software comprises instructions executable by the second processor to receive the set of input data from the animation client computer and/or to transmit for reception by the animation client computer a set of position data, perhaps based on the set of input data received from the animation client computer.
  • The animation client software might further comprise instructions executable by the first processor to receive the set of position data from the animation server computer and/or to place the object in the desired position, based at least in part on the set of position data.
  • The object can be a virtual object (including without limitation a virtual camera, a virtual light source, etc.) and/or a physical object (including without limitation a device, such as a camera, a light source, etc., in communication with the animation client computer, and/or any other appropriate object).
  • The object may be an animated character, which might comprise a set of polygons and at least one texture, such that placing the object in the desired position comprises rendering the animated character in the desired position.
  • The set of position data might comprise data (such as joint rotations, joint angles, etc.) defining a position of the object and/or defining a deformation of a rig describing the object.
  • The set of position data might comprise a position and/or orientation of a real or virtual camera; the position of the object in the scene may be affected by the position and/or orientation of the real or virtual camera, such that the placement of the object depends on the position and/or orientation of the real or virtual camera.
  • The animation server has an associated data store configured to hold a set of one or more object definition files for the animated object, the set of one or more object definition files collectively specifying a set of polygons and textures that define the object (e.g., the object definition files may comprise one or more textures associated with the object).
  • The animation client software may comprise instructions executable by the first processor to download from the animation server computer at least a portion of the set of one or more object definition files necessary to render the object. In some cases, however, the downloaded portion of the set of one or more object definition files may be insufficient to independently recreate the animated object without additional data, which might be resident on the animation server computer.
  • The animation client computer might be unable to upload to the animation server computer any modifications of the at least a portion of the set of one or more object definition files.
  • The animation client software comprises further instructions executable by the first processor to modify the object definition files to produce a set of modified object definition files.
  • The animation server software comprises instructions executable by the second processor to receive the set of modified object definition files and/or to track changes to the set of object definition files.
  • The animation server computer may be configured to identify a user of the animation client computer and/or to determine whether to accept the set of modified object definition files, perhaps based on an identity of the user of the animation client computer.
  • The animation server computer may be configured to distribute the set of modified object definition files to a set of animation client computers comprising at least a second animation client computer.
  • The data store is configured to hold a plurality of sets of one or more object definition files for a plurality of animated objects.
  • The animation server software might comprise further instructions executable by the second processor to determine whether to provide to the animation client computer one or more of the sets of the object definition files, based on, for example, a set of payment or billing information and/or an identity of a user of the animation client computer.
  • The animation server software further comprises instructions executable by the second processor to identify a user of the animation client computer and/or to determine (e.g., based on an identification of the user and/or a set of payment or billing information) whether to allow the animation client computer to interact with the animation server software.
  • The animation server software comprises instructions executable by the second processor to store the set of position data at a data store (which might be associated with the animation server computer).
  • The animation server software comprises instructions to store a plurality of sets of position data (each of which may be, but need not be, based on a separate set of input data) and/or to track a series of changes to a position of the object, based on the plurality of sets of position data.
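  • The following is a minimal sketch of how such server-side storage and change tracking of position data might look; the PositionStore class and its methods are illustrative assumptions, not the disclosed implementation.

```python
# Hypothetical sketch of server-side version tracking for position data.
from typing import Dict, List, Tuple

class PositionStore:
    def __init__(self) -> None:
        # One list of (version, joint -> angle) snapshots per object.
        self._versions: Dict[str, List[Tuple[int, Dict[str, float]]]] = {}

    def commit(self, object_id: str, pose: Dict[str, float]) -> int:
        """Store a new set of position data and return its version number."""
        history = self._versions.setdefault(object_id, [])
        version = len(history) + 1
        history.append((version, dict(pose)))
        return version

    def diff(self, object_id: str, old: int, new: int) -> Dict[str, float]:
        """Per-joint angle change between two stored versions."""
        history = dict(self._versions[object_id])
        before, after = history[old], history[new]
        return {joint: after[joint] - before.get(joint, 0.0) for joint in after}

store = PositionStore()
store.commit("hero", {"elbow": 10.0, "knee": 0.0})
store.commit("hero", {"elbow": 35.0, "knee": 5.0})
print(store.diff("hero", 1, 2))   # {'elbow': 25.0, 'knee': 5.0}
```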
  • In some embodiments, the animation client computer is a first animation client computer, and the system comprises a second animation client computer in communication with the animation server computer.
  • The second animation client computer may comprise a third processor, a second display device, a second input device, and/or second animation client software.
  • The second animation client software may comprise instructions executable by the third processor to accept a second set of input data (which may indicate a desired position for a second object) from the second input device and/or to transmit the second set of input data for reception by the animation server computer.
  • The animation server software may comprise instructions executable by the second processor to receive the second set of input data from the second animation client computer and/or to transmit (e.g., for reception by the second animation client computer) a second set of position data, which may be based on the second set of input data received from the second animation client computer.
  • The second animation client software may further comprise instructions executable by the third processor to receive the second set of position data from the animation server computer and/or to place the second object in the desired position, perhaps based on the second set of position data.
  • The first object and the second object may be the same object.
  • The animation server software might comprise instructions to transmit the second set of position data for reception by the first animation client computer, and the animation client software on the first animation client computer might further comprise instructions to place the object in a position defined by the second set of position data, such that the first display displays the object in a position desired by a user of the second animation client computer.
  • The second set of position data might have no impact on a rendering of the first object on the first client computer, and/or the first set of position data might have no impact on a rendering of the second object on the second client computer.
  • A variety of input devices may be used. Exemplary devices include a joystick, a game controller, a mouse, a keyboard, a steering wheel, an inertial control system, an optical control system, a full or partial body motion capture unit, an optical, mechanical or electromagnetic system configured to capture the position or motion of an actor, puppet or prop, and/or the like.
  • A system for producing animated works comprises a first animation client computer comprising a first processor, a first display device, at least one first input device, and first animation client software.
  • The system further comprises an animation server computer in communication with the animation client computer and comprising a second processor and animation server software.
  • The first animation client software comprises instructions executable by the first processor to accept a first set of input data from the at least one input device; the first set of input data indicates a desired position for a first object.
  • The first animation client software also comprises instructions to transmit the first set of input data for reception by the animation server computer.
  • The animation server software comprises instructions executable by the second processor to receive the first set of input data from the first animation client computer, to calculate a first set of position data (perhaps based on the first set of input data received from the first animation client computer), and to render the first object, based at least in part on the first set of position data.
  • The first animation client software further comprises instructions to display the first object in the desired position.
  • The system may further comprise a second animation client computer comprising a third processor, a second display device, at least one second input device, and second animation client software.
  • The second animation client software can comprise instructions executable by the third processor to accept from the second input device a second set of input data indicating a desired position for a second object and/or to transmit the second set of input data for reception by the animation server computer.
  • The animation server software may further comprise instructions to receive the second set of input data from the second animation client computer and/or instructions to transmit for reception by the second animation client computer a second set of position data, based on the second set of input data received from the second animation client computer.
  • The second animation client software comprises instructions to receive the second set of position data from the animation server computer.
  • The second animation client software may also comprise instructions to place the second object in the desired position, based at least in part on the second set of position data.
  • Another set of embodiments provides animation client computers and/or animation server computers, which may be similar to those described above.
  • a further set of embodiments provides animation software, including software that can be used to operate the systems described above.
  • An exemplary animation software package may be embodied on at least one computer readable medium and may comprise an animation client component and an animation server component.
  • The animation client component might comprise instructions executable by a first computer to accept a set of input data from at least one input device at the first computer and/or to transmit the set of input data for reception by a second computer.
  • The input data may indicate a desired position for an object.
  • The animation server component may comprise instructions executable by the second computer to receive the set of input data from the first computer and/or to transmit for reception by the first computer a set of position data, based on the set of input data received from the first computer.
  • The animation client component may comprise further instructions executable by the first computer to receive the set of position data from the second computer and/or to place the animated object in the desired position, based at least in part on the set of position data.
  • An exemplary method of creating an animated work comprises accepting at an animation client computer a set of input data (which might indicate a desired position for an object) from at least one input device, and/or transmitting the set of input data for reception by an animation server computer.
  • The method further comprises receiving at the animation server computer the set of input data from the animation client computer and/or transmitting for reception by the animation client computer a set of position data, based on the set of input data received from the animation client computer.
  • The set of position data from the animation server computer may be received at the client computer.
  • The method can further include placing the object in the desired position, based at least in part on the set of position data.
  • Fig. 1 is a block diagram of a prior-art animation software design, illustrating artist and/or programmer PCs connected to a server system for checking out animation data files, processing the animation data files, and returning the animation data files to bulk storage of the animation data at the server, the exemplary server here being shown with version tracking software;
  • Fig. 2 is a block diagram of an animation system in accordance with one set of embodiments;
  • Fig. 3 is a block diagram of an animation system in accordance with another set of embodiments;
  • Fig. 4A is a representation of a model that can be animated by various embodiments of the invention;
  • Fig. 4B is a schematic representation of a rig suitable for deforming the model of Fig. 4A, the rig here having manipulation at the neck, shoulders, elbows, hands, hips, knees, and ankles;
  • Fig. 4C is a schematic representation of a texture for placement over the model of Fig. 4A to impart a texture to a portion of the exterior of the model in the form of a man's suit;
  • Fig. 5 is a representation of a scene;
  • Fig. 6 is a generalized schematic drawing illustrating various components of a client/server animation system, in accordance with embodiments of the invention;
  • Fig. 7 is a flow diagram illustrating a method of creating an animated work, in accordance with various embodiments of the invention; and
  • Fig. 8 is a generalized schematic drawing of a computer architecture that can be used in various embodiments of the invention.
  • Various embodiments of the invention provide novel software, systems and methods for animation and/or filmmaking (the term “filmmaking” is used broadly herein to connote creating and/or producing any type of film-based and/or digital still and/or video image production, including without limitation feature-length films, short films, television programs, etc.).
  • A client-server system provides the ability to control various aspects of a live-action and/or an animated scene, including cameras and/or light sources (either real and/or virtual), animated characters, and other objects. This can include, merely by way of example, moving cameras, lights and/or the like, as well as rendering animated objects (e.g., based on movements of the objects themselves and/or based on movements of cameras, lights, etc.).
  • A client animation computer accepts input (e.g., via one or more input devices) and provides that input to an animation server computer.
  • The animation client computer may provide raw input from the input device.
  • The input indicates a desired movement and/or position of an animated character, relative to other objects in a virtual scene.
  • The animation server computer, after receiving the input, calculates a set of data (including, merely by way of example, data describing a deformation of a model, such as joint rotations and/or joint angles) that describes the desired movement and/or position of the character.
  • After calculating the set of joint angles, the animation server computer transmits the set of joint angles to the animation client computer.
  • The animation client computer then renders the animated character in the desired position, based on the set of joint angles, as well as a set of polygons and one or more textures defining the animated character.
  • The term "polygons," as used herein, broadly refers not only to the traditional polygons used to form a model of an object, but also to any other structures that commonly are used to form a model of an object, including, merely by way of example, NURBS surfaces, subdivision surfaces, level sets, volumetric representations, and point sets, among others.
  • The animation client computer can store some of the files necessary to render the character, and can in fact render the character if provided the proper joint angles. This is beneficial, in many situations, because it relieves the animation server of the relatively processor-intensive task of rendering the animation. This arrangement, however, also allows the server to perform the joint calculations, which, while generally not as processor-intensive as the rendering process, often impose relatively high file input/output ("I/O") requirements, due to the extensive size of the databases used to hold data for performing the calculation of joint angles.
  • This exemplary system provides a distribution of work that takes advantage of the strength of the animation client (that is, the ability to provide a plurality of animation client computers for performing the processor-intensive rendering tasks for various animation projects), while also taking advantage of the strength of typical server computers (that is, the ability to accommodate relatively high file I/O requirements).
  • By contrast, a system in which a central server provides rendering services typically requires extremely powerful (and therefore expensive) servers and, in many cases, server farms. Such systems often also feature relatively powerful workstations as animation clients, but the processing power of the workstations is not harnessed for the rendering.
  • This exemplary system provides additional advantages, especially when compared with systems on which the animation (i.e., joint rotation calculation) and rendering processes occur on the animation client.
  • The exemplary system described above facilitates the maintenance of data. For instance, since the joint rotations for a particular animation are calculated at the animation server, they can easily be stored there as well, and a variety of version-tracking and change-management protocols may be employed.
  • Moreover, because joint rotations and/or other position data are calculated at the animation server, as opposed to individual clients, the system can be configured to prevent the animation client from accessing sufficient data to independently perform the animation process, preventing unauthorized copying of animations and thereby providing greater security for that intellectual property.
  • In Fig. 4A, model 10, in the form of a human figure, is disclosed.
  • Model 10 includes face 11, neck 12, and arms 14 with elbow 15 and wrist 16 leading to hand 17.
  • The model further includes hips 18, knees 19, and ankles 20.
  • In a virtual character, the "model" (or "virtual model") is a geometric description of the shape of the character in one specific pose (commonly called the "model pose," "neutral pose," or "reference pose").
  • The neutral pose used in the model is commonly a variation on the so-called "da Vinci pose," in which the model is shown standing with eyes and head looking forward, arms outstretched, and legs straight with feet approximately shoulder width apart.
  • The model can be duplicated in any number of ways.
  • For example, a clay model or human model is scanned or digitized, recording the spatial coordinates of numerous points on the surface of the physical model so that a virtual representation of the model may be reconstructed from the data. It is to be understood that such models can be the product of great effort, taking man-years to construct.
  • The model also includes connectivity data (also called an "edge list"). This data is recorded at the time of scanning or inferred from the locations of the points, so that the collection of points can be treated as the vertices of a polygonal approximation of the surface shape of the original physical model. It is common, but not required, in the prior art for various mathematical smoothing and interpolation algorithms to be performed on the virtual model, so as to provide for a smoother surface representation than is achieved with a pure polygonal representation.
  • Virtual models commonly include collections of spatial coordinates ranging from hundreds of points to hundreds of thousands or more points.
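  • As an illustration of the foregoing, a model can be held as a list of spatial coordinates plus connectivity data; the sketch below (hypothetical names, with a single triangle standing in for many thousands of points) shows the idea.

```python
# Illustrative sketch of a model as spatial points plus connectivity data.
from dataclasses import dataclass, field
from typing import List, Tuple

Vec3 = Tuple[float, float, float]

@dataclass
class Model:
    vertices: List[Vec3] = field(default_factory=list)               # scanned spatial coordinates
    faces: List[Tuple[int, int, int]] = field(default_factory=list)  # connectivity data ("edge list")

    def face_centroid(self, face: Tuple[int, int, int]) -> Vec3:
        """Average of the three vertices of a triangular face."""
        a, b, c = (self.vertices[i] for i in face)
        return tuple((a[k] + b[k] + c[k]) / 3.0 for k in range(3))

# One triangle standing in for a polygonal approximation of a whole surface:
tri = Model(vertices=[(0, 0, 0), (1, 0, 0), (0, 1, 0)], faces=[(0, 1, 2)])
print(tri.face_centroid(tri.faces[0]))   # (0.333..., 0.333..., 0.0)
```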
  • In Fig. 4B, a rig 30 is illustrated which is compatible with model 10 shown in Fig. 4A.
  • Rig 30 includes head 31, neck 32, eyes 33, shoulders 34, elbows 35, and wrists 36. Further, hips 38, knees 39, and ankles 40 are also disclosed.
  • Rig 30 is mathematically disposed on model 10 so that animation can move the rig 30 at neck 32, shoulders 34, elbows 35, and wrists 36. Further, movement of hips 38, knees 39, and ankles 40 can also occur through manipulation of the rig 30.
  • The rig 30 enables the model 10 to move with realistic changes of shape.
  • The rig 30 thus turns the model 10 into a virtual character, commonly required to move and bend, such as at the knees or elbows, in order to convey a virtual performance.
  • The software and data used to deform (or transform) the "neutral pose" model data into a specific "active pose" variation of the model is commonly called a "rig" or "IK rig" (where "IK" is a shortened form of "Inverse Kinematics").
  • Inverse Kinematics (as in "IK Rig") is a body of mathematics that enables the computation of joint angles (or joint rotations) from joint locations and skeletal relationships.
  • Forward Kinematics is the term of art for computing joint locations based on a collection of joint angles and skeletal relationships.
  • A rig is a piece of software which has as its inputs a collection of joint rotations, joint angles, and/or joint locations ("the right elbow is bent 30 degrees" or "the tip of the left index finger is positioned 2 cm above the center of the light switch"), the skeletal relationships between the joints ("the head bone is connected to the neck bone"), and a neutral pose representation of the virtual model, and has as its output a collection of spatial coordinates and connectivity data describing the shape that the virtual actor's body takes when posed as described by the input data.
  • To the artist, a rig is a visual representation of the skeleton of the virtual actor, with graphical or other controls which allow the artist to manipulate the virtual actor.
  • With an "Inverse Kinematics Rig," the artist might place the mouse on the left index finger of the virtual actor and drag the left index finger across the screen so as to cause the virtual actor's arm to extend in a pointing motion.
  • With a "Forward Kinematics Rig," the artist might click on the elbow of the virtual character and bend or straighten the rotation of the elbow joint by dragging the mouse across the screen or by typing a numeric angle on the keyboard.
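  • The following worked example illustrates the forward and inverse kinematics described above for a two-joint planar arm with unit-length bones; it is a simplified sketch, not the rig mathematics of any particular embodiment.

```python
# Forward kinematics: joint angles + skeletal relationships -> joint locations.
# Inverse kinematics: a target hand location -> joint angles (analytic 2-link case).
import math

def forward_kinematics(shoulder_deg: float, elbow_deg: float,
                       upper: float = 1.0, fore: float = 1.0):
    """Return the (elbow, hand) locations of a planar two-link arm."""
    a1 = math.radians(shoulder_deg)
    a2 = a1 + math.radians(elbow_deg)          # elbow angle is relative to the upper arm
    elbow = (upper * math.cos(a1), upper * math.sin(a1))
    hand = (elbow[0] + fore * math.cos(a2), elbow[1] + fore * math.sin(a2))
    return elbow, hand

def inverse_kinematics(x: float, y: float, upper: float = 1.0, fore: float = 1.0):
    """Recover (shoulder, elbow) angles in degrees from a hand location."""
    d2 = x * x + y * y
    cos_elbow = (d2 - upper**2 - fore**2) / (2 * upper * fore)  # law of cosines
    elbow = math.acos(max(-1.0, min(1.0, cos_elbow)))
    shoulder = math.atan2(y, x) - math.atan2(fore * math.sin(elbow),
                                             upper + fore * math.cos(elbow))
    return math.degrees(shoulder), math.degrees(elbow)

# Round trip: bend the elbow 30 degrees, then recover the angles from the hand position.
_, hand = forward_kinematics(10.0, 30.0)
print(inverse_kinematics(*hand))   # approximately (10.0, 30.0)
```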
  • In Fig. 4C, texture 50 is illustrated, here in the form of only a man's suit having a coat 51 and pants 52.
  • Texture 50 would include many other surfaces.
  • A human face, socks, shoes, and hands would all be part of the illustrated texture 50.
  • It is common for a virtual actor to be drawn (or "rendered") to the screen with, for example, blue eyes and a red jacket.
  • The virtual model described earlier contains purely spatial data. Additional data and/or software, commonly called "Textures," "Maps," "Shaders," or "Shading," is employed to control the colors used to render the various parts of the model.
  • A vertex color is commonly an RGB triplet (Red 0-255, Green 0-255, Blue 0-255) assigned to a specific vertex in the virtual model.
  • The model may be colored in such a way as to convey blue eyes and a red dress.
  • A digital image of an eye will be acquired (possibly via a digital camera, or possibly by an artist who paints such a picture using computer software).
  • The digital image of the eye is a "texture map" (or "texture").
  • The vertices in the model that comprise the surface of the eye will be tagged with additional data ("texture coordinates") that is analogous to latitude and longitude coordinates on a globe.
  • The texture is "mapped" onto the surface by use of the texture coordinates.
  • When the virtual character is instructed to look to the left (for example, by rotating the neck controls 32 or eye controls 33 in the rig), the virtual model is deformed in a manner which rotates all of the vertices making up the head 11 to the left.
  • The head texture is then rendered in the desired location on the screen based upon the vertex locations and the texture coordinates of those vertices.
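  • By way of illustration, the sketch below shows texture coordinates selecting a color from a texture image; the nearest-texel lookup and the tiny 2x2 "eye" image are illustrative assumptions only.

```python
# Illustrative sketch: (u, v) texture coordinates on a vertex act like
# latitude/longitude and select a color from a texture image.
from typing import List, Tuple

RGB = Tuple[int, int, int]

def sample_texture(texture: List[List[RGB]], u: float, v: float) -> RGB:
    """Nearest-texel lookup: map u, v in [0, 1] to a pixel of the texture."""
    h, w = len(texture), len(texture[0])
    x = min(int(u * w), w - 1)
    y = min(int(v * h), h - 1)
    return texture[y][x]

eye_texture = [[(255, 255, 255), (0, 0, 255)],     # white / blue (a stand-in "eye" image)
               [(0, 0, 255), (255, 255, 255)]]

vertex_uv = (0.9, 0.1)                             # texture coordinates tagged on one vertex
print(sample_texture(eye_texture, *vertex_uv))     # (0, 0, 255) -> blue
```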
  • As shown in Fig. 5, an animated scene comprises some or all of the following: a virtual set (set 60 being shown), one or more props (floor 69, wall 70, chair 61, and table 62 being shown), and one or more virtual characters (model 10, manipulated by rig 30 and having texture 50, being shown, and designated here only by the numeral 10).
  • Each of the virtual sets, props, and characters has a fiducial reference point with respect to which the location of the element may be specified.
  • The fiducial reference point is shown at 65.
  • The virtual set 60, props (see chair 61 and table 62), and character 10 are assembled together by specifying their spatial locations using a shared coordinate system from fiducial 65.
  • The choice of coordinate system is arbitrary, but a common practice is to locate the virtual set at the origin of the coordinate system.
  • The background (or "virtual set") is essentially a virtual character with either no rig (in the case of a purely static virtual set) or what is commonly a very simple rig (where the joint angles might control the opening angles of a door 67 or the joint locations might control the opening height of a window 68). It is common to embellish the scene used in an action with a variety of props. As with the background, props are again essentially virtual characters which are used to represent inanimate objects.
  • Animation data is associated with the elements of a scene to create an action.
  • A sequence of images may be constructed by providing the rig of character 10 with a sequence of input data ("animation data"), such as 24 sets of joint angles per second, so as to produce a 24-frame-per-second movie.
  • The animation data provided to the rigs is commonly compressed through the use of various interpolation techniques.
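  • For example, a handful of keyframes can stand in for 24 joint-angle samples per second, with intermediate values reconstructed on playback. The sketch below uses linear interpolation as an illustrative choice, not as the disclosed technique.

```python
# Sketch of interpolation-based compression of animation data: sparse
# keyframes are expanded into 24 joint-angle samples per second.
from typing import List, Tuple

Keyframe = Tuple[float, float]   # (time in seconds, joint angle in degrees)

def interpolate(keys: List[Keyframe], t: float) -> float:
    """Linearly interpolate a joint angle at time t from sparse keyframes."""
    for (t0, a0), (t1, a1) in zip(keys, keys[1:]):
        if t0 <= t <= t1:
            return a0 + (a1 - a0) * (t - t0) / (t1 - t0)
    return keys[-1][1]           # hold the last pose past the final keyframe

elbow_keys = [(0.0, 0.0), (1.0, 90.0)]             # two keyframes replace 25 samples
frames = [interpolate(elbow_keys, f / 24.0) for f in range(25)]
print(frames[0], frames[12], frames[24])           # 0.0 45.0 90.0
```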
  • The artist can move an indicator on the timeline or press a "play" button to watch the animation that she has created.
  • The artist can create the desired performance.
  • In the motion capture method, the artist performs the motion in some manner while the computer records the motion.
  • Common input devices for use with motion capture include a full-body suit equipped with sensing devices to record the physical joint angles of a human actor, and so-called "Waldo" devices, which allow a skilled puppeteer to control a large number of switches and knobs with their hands (Waldo devices are most commonly used for recording facial animations). It is common to perform multiple captures of the same motion, during which sequence of captures the actor repeatedly reenacts the same motions until data is collected which is satisfactory both artistically and technically.
  • In the hybrid method, the motion capture and/or procedural method is used to specify the initial data.
  • For example, the procedural method might be used to specify the initial movement of bird 63.
  • The data obtained via the motion capture or procedural method is then compressed in a manner that makes it technically similar to (or compatible with) data obtained via the animation method. For example, presuming that bird 63 were going to interact with character 10 in scene 60, modification of the procedural image of bird 63 would occur.
  • In the hybrid method, the initial data is then manipulated, re-timed, and/or extended through the use of the animation method.
  • The animation software often plays back previously specified animations by interpolating animation data at a specific point in time, providing the interpolated animation data to the rigs, making use of the rigs to deform the models, applying textures to the models, and presenting a rendered image on the display.
  • The animation software then advances to a different point in the timeline and repeats the process.
  • Fig. 2 illustrates a client/server animation system in accordance with a set of embodiments.
  • The system comprises a plurality of animation client computers 200 in communication (e.g., via a network) with a server computer 210, as shown in Fig. 2.
  • The network can be any suitable network, including without limitation a local area network, wide area network, wired and/or wireless network, the Internet, an intranet or extranet, etc.
  • Models 10, rigs 30, and/or textures 50 may be stored at the client 200, e.g., in model, rig, and texture storage 201. In this particular embodiment, such storage has advantages.
  • The animation client computer may, in some embodiments, include rendering software 203 operatively connected to model, rig, and texture storage 201.
  • The rendering software may be part of an animation client application.
  • A controller 202 (here shown as a keyboard and mouse) operates through network connection 205. It should be noted that any suitable controller may be used, including those described in U.S. Patent Application No. —.
  • The animation server computer 210 includes animation data storage 211, animation software 212, and/or version tracking software 214. Presuming that the artist or programmer has created the models 10, rigs 30, and textures 50, manipulation of an Action on a scene 60 can either occur from the beginning (de novo) or, alternately, the artist and/or programmer may check out a previous version of the Action through the network connection 205 by accessing animation server 210 and retrieving from animation data storage 211 the desired data to animation software 212.
  • Input data (e.g., from the controller 202 and/or an actuator thereof) is received by the client 200.
  • The input data may be described by an auxiliary coordinate system, in which case the input data may be processed as described in U.S. Patent Application No. — / (attorney docket number 020071-000210), already incorporated by reference. Other processing may be provided as well, as necessary to format the raw input data received from the controller 202.
  • The animation client 200 in turn transmits the input (either as raw input data and/or after processing by the animation client computer 200), e.g., via network connection 205, to the animation server computer 210 and, more particularly, to animation software 212 (which might be incorporated in animation server software). Processing of the selected Action will occur at animation software 212 within animation server 210. Such processing will utilize the techniques described above. In particular, a set of joint rotations may be calculated, based on the input data. The joint rotations will describe the position and/or motion desired in the Action.
  • Playback will occur by having animation software 212 emit return animation information through network connection 205 and then to rendering software 203.
  • Rendering software 203 will access model, rig, and texture storage 201 to display at display 204 the end result of modifications introduced by the artist and/or programmer at the client 200.
  • Animation server 210 (and/or another server in communication therewith) can provide a number of services.
  • The server 210 can provide access control; for instance, client 200 is required to log in to server 210.
  • A subscription may be required as a prerequisite for access to server 210.
  • Server 210 can deliver different capabilities for different users.
  • For example, PC 200A can be restricted to modification of character motion while PC 200 modifies animation of bird 63.
  • The client 200 controls the time when playback starts and stops for any individual Action. Moreover, the client 200 may arbitrarily change the portion of the Action being worked on by simply referring to that Action at a specified time period.
  • Because the rendering software 203 and the model, rig, and/or texture data in storage 201 reside at the client, the data transmitted over the network through network connection 205 is maintained at a minimum. Specifically, just a small section of data need be transmitted. This data will include that which is needed to play the animation (e.g., a set of joint rotations, etc.). As the rendering software 203 and some or all of the model, rig, and/or texture storage 201 may be resident at PC 200, only small batches of data need be transmitted over the Internet.
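  • A back-of-the-envelope comparison illustrates the point; the figures below (rig size, frame resolution) are assumptions for illustration, not values from this disclosure.

```python
# Assumed per-frame payloads: joint rotations versus a rendered image.
JOINTS = 60                      # assumed rig size
BYTES_PER_ROTATION = 3 * 4       # three 32-bit floats per joint

rotation_payload = JOINTS * BYTES_PER_ROTATION   # 720 bytes per frame
bitmap_payload = 640 * 480 * 3                   # ~0.9 MB per frame of 24-bit RGB

print(f"joint rotations: {rotation_payload} B/frame")
print(f"rendered bitmap: {bitmap_payload / 1e6:.2f} MB/frame "
      f"({bitmap_payload // rotation_payload}x larger)")
```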
  • The server 210 is useful in serving multiple clients. Further, the server 210 can act as a studio, providing the artist and/or programmer at client 200 with a full range of services, including storage and delivery of updated model, texture, and rig data to client 200 and client 200A. In a set of embodiments, the server 210 will store all animation data. Furthermore, through version tracking software 214, animation data storage 211 will provide animation data (such as joint rotations, etc.) to the respective client 200A on an as-needed basis.
  • Turning to Fig. 3, client 300 includes a network connection 305, a controller 302, and a display 304.
  • Server 310 includes animation data storage 211, version tracking software 214, and animation software 212. Additionally, server 310 includes rendering software 303.
  • The manipulation of the animation software from controller 302 through network connection 305 of the client 300 is identical to that shown in Fig. 2.
  • The animation software for calculating joint rotations, etc., is resident on the server 310.
  • In this embodiment, however, the rendering component also resides on the server 310. Specifically, rendering software 303 will generate actual images (e.g., bitmaps, etc.), which images will be sent through the network to network connection 305 and may be displayed thereafter at display 304.
  • Fig. 6 provides a generalized schematic diagram of a client/server system in accordance with some embodiments of the invention.
  • The system 600 includes an animation server computer 605, which may be a PC server, minicomputer, mainframe, etc., running any of a variety of available operating systems, including UNIX™ (and/or any of its derivatives, such as Linux, BSD, etc.), various varieties of Microsoft Windows™ (e.g., NT™, XP™, 2003, Vista™, Mobile™, CE™, etc.), Apple's Macintosh OS™, and/or any other appropriate server operating system.
  • The animation server computer also includes animation server software 610, which provides animation services in accordance with embodiments of the invention.
  • The animation server computer 605 may also comprise (and/or have associated therewith) one or more storage media 615, which can include storage for the animation server software 610, as well as a variety of associated databases (such as a database of animation data 615a, a data store 615b for model data, such as the polygons and textures that describe an animated character, a data store 615c for scene data, and any other appropriate data stores).
  • The system 600 further comprises one or more animation client computers 620, one or more of which may include local storage (not shown), as well as animation client software 625.
  • In some embodiments, the rendering subsystem may reside on the animation server 605, as described with respect to Fig. 3, for example. In this way, thin clients, such as wireless phones, PDAs, etc., may be used to provide input even if they have insufficient processing power to render the objects.
  • The animation client computer 620 thus may be, inter alia, a PC, workstation, laptop, tablet computer, PDA, wireless phone, etc., running any appropriate operating system (such as Apple's Macintosh OS™, UNIX and/or its derivatives, Microsoft Windows™, etc.).
  • Each animation client 620 may also include one or more display devices 630 (such as monitors, LCD panels, projectors, etc.) and/or one or more input devices 635 (such as the controllers described above and in U.S. Patent Application No. --/ (attorney docket number 020071-000210), already incorporated by reference, as well as, to name but a few examples, a telephone keypad, a stylus, etc.).
  • The system 600 may operate in the following exemplary manner, which is described by additional reference to Fig. 7, which illustrates a method 700 of creating an animated work in accordance with some embodiments of the invention.
  • While the method 700 of Fig. 7 is described in conjunction with the system 600 of Fig. 6, that description is provided for exemplary purposes only, and the methods of the invention are not limited to any particular hardware or software implementation.
  • Likewise, the operation of the system 600 of Fig. 6 is not limited to the described methods.
  • The animation client software 625 comprises instructions executable by the animation client computer 620 to accept a set of input data from one or more input devices (block 705).
  • The input data may, for example, indicate a desired position of an object in a scene (which may be a virtual scene, a physical set, etc.). In particular embodiments, the object may be an animated object, which may comprise a plurality of polygons and/or textures, as described above.
  • The animation client software optionally may process the input data, for example as described above.
  • The animation client software then transmits the set of input data for reception by the animation server computer (block 710).
  • The animation server computer 605 receives the input data (block 715).
  • The animation server software 610 calculates a set of position data (block 720), based on the received input data. In some cases, calculating the set of position data can include processing the input data to determine a desired position of an animated object and/or calculating a set of joint rotations defining that desired position (and/or defining the deformation of a rig defining the character, in order to place the character in the desired position).
  • The position can be determined based solely on the input data, perhaps in conjunction with a current position of the object.
  • The object may be an animated character (or other object in a virtual scene), and the position of the object in the scene may be affected by the position of a virtual camera and/or light source.
  • The position data might comprise data about the position and/or orientation of the virtual camera/light.
  • The animation server computer 605 (perhaps based on instructions from the server software 610) then transmits the set of position data (e.g., joint rotations, etc.) for reception by the animation client 620 (block 725).
  • The animation client computer receives the set of position data (block 730).
  • The animation client software 625 is responsible for placing the object in the desired position (block 735).
  • This procedure necessarily will vary according to the nature of the object.
  • Where the object is an animated character, placing the object in the desired position generally will comprise rendering the animated character in the desired position, for example by calculating a set of positions for the polygons that describe the character and/or by applying any necessary textures to the model.
  • In other cases, placing the object in the desired position may require interfacing with a movement system, which is not illustrated in Fig. 6 but examples of which are described in detail in U.S. Patent Application No. — / (attorney docket number 020071-000210), already incorporated by reference.
  • The object (for instance, if the object is a virtual object) may be displayed on a display device 630 (block 740).
  • The object may be displayed in the desired position.
  • The client 620 may be configured to upload the rendered object to the animation server 605 for storage and/or distribution to other computers (which might be, inter alia, other animation servers and/or clients).
  • The system 600 may provide a number of other features, some of which are described above.
  • The animation server 605 can provide animation services to a plurality of animation client computers (e.g., 620a, 620b).
  • Input may be received at a first client 620a, and the position data may be transmitted to a second client 620b for rendering and/or display.
  • The plurality of client computers 620 may perform rendering tasks in parallel for a given scene.
  • Each client 620a, 620b accepts input and receives position data, such that two artists may collaborate on a given character and/or scene, each being able to view changes made by the other.
  • Alternatively, each client 620a, 620b may interact individually with the server 605, with each client 620 providing its own input and receiving position data based on that input. (That is, the position data received by one client has no impact on the rendering of an object on another client.)
  • The animation server software 610 may be configured not only to calculate the position data, but also to render the object (which can include, merely by way of example, not only applying one or more textures to a model of the object, but also calculating the positions of the polygons that make up the model, based on the position data).
  • The rendered object may be provided to an animation client computer 620 (which may or may not be the same client computer that provided the input on which the position data is based), which then can display the object in the desired position.
  • The animation server 605 might render a first object for a first client 620 and might merely provide to a second client a set of position data describing a desired position of a second object.
  • One or more of the data stores may be used to store object definition files, which can include some or all of the information necessary for rendering a given object, such as the model, rig, polygons, textures, etc. describing that object.
  • An animation client 620 then can be configured to download from the server 605 the object definition files (and/or a subset thereof) to perform the rendering of the object in accordance with embodiments of the invention. It should be noted, however, that for security, the downloaded object definition files (and/or portions thereof) may be insufficient to allow a user of the client 620 to independently recreate the object without additional data resident on the server.
  • the system 600 may be configured such that a user of the client 620 is not allowed to modify these object definition files locally at the client 620 and/or, if local modification is allowed, the client 620 may not be allowed to upload modified object definition files. In this way, the system 600 can prevent the unauthorized modification of a "master copy" of the object definition files.
  • the server software 610 may be configured to allow modified object definition files to be uploaded (and thus to receive such files), perhaps based on an identification of the user of the animation client computer; that is, the server 605 may be configured to identify the user and determine whether the user has sufficient privileges to upload modified files. (It should be noted that the identification, authentication and/or authorization of users may be performed either by the animation server 605 and/or by another server, which might communicate such identification, authorization and/or authentication data to the animation server 605.)
  • the animation server software 610 may be configured to determine whether to allow an animation client 620 to interact with the server software 610.
  • the animation server software 610 may control access to rendered objects, object definition files, the position data, and/or the software components used to create any of these, based on any number of factors.
  • the server software 610 (and/or another component) may be configured to identify, authenticate and/or authorize a user of the animation client 620.
  • the animation server software 610 may determine whether it will receive input from the client computer 620, whether it will provide position data to the animation client computer 620 and/or whether it will allow the animation client computer 620 to access files and/or animation services on the animation server 605.
  • the animation server 605 may be configured to provide for-fee services.
  • the animation server software (and/or another component) may be configured to evaluate a set of payment and/or billing information (which may be, but is not necessarily, associated with an identity of a user of the animation client computer 620), and based on the set of payment and/or billing information, determine whether to allow the client 620 to interact with the server software 610 (including, as mentioned above, whether it will accept/provide data and/or allow access to files and/or services); a sketch of such an entitlement check appears after this list.
  • the set of billing and/or payment data can include, without limitation, information about whether a user has a subscription for animation services and/or files, whether the user has paid a per-use fee, whether the user's account is current, and/or any other relevant information.
  • various levels of interaction with the server software 610 may be allowed.
  • in embodiments where the animation server computer 605 stores a plurality of sets of rendered objects and/or object definition files (wherein, for example, each set of files comprises information describing a different animated character), the animation server 605 may allow an unregistered user to download files for a few "free" characters, while paid subscribers have access to files for an entire library of characters (there may be various levels of subscription, each providing access to files for a corresponding number of characters).
  • a user may be allowed to pay a per-character fee for a particular character, upon which the user is allowed to download the set of files for that character. (Such commerce functionality may be provided by a separate server, third-party service, etc.)
  • access thus may be conditioned on whether a user has a subscription (and/or pays a per-use fee).
  • the animation server software 610 may also be configured to perform change tracking and/or version management of object definition files (and/or rendered objects, position data, etc.).
  • any of several known methods of change tracking and/or version management may be used for this purpose.
  • the change tracking/version management functions may be configured to allow various levels of access to files based on an identity of a user and/or a project that the identified user is working on.
  • an artist in a group working on a particular character, scene, film, etc. may be authorized to access (as well, perhaps, as download) files related to that character, scene or film, while a manager or senior artist might be authorized to modify such files.
  • An artist working on another project might not have access to any such files.
  • the animation server software 610 may also be configured to distribute (e.g., to other clients and/or servers) a set of modified object definition files, such that each user has access to the most recent version of these files; a sketch of such version tracking appears after this list. As described above, access to a distribution of these modified files may be controlled based on an identity of the user, various payment or billing information, etc.
  • Embodiments of the invention can be configured to protect stored and/or transmitted data, including without limitation object definition files, rendered objects, input data, position data, and the like.
  • data can be protected in a variety of ways.
  • data may be protected with access control mechanisms, such as those described above.
  • other protection measures may be implemented as well.
  • data may be encrypted prior to being stored at an animation server and/or prior to being transmitted between an animation server and an animation client, to prevent unauthorized access to such data.
  • data may be digitally signed and/or certified before storage and/or before transmission between computers. Such signatures and/or certifications can be used, inter alia, to verify the identity of the entity that created and/or modified such data, which can also facilitate change tracking and/or version management of various data used by embodiments of the invention. (A sketch combining encryption and signing appears after this list.)
  • Fig. 8 provides a generalized schematic illustration of one embodiment of a computer system 800 that can perform the methods of the invention and/or the functions of computer, such as the animation server and client computers described above.
  • Fig. 8 is meant only to provide a generalized illustration of various components, any of which may be utilized as appropriate.
  • the computer system 800 can include hardware components that can be coupled electrically via a bus 805, including one or more processors 810, and one or more storage devices 815, which can include without limitation a disk drive, an optical storage device, and/or a solid-state storage device such as a random access memory ("RAM") and/or a read-only memory ("ROM"), which can be programmable, flash-updateable and/or the like (and which can function as a data store, as described above).
  • Also in communication with the bus 805 can be one or more input devices 820, which can include without limitation a mouse, a keyboard and/or the like; one or more output devices 825, which can include without limitation a display device, a printer and/or the like; and a communications subsystem 830, which can include without limitation a modem, a network card (wireless or wired), an infrared communication device, and/or the like.
  • the computer system 800 also can comprise software elements, shown as being currently located within a working memory 835, including an operating system 840 and/or other code 845, such as the application programs (including without limitation the animation server and client software) described above and/or designed to implement methods of the invention.
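
The client-side flow of blocks 735-740 can be illustrated with a short sketch. The Python below is purely illustrative: the patent does not define an API, and every name in it (PositionData, render_object, drive_movement_system, and so on) is invented for this example.

```python
from dataclasses import dataclass, field

@dataclass
class PositionData:
    object_id: str
    is_virtual: bool
    polygon_positions: list = field(default_factory=list)  # vertex positions derived from the input

def render_object(model: dict, pos: PositionData) -> dict:
    """Block 735 for a virtual object: place the polygons and apply textures."""
    return {
        "object_id": pos.object_id,
        "polygons": pos.polygon_positions,       # positions calculated from the position data
        "textures": model.get("textures", []),   # textures applied to the model
    }

def drive_movement_system(pos: PositionData) -> None:
    """Block 735 for a physical object: interface with a movement system."""
    print(f"moving physical object {pos.object_id} into position")

def place_object(model: dict, pos: PositionData, display, server=None) -> None:
    if pos.is_virtual:
        rendered = render_object(model, pos)
        display(rendered)                        # block 740: show the object in the desired position
        if server is not None:
            server.upload(rendered)              # optionally share the rendered object via the server
    else:
        drive_movement_system(pos)

# Example usage with a trivial "display":
place_object({"textures": ["skin"]},
             PositionData("char1", True, [(0.0, 0.0, 0.0)]),
             display=print)
```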
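The collaboration arrangement, in which input received from one client yields position data that every participating client renders, can be sketched as follows. This is a minimal, hypothetical illustration; the class and method names are invented, and the position calculation is reduced to a stub.

```python
class AnimationServer:
    """Hypothetical server that turns client input into position data and
    broadcasts it to every client sharing the scene."""

    def __init__(self):
        self.clients = []

    def register(self, client):
        self.clients.append(client)

    def on_input(self, user_input):
        position_data = self.calculate_position(user_input)
        # Broadcast to all clients (including the originator), so each
        # artist sees changes made by the others.
        for client in self.clients:
            client.receive_position_data(position_data)

    def calculate_position(self, user_input):
        # Stub: a real server would evaluate rigs, constraints, etc.
        return {"object": user_input["object"], "position": user_input["delta"]}

class AnimationClient:
    def __init__(self, name):
        self.name = name

    def receive_position_data(self, data):
        print(f"client {self.name} renders {data}")

server = AnimationServer()
server.register(AnimationClient("620a"))
server.register(AnimationClient("620b"))
server.on_input({"object": "char1", "delta": (1, 0, 0)})
```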
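One way to realize the partial-download security property described above, where downloaded object definition files are insufficient to recreate the object, is to withhold a server-only portion of each definition. The sketch below assumes a simple dict-based store; field names such as server_only_params are invented for illustration.

```python
# Hypothetical object store: part of each definition never leaves the server.
OBJECT_STORE = {
    "char1": {
        "model": "mesh-data",
        "rig": "rig-data",
        "textures": "texture-data",
        "server_only_params": "binding-data",  # withheld from every download
    }
}

CLIENT_SAFE_FIELDS = {"model", "rig", "textures"}

def download_object_definition(object_id: str) -> dict:
    """Return only the client-safe subset of an object definition, so the
    download alone cannot independently recreate the object."""
    full = OBJECT_STORE[object_id]
    return {k: v for k, v in full.items() if k in CLIENT_SAFE_FIELDS}

print(download_object_definition("char1"))
```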
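Identity- and payment-based gating of files and services, as described above, might look like the following entitlement check. The tier names, the free-character set, and the rules themselves are assumptions made for this example; the patent leaves these policies open.

```python
# Hypothetical user records and character sets.
USERS = {
    "alice": {"tier": "subscriber"},   # paid subscription: full library
    "guest": {"tier": "free"},         # unregistered: a few free characters
}

FREE_CHARACTERS = {"demo_bot"}
LIBRARY = {"demo_bot", "hero", "villain"}

def authorize_download(user_id: str, character: str) -> bool:
    """Decide whether a user may download a character's files, based on
    identity and billing/subscription status."""
    user = USERS.get(user_id)
    if user is None:
        return False                       # unidentified users get nothing
    if user["tier"] == "subscriber":
        return character in LIBRARY        # full library for paid accounts
    return character in FREE_CHARACTERS    # otherwise only the free set

assert authorize_download("alice", "hero")
assert not authorize_download("guest", "hero")
```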
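Change tracking and role-based modification rights, as described above, could be combined in a small versioned store like the one below. The roles, method names, and tuple layout are invented for illustration; any of several known version-management schemes would serve equally well.

```python
from datetime import datetime, timezone

class VersionedStore:
    """Hypothetical change-tracking store: project members may read, but
    only certain roles may check in modified object definition files."""

    def __init__(self):
        self.versions = {}   # filename -> list of (version, author, timestamp, data)

    def check_in(self, user, role, filename, data):
        if role not in ("senior_artist", "manager"):
            raise PermissionError(f"{user} may not modify {filename}")
        history = self.versions.setdefault(filename, [])
        history.append((len(history) + 1, user, datetime.now(timezone.utc), data))

    def latest(self, filename):
        return self.versions[filename][-1]

store = VersionedStore()
store.check_in("sam", "senior_artist", "char1.def", b"v1 rig")
store.check_in("sam", "senior_artist", "char1.def", b"v2 rig")
version, author, stamp, data = store.latest("char1.def")
print(version, author, data)   # 2 sam b'v2 rig'
```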
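Encryption plus signing of stored or transmitted data, as described above, might be combined as in the sketch below. It assumes the third-party `cryptography` package for symmetric (Fernet) encryption; the HMAC "signature" uses only the standard library. A real deployment would use per-entity asymmetric keys and proper key management, both omitted here.

```python
import hmac, hashlib, json
from cryptography.fernet import Fernet

ENC_KEY = Fernet.generate_key()        # shared between server and client
SIGN_KEY = b"per-entity signing key"   # identifies the creator/modifier

def protect(payload: dict) -> dict:
    """Encrypt the payload, then attach an HMAC tag over the ciphertext."""
    ciphertext = Fernet(ENC_KEY).encrypt(json.dumps(payload).encode())
    tag = hmac.new(SIGN_KEY, ciphertext, hashlib.sha256).hexdigest()
    return {"data": ciphertext, "sig": tag}

def unprotect(message: dict) -> dict:
    """Verify the tag (constant-time compare), then decrypt the payload."""
    expected = hmac.new(SIGN_KEY, message["data"], hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, message["sig"]):
        raise ValueError("signature check failed")
    return json.loads(Fernet(ENC_KEY).decrypt(message["data"]))

msg = protect({"object": "char1", "pos": [0, 0, 0]})
print(unprotect(msg))
```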

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Human Computer Interaction (AREA)
  • General Engineering & Computer Science (AREA)
  • Manufacturing & Machinery (AREA)
  • Automation & Control Theory (AREA)
  • Processing Or Creating Images (AREA)
  • Position Input By Displaying (AREA)

Abstract

In various embodiments, the invention provides novel software, systems and methods for animation and/or filmmaking. In one set of embodiments, for example, a client-server system provides the ability to control various aspects of a live-action and/or animated scene, including cameras and/or light sources (real and/or virtual), animated characters and other objects. Such control can be based, inter alia, on the movement of cameras, lights and/or the like, as well as on the rendering of animated objects (for example, based on the movements of the objects themselves and/or on the movements of the cameras, lights, etc.).
PCT/US2005/039140 2004-10-28 2005-10-28 Logiciels, systemes et procedes client-serveur d'animation WO2006050198A2 (fr)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US62341504P 2004-10-28 2004-10-28
US62341404P 2004-10-28 2004-10-28
US60/623,415 2004-10-28
US60/623,414 2004-10-28

Publications (2)

Publication Number Publication Date
WO2006050198A2 true WO2006050198A2 (fr) 2006-05-11
WO2006050198A3 WO2006050198A3 (fr) 2009-04-16

Family

ID=36319712

Family Applications (2)

Application Number Title Priority Date Filing Date
PCT/US2005/039139 WO2006050197A2 (fr) 2004-10-28 2005-10-28 Dispositif de commande de camera et d'animation, systemes et procedes associes
PCT/US2005/039140 WO2006050198A2 (fr) 2004-10-28 2005-10-28 Logiciels, systemes et procedes client-serveur d'animation

Family Applications Before (1)

Application Number Title Priority Date Filing Date
PCT/US2005/039139 WO2006050197A2 (fr) 2004-10-28 2005-10-28 Dispositif de commande de camera et d'animation, systemes et procedes associes

Country Status (2)

Country Link
US (3) US20060109274A1 (fr)
WO (2) WO2006050197A2 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI448999B (zh) * 2010-10-13 2014-08-11 Univ Nat Cheng Kung 塗鴉生命化方法及使用該方法之系統

Families Citing this family (69)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7266425B2 (en) * 2004-09-30 2007-09-04 Rockwell Automation Technologies, Inc. Systems and methods that facilitate motion control through coordinate system transformations
WO2006050197A2 (fr) * 2004-10-28 2006-05-11 Accelerated Pictures, Llc Dispositif de commande de camera et d'animation, systemes et procedes associes
US20070109304A1 (en) * 2005-11-17 2007-05-17 Royi Akavia System and method for producing animations based on drawings
DE102005058867B4 (de) * 2005-12-09 2018-09-27 Cine-Tv Broadcast Systems Gmbh Verfahren und Vorrichtung zum Bewegen einer auf einem Schwenk- und Neigekopf angeordneten Kamera entlang einer vorgegebenen Bewegungsbahn
JP5204381B2 (ja) * 2006-05-01 2013-06-05 任天堂株式会社 ゲームプログラム、ゲーム装置、ゲームシステム及びゲーム処理方法
US8281281B1 (en) * 2006-06-07 2012-10-02 Pixar Setting level of detail transition points
WO2008011352A2 (fr) * 2006-07-16 2008-01-24 The Jim Henson Company Système et procédé pour animer un personnage via les performances d'une seule personne
US8339402B2 (en) * 2006-07-16 2012-12-25 The Jim Henson Company System and method of producing an animated performance utilizing multiple cameras
US7880770B2 (en) * 2006-07-28 2011-02-01 Accelerated Pictures, Inc. Camera control
WO2008014487A2 (fr) * 2006-07-28 2008-01-31 Accelerated Pictures, Inc. Organisation de scènes lors d'un tournage assisté par ordinateur
WO2008042425A2 (fr) * 2006-10-03 2008-04-10 Wms Gaming Inc. Moteur physique partagé dans un système de jeu de mises
KR20090002292A (ko) * 2007-06-26 2009-01-09 삼성전자주식회사 가상 캐릭터를 동기화하고 공유하는 장치 및 방법
US8443284B2 (en) * 2007-07-19 2013-05-14 Apple Inc. Script-integrated storyboards
CN101354639A (zh) * 2007-07-25 2009-01-28 联想(北京)有限公司 在终端之间操作对象的方法及终端
US20090046097A1 (en) * 2007-08-09 2009-02-19 Scott Barrett Franklin Method of making animated video
US8169438B1 (en) * 2008-03-31 2012-05-01 Pixar Temporally coherent hair deformation
JP4760896B2 (ja) * 2008-11-04 2011-08-31 ソニー株式会社 カメラ制御装置及びカメラ制御方法
US8704832B2 (en) * 2008-09-20 2014-04-22 Mixamo, Inc. Interactive design, synthesis and delivery of 3D character motion data through the web
US20100073379A1 (en) * 2008-09-24 2010-03-25 Sadan Eray Berger Method and system for rendering real-time sprites
US8749556B2 (en) * 2008-10-14 2014-06-10 Mixamo, Inc. Data compression for real-time streaming of deformable 3D models for 3D animation
US20100110081A1 (en) * 2008-10-30 2010-05-06 Microsoft Corporation Software-aided creation of animated stories
US8982122B2 (en) 2008-11-24 2015-03-17 Mixamo, Inc. Real time concurrent design of shape, texture, and motion for 3D character animation
US8659596B2 (en) * 2008-11-24 2014-02-25 Mixamo, Inc. Real time generation of animation-ready 3D character models
KR101588666B1 (ko) * 2008-12-08 2016-01-27 삼성전자주식회사 디스플레이 장치 및 그의 표시방법
US8698898B2 (en) * 2008-12-11 2014-04-15 Lucasfilm Entertainment Company Ltd. Controlling robotic motion of camera
US20100259547A1 (en) 2009-02-12 2010-10-14 Mixamo, Inc. Web platform for interactive design, synthesis and delivery of 3d character motion data
US20100231582A1 (en) * 2009-03-10 2010-09-16 Yogurt Bilgi Teknolojileri A.S. Method and system for distributing animation sequences of 3d objects
WO2010129721A2 (fr) * 2009-05-05 2010-11-11 Mixamo, Inc. Capture distribuée de mouvement sans marqueur
US20120188333A1 (en) * 2009-05-27 2012-07-26 The Ohio State University Spherical view point controller and method for navigating a network of sensors
KR20100138700A (ko) * 2009-06-25 2010-12-31 삼성전자주식회사 가상 세계 처리 장치 및 방법
US8527217B2 (en) * 2009-09-08 2013-09-03 Dynamic Athletic Research Institute, Llc Apparatus and method for physical evaluation
US20110081959A1 (en) * 2009-10-01 2011-04-07 Wms Gaming, Inc. Representing physical state in gaming systems
US8928672B2 (en) 2010-04-28 2015-01-06 Mixamo, Inc. Real-time automatic concatenation of 3D animation sequences
EP2593197A2 (fr) * 2010-07-14 2013-05-22 University Court Of The University Of Abertay Dundee Améliorations relatives à une visualisation d'environnements générés par ordinateur en temps réel
US8797328B2 (en) * 2010-07-23 2014-08-05 Mixamo, Inc. Automatic generation of 3D character animation from 3D meshes
US9652201B2 (en) * 2010-10-01 2017-05-16 Adobe Systems Incorporated Methods and systems for physically-based runtime effects
US20130242096A1 (en) * 2010-11-24 2013-09-19 Aquadownunder Pty Ltd. Apparatus and method for environmental monitoring
US8922547B2 (en) * 2010-12-22 2014-12-30 Electronics And Telecommunications Research Institute 3D model shape transformation method and apparatus
EP2680931A4 (fr) 2011-03-04 2015-12-02 Eski Inc Dispositifs et procédés pour fournir une manifestation répartie dans un environnement
US8540438B1 (en) 2011-03-24 2013-09-24 CamMate Systems. Inc. Systems and methods for positioning a camera crane
US8333520B1 (en) 2011-03-24 2012-12-18 CamMate Systems, Inc. Systems and methods for detecting an imbalance of a camera crane
US9724600B2 (en) * 2011-06-06 2017-08-08 Microsoft Technology Licensing, Llc Controlling objects in a virtual environment
EP2538330A1 (fr) * 2011-06-21 2012-12-26 Unified Computing Limited Procédé de rendu de fichier de scène dans une ferme de rendu de type nuage
US10049482B2 (en) 2011-07-22 2018-08-14 Adobe Systems Incorporated Systems and methods for animation recommendations
US10748325B2 (en) 2011-11-17 2020-08-18 Adobe Inc. System and method for automatic rigging of three dimensional characters for facial animation
US9044857B2 (en) * 2012-02-14 2015-06-02 Jerry Neal Sommerville Control system that guides a robot or articulated device with a laser distance meter for 3D motion, or guides a robot or articulated device with a computer pointing device (such as a mouse) for 2D motion
US9747495B2 (en) 2012-03-06 2017-08-29 Adobe Systems Incorporated Systems and methods for creating and distributing modifiable animated video messages
US20130318424A1 (en) * 2012-05-28 2013-11-28 Ian A. R. Boyd System and method for the creation of an e-enhanced multi-dimensional pictostory using pictooverlay technology
US9659397B2 (en) * 2013-01-11 2017-05-23 Disney Enterprises, Inc. Rig-based physics simulation
US9892539B2 (en) 2013-01-11 2018-02-13 Disney Enterprises, Inc. Fast rig-based physics simulation
US9130492B2 (en) * 2013-04-22 2015-09-08 Thermadyne, Inc. Animatronic system with unlimited axes
US20150269855A1 (en) * 2014-03-20 2015-09-24 Digizyme, Inc. Systems and methods for interacting with a visual cell
WO2015143303A1 (fr) 2014-03-20 2015-09-24 Digizyme, Inc. Systèmes et procédés destinés à la fourniture d'un produit de visualisation
US20150269870A1 (en) * 2014-03-20 2015-09-24 Digizyme, Inc. Visual cell
JP6385725B2 (ja) * 2014-06-06 2018-09-05 任天堂株式会社 情報処理システム及び情報処理プログラム
US10073488B2 (en) 2014-09-11 2018-09-11 Grayhill, Inc. Multifunction joystick apparatus and a method for using same
CN108702425B (zh) 2016-02-25 2021-04-13 Kddi株式会社 设备控制装置、通信终端、设备控制方法、对价计算方法以及设备控制系统
US10025550B2 (en) * 2016-03-15 2018-07-17 Intel Corporation Fast keyboard for screen mirroring
US10559111B2 (en) 2016-06-23 2020-02-11 LoomAi, Inc. Systems and methods for generating computer ready animation models of a human head from captured data images
EP3475920A4 (fr) 2016-06-23 2020-01-15 Loomai, Inc. Systèmes et procédés pour générer des modèles d'animation adaptés à l'ordinateur d'une tête humaine à partir d'images de données capturées
WO2018027191A1 (fr) 2016-08-05 2018-02-08 MotoCrane, LLC Support amovible pour caméra de véhicule
WO2018045446A1 (fr) 2016-09-07 2018-03-15 Eski Inc. Systèmes de projection pour manifestation répartie et procédés associés
US10847330B2 (en) 2017-10-06 2020-11-24 Grayhill, Inc. No/low-wear bearing arrangement for a knob system
US10795494B2 (en) 2018-01-03 2020-10-06 Grayhill, Inc. Touch encoder, touch panel, and input method editor with integrated development environment and methods thereof
US10198845B1 (en) 2018-05-29 2019-02-05 LoomAi, Inc. Methods and systems for animating facial expressions
JP7341685B2 (ja) * 2019-03-19 2023-09-11 キヤノン株式会社 電子機器、電子機器の制御方法、プログラム、及び、記憶媒体
US11551393B2 (en) 2019-07-23 2023-01-10 LoomAi, Inc. Systems and methods for animation generation
CN110941216B (zh) * 2019-11-25 2021-03-12 江苏徐工工程机械研究院有限公司 无线急停系统和方法
US20220002128A1 (en) * 2020-04-09 2022-01-06 Chapman/Leonard Studio Equipment, Inc. Telescoping electric camera crane

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6084590A (en) * 1997-04-07 2000-07-04 Synapix, Inc. Media production with correlation of image stream and abstract objects in a three-dimensional virtual stage
US6377257B1 (en) * 1999-10-04 2002-04-23 International Business Machines Corporation Methods and apparatus for delivering 3D graphics in a networked environment
US6559845B1 (en) * 1999-06-11 2003-05-06 Pulse Entertainment Three dimensional animation system and method
US20040167924A1 (en) * 2003-02-21 2004-08-26 Canon Kabushiki Kaisha Information processing method, information processing apparatus, and distributed processing system
US20040179013A1 (en) * 2003-03-13 2004-09-16 Sony Corporation System and method for animating a digital facial model
US20040181548A1 (en) * 2003-03-12 2004-09-16 Thomas Mark Ivan Digital asset server and asset management system

Family Cites Families (73)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB1217419A (en) 1968-02-09 1970-12-31 Euchar Nehmann Instructional optical kit
EP0331265B1 (fr) * 1988-03-01 1995-08-23 Hitachi Construction Machinery Co., Ltd. Dispositif de commande de position/force pour machine à usiner avec des degrés de liberté multiples
US5091849A (en) 1988-10-24 1992-02-25 The Walt Disney Company Computer image production system utilizing first and second networks for separately transferring control information and digital image data
GB2229058B (en) * 1989-02-07 1993-12-08 Furuno Electric Co Detection system
US5268996A (en) * 1990-12-20 1993-12-07 General Electric Company Computer image generation method for determination of total pixel illumination due to plural light sources
US5764276A (en) * 1991-05-13 1998-06-09 Interactive Pictures Corporation Method and apparatus for providing perceived video viewing experiences using still images
US5658238A (en) 1992-02-25 1997-08-19 Olympus Optical Co., Ltd. Endoscope apparatus capable of being switched to a mode in which a curvature operating lever is returned and to a mode in which the curvature operating lever is not returned
US5921659A (en) * 1993-06-18 1999-07-13 Light & Sound Design, Ltd. Stage lighting lamp unit and stage lighting system including such unit
US5617515A (en) 1994-07-11 1997-04-01 Dynetics, Inc. Method and apparatus for controlling and programming a robot or other moveable object
JP3262465B2 (ja) * 1994-11-17 2002-03-04 シャープ株式会社 スケジュール管理装置
US6199082B1 (en) * 1995-07-17 2001-03-06 Microsoft Corporation Method for delivering separate design and content in a multimedia publishing system
US6219045B1 (en) 1995-11-13 2001-04-17 Worlds, Inc. Scalable virtual world chat client-server system
US5790124A (en) 1995-11-20 1998-08-04 Silicon Graphics, Inc. System and method for allowing a performer to control and interact with an on-stage display device
IL126142A0 (en) 1996-03-15 1999-05-09 Zapa Digital Art Ltd Programmable computer graphic objects
US5852435A (en) * 1996-04-12 1998-12-22 Avid Technology, Inc. Digital multimedia editing and data management system
US5909218A (en) 1996-04-25 1999-06-01 Matsushita Electric Industrial Co., Ltd. Transmitter-receiver of three-dimensional skeleton structure motions and method thereof
CN1146805C (zh) * 1996-04-25 2004-04-21 松下电器产业株式会社 通信型计算机图象动画方法
US5752244A (en) * 1996-07-15 1998-05-12 Andersen Consulting Llp Computerized multimedia asset management system
EP0959444A4 (fr) 1996-08-14 2005-12-07 Nurakhmed Nurislamovic Latypov Procede de suivi et de representation de la position et de l'orientation d'un sujet dans l'espace, procede de presentation d'un espace virtuel a ce sujet, et systemes de mise en oeuvre de ces procedes
US5886702A (en) 1996-10-16 1999-03-23 Real-Time Geometry Corporation System and method for computer modeling of 3D objects or surfaces by mesh constructions having optimal quality characteristics and dynamic resolution capabilities
EP0851207B1 (fr) * 1996-12-31 2004-05-12 DATALOGIC S.p.A. Procédé pour la mesure du volume d'un objet par un balayeur laser et un détecteur CCD
US6058397A (en) 1997-04-08 2000-05-02 Mitsubishi Electric Information Technology Center America, Inc. 3D virtual environment creation management and delivery system
US6463444B1 (en) * 1997-08-14 2002-10-08 Virage, Inc. Video cataloger system with extensibility
US6766946B2 (en) 1997-10-16 2004-07-27 Dentsu, Inc. System for granting permission of user's personal information to third party
US6546405B2 (en) * 1997-10-23 2003-04-08 Microsoft Corporation Annotating temporally-dimensioned multimedia content
US6268864B1 (en) * 1998-06-11 2001-07-31 Presenter.Com, Inc. Linking a video and an animation
US6278466B1 (en) * 1998-06-11 2001-08-21 Presenter.Com, Inc. Creating animation from a video
EP0973129A3 (fr) * 1998-07-17 2005-01-12 Matsushita Electric Industrial Co., Ltd. Système de compression de données d'images mobiles
US6697869B1 (en) 1998-08-24 2004-02-24 Koninklijke Philips Electronics N.V. Emulation of streaming over the internet in a broadcast application
US6313833B1 (en) * 1998-10-16 2001-11-06 Prophet Financial Systems Graphical data collection and retrieval interface
CA2345954C (fr) * 1998-10-21 2004-12-21 Omron Corporation Detecteur de mines et appareil d'inspection
US6222551B1 (en) * 1999-01-13 2001-04-24 International Business Machines Corporation Methods and apparatus for providing 3D viewpoint selection in a server/client arrangement
JP4006873B2 (ja) * 1999-03-11 2007-11-14 ソニー株式会社 情報処理システム、情報処理方法及び装置、並びに情報提供媒体
US6538651B1 (en) 1999-03-19 2003-03-25 John Hayman Parametric geometric element definition and generation system and method
US6947044B1 (en) 1999-05-21 2005-09-20 Kulas Charles J Creation and playback of computer-generated productions using script-controlled rendering engines
US6738065B1 (en) 1999-08-10 2004-05-18 Oshri Even-Zohar Customizable animation system
DE19958443C2 (de) * 1999-12-03 2002-04-25 Siemens Ag Bedieneinrichtung
US7012627B1 (en) * 1999-12-28 2006-03-14 International Business Machines Corporation System and method for presentation of room navigation
US6741252B2 (en) * 2000-02-17 2004-05-25 Matsushita Electric Industrial Co., Ltd. Animation data compression apparatus, animation data compression method, network server, and program storage media
US6714200B1 (en) * 2000-03-06 2004-03-30 Microsoft Corporation Method and system for efficiently streaming 3D animation across a wide area network
US6760010B1 (en) * 2000-03-15 2004-07-06 Figaro Systems, Inc. Wireless electronic libretto display apparatus and method
US20020138843A1 (en) 2000-05-19 2002-09-26 Andrew Samaan Video distribution method and system
US6943794B2 (en) * 2000-06-13 2005-09-13 Minolta Co., Ltd. Communication system and communication method using animation and server as well as terminal device used therefor
US20020024517A1 (en) * 2000-07-14 2002-02-28 Komatsu Ltd. Apparatus and method for three-dimensional image production and presenting real objects in virtual three-dimensional space
KR100736472B1 (ko) * 2000-11-14 2007-07-06 지멘스 악티엔게젤샤프트 차량 내부의 탑승 유무를 결정하는 방법 및 장치
US6646643B2 (en) 2001-01-05 2003-11-11 The United States Of America As Represented By The Secretary Of The Navy User control of simulated locomotion
US6966837B1 (en) * 2001-05-10 2005-11-22 Best Robert M Linked portable and video game systems
WO2002093497A1 (fr) 2001-05-14 2002-11-21 Netdimension Corporation Systeme de distribution d'informations et procede de distribution d'informations
US7423666B2 (en) * 2001-05-25 2008-09-09 Minolta Co., Ltd. Image pickup system employing a three-dimensional reference object
US7259747B2 (en) 2001-06-05 2007-08-21 Reactrix Systems, Inc. Interactive video display system
US7274380B2 (en) 2001-10-04 2007-09-25 Siemens Corporate Research, Inc. Augmented reality system
US20030195853A1 (en) * 2002-03-25 2003-10-16 Mitchell Cyndi L. Interaction system and method
US6898484B2 (en) * 2002-05-01 2005-05-24 Dorothy Lemelson Robotic manufacturing and assembly with relative radio positioning using radio based location determination
US7246322B2 (en) * 2002-07-09 2007-07-17 Kaleidescope, Inc. Grid-like guided user interface for video selection and display
US6822653B2 (en) * 2002-06-28 2004-11-23 Microsoft Corporation Methods and system for general skinning via hardware accelerators
US7004914B2 (en) * 2002-08-26 2006-02-28 Kensey Nash Corporation Crimp and cut tool for sealing and unsealing guide wires and tubular instruments
US20040138959A1 (en) 2002-09-09 2004-07-15 Michal Hlavac Artificial intelligence platform
US20040061781A1 (en) * 2002-09-17 2004-04-01 Eastman Kodak Company Method of digital video surveillance utilizing threshold detection and coordinate tracking
US7305112B2 (en) * 2002-10-15 2007-12-04 The Scripps Research Institute Method of converting rare cell scanner image coordinates to microscope coordinates using reticle marks on a sample media
US20040114786A1 (en) * 2002-12-06 2004-06-17 Cross Match Technologies, Inc. System and method for capturing print information using a coordinate conversion method
KR100507780B1 (ko) * 2002-12-20 2005-08-17 한국전자통신연구원 고속 마커프리 모션 캡쳐 장치 및 방법
US7426423B2 (en) * 2003-05-30 2008-09-16 Liebherr-Werk Nenzing—GmbH Crane or excavator for handling a cable-suspended load provided with optimised motion guidance
EP1638340B1 (fr) * 2003-06-23 2013-08-07 Sony Corporation Procede et dispositif de traitement d'images, et programme associe
KR20050000276A (ko) * 2003-06-24 2005-01-03 주식회사 성진씨앤씨 감시 카메라 제어용 가상 조이스틱 시스템 및 제어 방법
US20060036162A1 (en) * 2004-02-02 2006-02-16 Ramin Shahidi Method and apparatus for guiding a medical instrument to a subsurface target site in a patient
US7372463B2 (en) 2004-04-09 2008-05-13 Paul Vivek Anand Method and system for intelligent scalable animation with intelligent parallel processing engine and intelligent animation engine
US20050248577A1 (en) 2004-05-07 2005-11-10 Valve Corporation Method for separately blending low frequency and high frequency information for animation of a character in a virtual environment
US20060022983A1 (en) * 2004-07-27 2006-02-02 Alias Systems Corp. Processing three-dimensional data
JP5043844B2 (ja) 2004-08-30 2012-10-10 トレース オプティック テクノロジーズ ピーティーワイ リミテッド カメラ制御の方法と装置
US7266425B2 (en) * 2004-09-30 2007-09-04 Rockwell Automation Technologies, Inc. Systems and methods that facilitate motion control through coordinate system transformations
WO2006050197A2 (fr) 2004-10-28 2006-05-11 Accelerated Pictures, Llc Dispositif de commande de camera et d'animation, systemes et procedes associes
WO2008014487A2 (fr) 2006-07-28 2008-01-31 Accelerated Pictures, Inc. Organisation de scènes lors d'un tournage assisté par ordinateur
US7880770B2 (en) 2006-07-28 2011-02-01 Accelerated Pictures, Inc. Camera control

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6084590A (en) * 1997-04-07 2000-07-04 Synapix, Inc. Media production with correlation of image stream and abstract objects in a three-dimensional virtual stage
US6559845B1 (en) * 1999-06-11 2003-05-06 Pulse Entertainment Three dimensional animation system and method
US6377257B1 (en) * 1999-10-04 2002-04-23 International Business Machines Corporation Methods and apparatus for delivering 3D graphics in a networked environment
US20040167924A1 (en) * 2003-02-21 2004-08-26 Canon Kabushiki Kaisha Information processing method, information processing apparatus, and distributed processing system
US20040181548A1 (en) * 2003-03-12 2004-09-16 Thomas Mark Ivan Digital asset server and asset management system
US20040179013A1 (en) * 2003-03-13 2004-09-16 Sony Corporation System and method for animating a digital facial model

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
SCHNECK, P.B.: 'Persistent access control to prevent piracy of digital information' PROCEEDINGS OF THE IEEE vol. 87, no. 7, July 1999, pages 1239 - 1250 *

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI448999B (zh) * 2010-10-13 2014-08-11 Univ Nat Cheng Kung 塗鴉生命化方法及使用該方法之系統

Also Published As

Publication number Publication date
US7433760B2 (en) 2008-10-07
US20060109274A1 (en) 2006-05-25
WO2006050198A3 (fr) 2009-04-16
US20080312770A1 (en) 2008-12-18
WO2006050197A3 (fr) 2007-12-21
WO2006050197A2 (fr) 2006-05-11
US20060106494A1 (en) 2006-05-18

Similar Documents

Publication Publication Date Title
US20060109274A1 (en) Client/server-based animation software, systems and methods
US7116330B2 (en) Approximating motion using a three-dimensional model
Latoschik et al. FakeMi: A fake mirror system for avatar embodiment studies
US7460130B2 (en) Method and system for generation, storage and distribution of omni-directional object views
Wei et al. Modeling hair from multiple views
US11024098B1 (en) Augmenting a physical object with virtual components
Hauswiesner et al. Free viewpoint virtual try-on with commodity depth cameras
EP3980974A1 (fr) Animation de corps en temps réel basée sur une image unique
Dutreve et al. Feature points based facial animation retargeting
Ponton et al. Combining Motion Matching and Orientation Prediction to Animate Avatars for Consumer‐Grade VR Devices
Nishino et al. 3d object modeling using spatial and pictographic gestures
Lam et al. Human-avatar interaction in metaverse: Framework for full-body interaction
WO2001063560A1 (fr) Creation d'un avatar de jeu en trois dimensions en utilisant des caracteristiques physiques
Balcisoy et al. Interaction between real and virtual humans in augmented reality
Li et al. Collaborative distributed virtual sculpting
KR101859318B1 (ko) 360˚ 가상카메라를 활용한 영상콘텐츠 제작 방법
US11450054B2 (en) Method for operating a character rig in an image-generation system using constraints on reference nodes
US11074738B1 (en) System for creating animations using component stress indication
JP7459199B1 (ja) 画像処理システム
US9128516B1 (en) Computer-generated imagery using hierarchical models and rigging
Magnenat-Thalmann et al. Applications of interactive virtual humans in mobile augmented reality
Huynh Development of a standardized framework for cost-effective communication system based on 3D data streaming and real-time 3D reconstruction
Lai et al. Extra detail addition based on existing texture for animated news production
Mashalkar et al. Creating Personalized Avatars
De Aguiar Animation and performance capture using digitized models

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A2

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KM KN KP KR KZ LC LK LR LS LT LU LV LY MA MD MG MK MN MW MX MZ NA NG NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SM SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW

AL Designated countries for regional patents

Kind code of ref document: A2

Designated state(s): BW GH GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LT LU LV MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 05815396

Country of ref document: EP

Kind code of ref document: A2