CN101952879A - Methods and apparatus for designing animatronics units from articulated computer generated characters - Google Patents

Methods and apparatus for designing animatronics units from articulated computer generated characters Download PDF

Info

Publication number
CN101952879A
Authority
CN
China
Prior art keywords
electronic cartoon
response
unit
cartoon unit
processor
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN2008801267633A
Other languages
Chinese (zh)
Inventor
J. Anderson
R. Cook
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
PI KESA
Pixar
Original Assignee
PI KESA
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US11/958,239 external-priority patent/US8232998B2/en
Priority claimed from US11/958,233 external-priority patent/US8390629B2/en
Application filed by PI KESA filed Critical PI KESA
Publication of CN101952879A publication Critical patent/CN101952879A/en
Pending legal-status Critical Current

Links

Images

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1656Programme controls characterised by programming, planning systems for manipulators
    • B25J9/1671Programme controls characterised by programming, planning systems for manipulators characterised by simulation, either to verify existing program or to create and verify new program, CAD/CAM oriented, graphic oriented programming systems
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F30/00Computer-aided design [CAD]
    • G06F30/20Design optimisation, verification or simulation

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Geometry (AREA)
  • General Engineering & Computer Science (AREA)
  • Evolutionary Computation (AREA)
  • Computer Hardware Design (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Processing Or Creating Images (AREA)
  • Toys (AREA)
  • Numerical Control (AREA)

Abstract

A method for specifying a design for an animatronics unit includes receiving motion data comprising artistically determined motions, determining a design for construction of at least a portion of the animatronics unit in response to the motion data, and outputting the design for construction of the animatronics unit.

Description

Methods and apparatus for designing animatronics units from articulated computer generated characters
Background of the Invention
The present invention relates to animation. More specifically, the present invention relates to methods and apparatus for designing animatronics units based upon articulated computer-generated characters.
Throughout the years, filmmakers have often tried to tell stories involving imaginary creatures, faraway places, and fantastic things. To do so, they have often relied upon animation techniques to bring the imaginary to "life." Two of the major paths in animation have traditionally included drawing-based animation techniques and stop-motion animation techniques.
Drawing-based animation techniques were refined in the twentieth century by movie makers such as Walt Disney and used in movies such as "Snow White and the Seven Dwarfs" (1937) and "Fantasia" (1940). This animation technique typically requires artists to hand-draw (or paint) animated images onto a transparent medium or cels. After painting, each cel would then be captured or recorded onto film as one or more frames in a movie.
Stop-motion animation techniques typically require the construction of miniature sets, props, and characters. The filmmakers would construct the sets, add props, and position the miniature characters in a pose. After the animator was happy with how everything was arranged, one or more frames of film would be taken of that specific arrangement. Stop-motion animation techniques were developed by movie makers such as Willis O'Brien for movies such as "King Kong" (1933). Subsequently, these techniques were refined by animators such as Ray Harryhausen for movies including "Mighty Joe Young" (1948) and "Clash Of The Titans" (1981).
With the widespread availability of computers in the latter half of the twentieth century, animators began to rely upon computers to assist in the animation process. This included using computers to facilitate drawing-based animation, for example, by painting images, by generating in-between images ("tweening"), and the like. This also included using computers to augment stop-motion animation techniques. For example, physical models could be represented by, and manipulated as, virtual models in computer memory.
One of the pioneering companies in the computer-generated imagery (CGI) field is Pixar, more widely known as Pixar Animation Studios, the creator of animated features such as "Toy Story" (1995) and "Toy Story 2" (1999), "A Bugs Life" (1998), "Monsters, Inc." (2001), "Finding Nemo" (2003), "The Incredibles" (2004), "Cars" (2006), and "Ratatouille" (2007). In addition to creating animated features, Pixar developed computing platforms specially designed for CGI, and CGI software now known as RenderMan®. RenderMan® software includes a "rendering engine" that "renders," or converts, geometric and/or mathematical descriptions of animated objects or characters into two-dimensional images. RenderMan® has been well received in the animation industry and has received two Academy Awards®.
The present inventors now wish to go beyond two-dimensional images and to extend the reach of their animated characters into three dimensions (e.g., the physical world). To do this, the inventors have proposed methods for constructing and controlling physical versions (e.g., electrically, mechanically, pneumatically and/or hydraulically controlled devices) of the animated characters that appear in various features.
The use of physical, mechanical devices for live entertainment purposes was pioneered by The Walt Disney Company and is now known in the industry as "animatronics." Animatronic characters or units have previously been used in well-known theme park attractions (e.g., shows, rides), such as the Pirates of the Caribbean, the Enchanted Tiki Room, Great Moments with Mr. Lincoln, and the like.
A problem recognized by the inventors is that, because animatronics units are used for specific purposes, off-the-shelf hardware cannot simply be purchased; instead, the units must usually be custom built. A principal challenge recognized by the inventors is how to specify and build animatronics units that can move in ways audiences will accept. More specifically, the problem is how to build and control animatronic characters that can more faithfully reproduce the animation of characters currently seen in features, films, shorts, and the like.
It is believed that animation data from computer-generated (CG) characters has never been used to help specify the construction of animatronic characters. It is further believed that animation data from CG characters has never been used to control such animatronic characters.
In light of the above, what is desired are methods and apparatus that address the challenges described above.
Summary of the Invention
The present invention relates to animatronics. More specifically, the present invention relates to methods and apparatus for designing animatronic characters or units from motion data, such as motions specified by computer-generated character animation data, motions specified using motion capture techniques, motions of facial features specified by motion capture, motions specified by animation data, and the like.
According to one aspect of the present invention, a method for specifying a design for an animatronics unit is disclosed. One technique includes receiving motion data comprising artistically determined motions, and determining, in response to the motion data, a design for construction of at least a portion of the animatronics unit. A process may include outputting the design for construction of the animatronics unit.
According to another aspect of the present invention, animatronics units formed according to the various methods described herein are disclosed.
According to another aspect of the present invention, a computer system is described. One apparatus includes a memory configured to store motion data comprising artistically determined motions. One device includes a processor coupled to the memory, wherein the processor is configured to determine, in response to the motion data, a design for construction of at least a portion of the animatronics unit, and wherein the processor is configured to output the design for construction of the animatronics unit.
According to another aspect of the present invention, a computer program product residing on a tangible medium includes executable code executable on a computer system, wherein the computer system includes a processor and a memory. The computer program product may include code configured to direct the processor to retrieve motion data from the memory, the motion data comprising artistically determined motions, and code configured to direct the processor to determine, in response to the motion data, a design for construction of at least a portion of an animatronics unit. The computer program product may also include code configured to direct the processor to output the design for construction of the animatronics unit. The code may reside on a computer-readable tangible medium such as an optical medium (DVD, HD DVD, Blu-ray DVD, holographic media, etc.), a magnetic medium (hard disk drive, floppy disk, etc.), or a semiconductor medium (flash memory, RAM, ROM, etc.).
According to another aspect of the present invention, a method for specifying a design for an animatronics unit is described. Various operations include receiving a design for construction of at least a portion of the animatronics unit, and constructing the at least a portion of the animatronics unit in response to the design for construction of the at least a portion of the animatronics unit. In various embodiments, the design for construction of the at least a portion of the animatronics unit is determined in response to motion data comprising artistically determined motions.
In various embodiments, the motion data may be derived from motion specified by animators, for example for facial animation, character animation, object animation, and the like. The motion data may also be derived from physical performance data, for example facial performance data of an actor with markers on the face, physical performance data of an actor with markers on the body, and the like. In various embodiments, the motion data may be specified as one or more poses of, for example, an animated object or an actor with respect to time.
Unlike traditional industrial robots, the motions of animatronics units are based upon artistically determined performance data (e.g., animation data, physical actor movements). In contrast, the motions of industrial robots are "industrial," and are not directed to artistic expression or performance. In various embodiments, animators are artists schooled in the field of human motion and expression. These animators may then use their artistic skill and judgment to set the poses (animation data) of a CG character at points in time termed "key frames" in order to convey an emotion or an expression. Using a computer, the poses of the CG character at points in time between these "key frames" may be interpolated. The pose of the CG character at each point in time is thus determined, resulting in the "animation" of the CG character seen in animated features and the like. In embodiments of the present invention, animatronics units may be designed and controlled based upon such artistically driven animation data. In other embodiments, the input data may be derived from the physical performance of an actor as the actor moves or gestures. Such physical performances are motion captured, and may be used in a manner similar to the animation data defining the "key frames" described above.
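As a concrete illustration of the key-frame workflow described above, the sketch below interpolates animator-set poses at key frames across the intermediate frames. It is a minimal sketch assuming linear interpolation of generic animation variables (avars); the function and variable names are hypothetical and are not part of the claimed invention.

```python
import numpy as np

def interpolate_avars(keyframes, num_frames):
    """Linearly interpolate animation variables (avars) between key frames.

    keyframes:  dict mapping frame index -> 1-D array of avar values set by
                the animator at that "key frame".
    num_frames: total number of frames to produce.
    Returns an array of shape (num_frames, num_avars).
    """
    key_times = sorted(keyframes)
    key_values = np.array([keyframes[t] for t in key_times])
    frames = np.arange(num_frames)
    result = np.empty((num_frames, key_values.shape[1]))
    # Interpolate each avar independently over time.
    for j in range(key_values.shape[1]):
        result[:, j] = np.interp(frames, key_times, key_values[:, j])
    return result

# Example: two avars (e.g. brow raise, mouth corner) keyed at frames 0, 12, 24.
poses = interpolate_avars({0: np.array([0.0, 0.0]),
                           12: np.array([1.0, 0.3]),
                           24: np.array([0.0, 0.0])}, 25)
```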
In view of the above, it should be understood that the term "animation data," referring to an animated object or character, is merely one example of "motion data." Thus, where appropriate, references in this specification to "animation data" for an animated object or character may also refer to other types of "motion data," such as physical performance data of an actor (e.g., motion capture data) and the like.
Brief Description of the Drawings
In order to more fully understand the present invention, reference is made to the accompanying drawings. It should be understood that the drawings are not to be considered limitations on the scope of the invention; rather, the presently described embodiments and the presently understood best mode of the invention are described with additional detail through use of the accompanying drawings, in which:
Fig. 1 is a block diagram of a typical computer system according to various embodiments of the present invention;
Figs. 2A-2B illustrate block diagrams of a process according to various embodiments of the present invention;
Fig. 3 illustrates a block diagram of an additional process according to various embodiments of the present invention;
Figs. 4A-4B illustrate examples according to various embodiments of the present invention;
Figs. 5A-5C illustrate examples of physical control structures according to various embodiments of the present invention; and
Fig. 6 illustrates a block diagram of a high-level illustration of additional embodiments.
Detailed Description of the Invention
Fig. 1 is a block diagram of a typical computer system 100 according to an embodiment of the present invention.
In the present embodiment, computer system 100 typically includes a display/monitor 110, a computer 120, a keyboard 130, a user input device 140, a computer interface 150, and the like.
In the present embodiment, user input device 140 is typically embodied as a computer mouse, a trackball, a track pad, a joystick, a wireless remote, a drawing tablet, a voice command system, an eye tracking system, and the like. User input device 140 typically allows a user to select objects, icons, text, and the like that appear on the monitor 110 via a command such as a click of a button or the like. In some embodiments, monitor 110 may be an interactive touch screen, such as a Cintiq manufactured by Wacom, or the like. A graphics card 185 typically drives display 110.
Embodiments of computer interface 150 typically include an Ethernet card, a modem (telephone, satellite, cable, ISDN), an (asynchronous) digital subscriber line (DSL) unit, a FireWire interface, a USB interface, and the like. For example, computer interface 150 may be coupled to a computer network, to a FireWire bus, or the like. In other embodiments, computer interface 150 may be physically integrated on the motherboard of computer 120, and may be a software program, such as soft DSL, or the like.
In various embodiments, computer 120 typically includes familiar computer components such as a processor 160, memory storage devices such as a random access memory (RAM) 170, disk drives 180, and a system bus 190 interconnecting the above components.
In one embodiment, computer 120 includes one or more Xeon microprocessors from Intel. Further, in the present embodiment, computer 120 typically includes a UNIX-based operating system.
RAM 170 and disk drive 180 are examples of tangible media configured to store data such as animation data, animation timing sheets, animation environments, animatronics unit design structures, embodiments of mathematical algorithms, simulations of animatronics units, image files, software models including geometric descriptions of objects, ordered geometric descriptions of objects, procedural descriptions of models, scene descriptor files, rendering engines, and embodiments of the present invention, including computer-executable code, human-readable code, or the like. Other types of computer-readable tangible media include magnetic storage media such as floppy disks, networked hard disks, or removable hard disks; optical storage media such as CD-ROMs, DVDs, holographic memories, and bar codes; semiconductor memories such as flash memories and read-only memories (ROMs); battery-backed volatile memories; networked storage devices; and the like.
In the present embodiment, computer system 100 may also include software that enables communications over a network, such as HTTP, TCP/IP, RTP/RTSP protocols, and the like. In alternative embodiments of the present invention, other communications software and transfer protocols may also be used, for example IPX, UDP, or the like.
Fig. 1 is representative of a computer system capable of embodying various aspects of the present invention. It will be readily apparent to one of ordinary skill in the art that many other hardware and software configurations are suitable for use with the present invention. For example, the computer may be a desktop, portable, rack-mounted, or tablet configuration. Additionally, the computer may be a series of networked computers. Further, the use of other microprocessors is contemplated, such as Xeon™, Pentium™ or Core™ microprocessors; Turion™ 64, Opteron™ or Athlon™ microprocessors from Advanced Micro Devices, Inc.; and the like. Further, other types of operating systems are contemplated, such as Windows®, Windows XP®, Windows NT®, or the like from Microsoft Corporation; Solaris from Sun Microsystems; LINUX; UNIX; and the like. In still other embodiments, the techniques described above may be implemented upon a chip or an auxiliary processing board.
Figs. 2A-2B illustrate block diagrams of processes according to various embodiments of the present invention. More specifically, Figs. 2A-2B provide a high-level illustration of embodiments of determining how to construct a physical model of an object.
Initially, in step 200, artistically determined motion data (sometimes associated with a software model of an object) is provided to computer system 100. In some embodiments of the present invention, the animation data may be the same animation data used to animate the software model of the object for a movie, feature, short, or the like. In various embodiments, the animation data may represent a specified range of desired motions for the physical model of the object (e.g., a "training set," "model motions" (i.e., motions characteristic of the object), or the like). As an example, in a movie an animated character may wink both eyes; however, it may be desired that the physical model of the animated character wink only her left eye. Accordingly, the animation data (e.g., the "training set," the "model motions") provided in this step may include only the animated character winking the left eye.
In embodiments of the present invention, the animation data is typically specified for a specific amount of time. For example, the animation data may be associated with a specific scene in a feature (e.g., a movie, a short), a specific series of scenes, a specific time interval, or the like. In various embodiments, the amount of time may be freely determined by the user. In other embodiments, the amount of time may be specified by other users. Once the time period has been specified, the range of animation data can readily be determined.
In various embodiments, the animation data may be specified at a rate of approximately twenty-four (24) frames per second (a typical film frame rate), approximately twenty-five (25) or thirty (30) frames per second (typical television frame rates), approximately sixty (60) frames per second (an HD frame rate), or the like. In such embodiments, animation data values may be specified at such data rates.
In various embodiments, the animation data may take a number of different forms. In some embodiments, the animation data includes three-dimensional coordinates of positions of the animated object. As an example, the animated object to be animated may be a face. In this example, the animation data may represent the three-dimensional coordinates (e.g., (x, y, z) coordinates) of surface positions on the face with respect to time in a static coordinate system. In other embodiments, the three-dimensional coordinates may be offsets, with respect to time, from a "rest" or default pose of the face. For example, for a point on the tip of the nose, the (x, y, z) values may be (-0.1, 0.2, -0.1), representing that the nose moves -0.1 from the base pose in the x direction, e.g., to the left; 0.2 in the y direction, e.g., upwards; and -0.1 in the z direction, e.g., flattens. In various embodiments, the positions may be positions on the surface of the animated object, positions within the animated object, combinations of such positions, or the like.
Mathematically, in some embodiments, the animation data may be represented as follows:
$$\{X_{i,t},\, Y_{i,t},\, Z_{i,t}\},\quad i = 1,\ldots,m,\;\; t = 1,\ldots,T$$
In various embodiments, "m" represents the number of surface positions provided in the animation data, and "T" represents the number of frames of animation data. For example, m may be on the order of 100-1000 data points, 1000-3000 data points, or more; and T may be on the order of tens of frames to thousands of frames, for example 25-100 frames, 100-1000 frames, 1000-4000 frames, or more.
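A minimal sketch of one way the surface-position data above could be held in memory is given below, assuming the positions are stored as offsets from a rest pose in a NumPy array; the array layout, sizes, and names are illustrative assumptions rather than a required representation.

```python
import numpy as np

m = 500    # number of tracked surface positions (e.g. points on a face)
T = 240    # number of frames of animation data (e.g. 10 s at 24 fps)

# offsets[t, i] = (x, y, z) displacement of surface point i from the
# rest pose at frame t, i.e. {X_{i,t}, Y_{i,t}, Z_{i,t}}.
offsets = np.zeros((T, m, 3))

# Example: at frame 10 a tip-of-nose point (index 42, hypothetical)
# moves -0.1 in x (left), 0.2 in y (up), -0.1 in z (flattens).
offsets[10, 42] = (-0.1, 0.2, -0.1)

# Flatten to a (T, 3*m) observation matrix for the mode analysis below.
X = offsets.reshape(T, -1)
```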
In various embodiments, the animation data may be received in more abstract forms, such as a cue sheet, an animation timing sheet, or the like. In various embodiments, an animation timing sheet includes data values of animation variables (avars) for each frame included in the time period. In other embodiments, the animation data may include spline data associated with the avars, may include key frame values specified for the avars, or the like. In embodiments of the present invention, in some cases where the animation data is in a more abstract form, it is contemplated that absolute or relative position values of points of the animated object are determined in step 200.
As mentioned above, in some embodiments of the present invention, the animation data may be associated with an entire animated character or object, or with a portion of the animated character or object, such as the face of the character, or the like.
In other embodiments of step 200, the artistically determined motion data is determined in response to physical performance data. More specifically, the physical performance data may be derived from a motion capture system. For example, an actor or other subject may be associated with a number of visible markers, and these visible markers move in space in response to the actor's gestures or other movements. In various embodiments, the positions of these markers are captured and used to form the motion data. The physical performance data may be of the actor's entire body, or of a portion of the actor, such as the face, or the like.
In embodiments of the present invention, in step 210, in response to the animation data, a decomposition analysis is performed to determine global modes, and weight values for the global modes, for the animation data. In various embodiments, a singular value decomposition process is performed via a principal component analysis (PCA) technique; of course, other analysis techniques may be used in other embodiments. In some embodiments, the principal modes may be termed "eigenvectors" and the mode weight values may be termed "eigenvalues." As is known, using such techniques, the global modes determined are dominant for the data set. In various embodiments, where the animation data includes three-dimensional position data of surface points, the global principal modes include global modes of motion of the surface points of the animated object. An example is illustrated below.
Mathematically, in some embodiments, the decomposition (in one dimension) may be represented as follows:
$$X_{i,t} = \sum_{j=1}^{N} b_{i,j}\, a_{t,j}$$
In various embodiments, "N" represents the number of principal modes, b_{i,j} represent the global modes, and a_{t,j} represent the weight factors.
Next, in various embodiments, in step 220, in response to the global modes and the global-mode weight values, a factor rotation is performed to determine local modes and local-mode weight values. Unlike the global modes, in various embodiments, the local modes specify movements of, for example, surface points of the animated object that are relatively independent of (typically distant from) other surface points. An example is illustrated below.
More generally, the basis functions may be transformed into any linearly independent set spanning the same space, in such a way that the basis functions relate more directly to the control of the animatronics unit. In some embodiments, it is desired to determine a factor rotation matrix R (typically an orthonormal square matrix, so that R^T = R^{-1}), with the localized modes b'_{i,j} defined according to the relation:
$$b_{i,j}\, R = b'_{i,j}$$
Using this embodiment, the formula above becomes:
$$X_{i,t} = \sum_{j=1}^{N} b'_{i,j}\, R^{T}\, a_{t,j}$$
In this formula, b'_{i,j} R^T represents the localized model.
In some embodiments of the present invention, one factor rotation technique that may be implemented is a "varimax" (maximum variance) orthogonal rotation analysis. In other embodiments, other factor rotation techniques may be used. As described, the result of such techniques is that movements of the positions of the animated object specified in the data set are localized according to a given criterion, for example minimum variance or the like.
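A sketch of one common iterative formulation of the "varimax" rotation mentioned above is given below, operating on the global modes from the previous sketch. It is an assumption-laden illustration; other factor rotation techniques may be substituted, and the names are hypothetical.

```python
import numpy as np

def varimax(B, n_iter=100, tol=1e-6):
    """Varimax ('maximum variance') orthogonal rotation of the mode matrix.

    B: array of shape (3*m, N) whose columns are the global modes
       (i.e. b.T from the previous sketch).
    Returns (B_rot, R) with B_rot = B @ R and R orthogonal, so that each
    rotated (local) mode concentrates its loading on a small set of points.
    """
    p, k = B.shape
    R = np.eye(k)
    d = 0.0
    for _ in range(n_iter):
        d_old = d
        L = B @ R
        # Gradient of the varimax criterion (gamma = 1).
        G = B.T @ (L ** 3 - L @ np.diag((L ** 2).sum(axis=0)) / p)
        U, s, Vt = np.linalg.svd(G)
        R = U @ Vt
        d = s.sum()
        if d_old != 0 and d / d_old < 1 + tol:
            break
    return B @ R, R

# Example (names hypothetical):
# b, a = global_modes(X, N=16)
# b_local_T, R = varimax(b.T)     # columns of b_local_T are the local modes
# a_local = a @ R                 # rotated (local-mode) weight values per frame
```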
In various embodiments, in step 230, based upon the identified local modes, the user determines a number of "dominant" local modes. In some embodiments, the number of local modes may be associated with the number of physical control structures to be included in the construction of the physical model of the object. As an example, based upon the local modes, the user may specify a number of separate regions to which control structures are assigned. For example, for an upper arm, the user may select the local mode associated with the biceps region of the object as a dominant local mode.
Next, in step 240, based upon the dominant local modes, the user may specify the construction of the physical model of the object. In various embodiments, the physical control structures may be associated with one or more physical devices that provide the desired movements.
In various embodiments, the physical control structures may be any combination of the following: pulling or pushing motors; stretching or pinching hydraulic, pneumatic, or fluid control systems; rotating or shearing systems; or any combination of the above. As examples, a fluid control system may be used in an animated face to make an eyeball bulge; a motor may pull back the skin around the eyes so that the eyes open; and the like. In light of the present disclosure, it will be readily apparent to one of ordinary skill in the art that many other types of controllable physical structures may be used to obtain the desired movements of surface positions of the physical model of the object. Additional examples are illustrated below.
Fig. 2B illustrates a high-level illustration of embodiments for observing and adjusting the movements of the physical model of the object.
In step 250, based upon the construction specification, the physical model of the animated object (the animatronics unit) may be built. As discussed above, any conventional number of physical components may be used to manipulate (e.g., move, turn, bend, stretch) different portions of the physical model.
In various embodiments, in step 260, the local-mode weight values determined in step 220 that are associated with the dominant local modes are determined and applied, in the form of drive signals, to the corresponding physical control structures. As an example, a dominant local mode may specify that, when smiling, a cheek of an animated face rises and extends outwards from the face. Accordingly, the local mode is multiplied by the weight factor associated with it, and the product is applied to a cheek-raising control structure. As another example, based upon the weight factor, signals are sent to two motors located beneath the "skin" of the animated face. These two motors may then rotate and push in an upwards direction.
In embodiments of the present invention, in step 260, the drive signals may be based upon specified animation data. In some embodiments, the animation data provided in step 200 may be thought of as "training data" that specifies the desired maximum range of motion of the animated object, from which the physical model of the object can be built. In step 260, the specified animation data used may specify a subset range of the animation of the training data. For example, the training animation data may include the character independently winking both eyes, whereas the specified animation data used in step 260 may be the character winking only the left eye.
In various embodiments where specified animation data is provided, the associated weight factors may be modified according to the local modes determined in step 230. Then, in step 260, based upon the associated weight factors, specific drive signals are determined and applied to the physical model of the object (the animatronics unit). In step 270, the user may then observe the controlled motion of the physical model of the object.
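A minimal sketch of the weight-to-drive-signal mapping of step 260 is given below, assuming the rotated per-frame weight values are available as an array (for example, the hypothetical a_local from the earlier sketch). The gain, clipping, and channel assignment are illustrative assumptions; an actual unit would use whatever calibration its motors, hydraulic, or pneumatic systems require.

```python
import numpy as np

def drive_signals(a_local, dominant, gain, limits):
    """Map local-mode weight values onto control-structure drive signals.

    a_local:  array (T, N) of local-mode weight values per frame.
    dominant: indices of the "dominant" local modes chosen in step 230,
              one per physical control structure.
    gain:     array (len(dominant),) converting a unit mode weight into a
              drive command (e.g. motor current, valve opening).
    limits:   (lo, hi) clipping range imposed by the hardware.
    Returns an array (T, len(dominant)) of drive commands per frame.
    """
    commands = a_local[:, dominant] * gain
    return np.clip(commands, *limits)

# Example: three control structures (cheek-raise motor, brow motor, jaw piston).
# signals = drive_signals(a_local, dominant=[0, 3, 7],
#                         gain=np.array([1.5, 2.0, 0.8]), limits=(-1.0, 1.0))
```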
In various embodiments, in step 280, the poses and motions of the physical model of the object are visually compared to the desired poses and motions of the object. For example, the user may see that the physical model of the object does not fully extend or fully retract an arm, does not smile, or the like; or that the physical model does not move with the correct velocity or along the desired path; or the like. In other embodiments, the user (e.g., an animator) may determine whether the motion of the physical model of the object has achieved the desired "look" of the animated character.
In step 285, in response to the visual comparison, the user may adjust the weight factors and/or the drive signals. As discussed above, because the local modes may overlap on the object, and because only some of the local modes are implemented in the physical model, it may be difficult to duplicate a particular pose. Additionally, because of physical limitations of components, non-linearly elastic materials, and the like, the physical construction may not faithfully reproduce the desired local modes. Accordingly, the actual range of poses of the physical model of the object may not look exactly like the animated character (the desired motion). It is therefore often expected that the user will have to adjust the weight factors, the physical construction of the physical model of the object, and the like. The process described above may then be repeated to see whether the adjustments have helped. Further details and other embodiments are provided below.
When the user is satisfied with the motions and physical poses of the physical model of the object (or of a modified physical model), in step 290, control data, for example the weight factors, the control signals sent to the control structures, and the like, may be stored in a memory. In step 295, the control data from the memory may then be used to drive the physical model of the object in its intended "routine" mode of operation, for example in an amusement ride or attraction, a show, a performance, or the like. In various embodiments, a conventional computer system, for example a computer system 100 remote from the physical model, may control the object. In other embodiments, a dedicated computer system, an embedded system, or the like may be used to store the control data, the control signals for the control structures, and the like, in order to control the physical model of the object.
Merely as an example, according to the principles described above, a toy may be specified and constructed to move similarly to motions specified by desired animation data. The desired drive data may be converted into control signals that are stored as computer-executable instructions in on-board memory, such as flash memory, ROM, or the like, and that drive a microprocessor embedded in the toy so as to control the movement of the toy.
Fig. 3 illustrates a block diagram of an additional process according to various embodiments of the present invention. More specifically, Fig. 3 provides a high-level illustration of embodiments for modifying the movements of the physical model of the object.
Initially, as in Fig. 2B, in step 250, based upon the construction specification, the physical model of the animated object (the animatronics unit) may be built. As discussed above, any conventional number of physical components may be used to manipulate (e.g., move, turn, bend, stretch) different portions of the physical model.
In various embodiments, appropriate drive signals may be determined in step 260 and applied to the physical model. The inventors have recognized that the process of matching the physical motion of the physical model to the desired motion is a spatio-temporal problem, and that it can be solved at least in part with the aid of a computer. Accordingly, in step 260, an "initial guess" at the appropriate drive signals is applied to the physical model.
In various embodiments, in step 300, the poses and motions of the physical model of the object are then recorded in a computer system and/or observed by the user. In particular, in step 300, the actual spatial/temporal positions (or other constraints) of the physical model of the object are determined. In embodiments of the present invention, many different ways of doing this are contemplated. Various embodiments include using servo motors positioned at specific joints of the physical model, or the like. In such embodiments, the angular positions of the servo motors with respect to animation time are observed to help calculate the positions of the entire physical model with respect to animation time. Other embodiments include using motion capture techniques, such as reflective dots positioned on the physical model, or the like. In such embodiments, after being calibrated in space, the positions of the reflective dots captured by video cameras or the like can be determined, and the positions of the physical model with respect to time can then be determined. In still other embodiments, the physical motion of the object may be captured by using a number of physical contact sensors placed at key positions in space. These physical sensors then sense whether or when the physical model moves to the key positions in space. In other embodiments, laser range-finding devices or other 3D scanning techniques may be used to help compute the physical positions of the physical model with respect to animation time, the path of the physical model, and the like.
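For instance, where servo motors at the joints are used, the recorded joint angles can be converted into spatial positions with ordinary forward kinematics. The two-link planar arm below is a deliberately simplified, hypothetical illustration of that conversion, not a description of any particular animatronics unit.

```python
import numpy as np

def arm_positions(theta1, theta2, l1=0.3, l2=0.25):
    """Forward kinematics of a planar two-link arm.

    theta1, theta2: arrays (T,) of recorded servo angles (radians) over
                    animation time for the shoulder and elbow joints.
    l1, l2:         link lengths in metres (illustrative values).
    Returns the (T, 2) x/y positions of the end of the arm per frame.
    """
    x = l1 * np.cos(theta1) + l2 * np.cos(theta1 + theta2)
    y = l1 * np.sin(theta1) + l2 * np.sin(theta1 + theta2)
    return np.stack([x, y], axis=-1)

# Example: servo angle logs sampled at 24 frames per second.
# positions = arm_positions(np.radians(shoulder_log), np.radians(elbow_log))
```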
Next, in various embodiments, in step 310, based upon the animation data, the computer system may predict or determine the desired motions in space and time of the various portions of the animated object. In some embodiments, the desired poses of the animatronics unit may simply be the poses and motions of the animated object driven with the animation data, for example from the source animated feature.
In embodiments of the present invention, in step 330, it is determined whether the actual spatial/temporal positions of the physical object are within an acceptable deviation from the desired spatial/temporal positions for the time period. In some embodiments, the deviation depends upon what is important for the animation of the physical model. For example, what may be important is that the physical model performs a particular action at a specific moment in time, rather than how the physical model performs the action (e.g., the movement); or what may be important is that the physical model follows a particular motion path fairly strictly, rather than the speed at which the physical model performs the motion; or any other combination. Accordingly, the amount and type of acceptable deviation are highly dependent upon the application of the physical model of the object. As specific examples, the comparison data may indicate that the physical model of the object does not jump high enough, clasps hands too early or too late, does not catch a dropped object, or exhibits other motion-related deviations.
In various embodiments, in step 330, the deviation may be an absolute deviation, an accumulated deviation, or any other type of deviation. For example, if the physical model deviates from the predicted spatial/temporal position in space by more than 10% at any time, the deviation may be unacceptable. As another example, if the spatial/temporal positions of the physical model do not deviate in space by more than 5% for 75% of the animation time, the deviation may be acceptable, even if the peak spatial/temporal deviation exceeds 10%. In other examples, if the physical model does not reach a certain position at a specific time, the physical model exceeds the acceptance threshold. In other embodiments, other conditions may be used to set the threshold conditions, depending upon the specific application. In addition, the thresholds may be set automatically or manually.
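The acceptance test of step 330 could be sketched as below, assuming the recorded and predicted spatial/temporal positions are available as arrays. The two criteria (a peak-deviation cap and a within-tolerance time fraction) mirror the examples in the text; the thresholds and names themselves are illustrative assumptions.

```python
import numpy as np

def deviation_acceptable(actual, predicted, peak_tol=0.10, frame_tol=0.05,
                         min_fraction=0.75):
    """Compare recorded positions of the physical model against predictions.

    actual, predicted: arrays (T, n_points, 3) of spatial/temporal positions.
    Returns True if the motion is within an acceptable deviation.
    """
    # Per-frame deviation relative to the scale of the predicted motion.
    err = np.linalg.norm(actual - predicted, axis=-1)        # (T, n_points)
    scale = np.linalg.norm(predicted, axis=-1).max() + 1e-9
    rel = err.max(axis=1) / scale                            # worst point per frame

    peak_ok = rel.max() <= peak_tol                          # absolute criterion
    fraction_ok = (rel <= frame_tol).mean() >= min_fraction  # accumulated criterion
    return peak_ok or fraction_ok
```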
In embodiments of the present invention, in step 340, if the deviation is unacceptable, the deviation (e.g., an error signal) may be used to automatically or manually revise (e.g., via the weight factors) the drive data for the physical model of the animatronics unit. In some examples, if the physical model of the object "hits," or reaches, all of the desired spatial positions, but slightly too slowly, the drive data, for example a motor current, may be increased; the drive data may be applied earlier in time; or the like. In other examples, if the physical model does not hit the desired spatial positions, the drive data, for example the motor current, may also be increased; pauses may be introduced into the drive data so that the physical model can settle at specific positions; or the like. In other embodiments, many other ways of revising the drive data are contemplated.
In some embodiments of the present invention, the construction of the physical model may be revised, automatically or manually, so that the physical model can reach the desired spatial/temporal positions. For example, the physical model may be rebuilt using lighter-weight components to reduce the inertia of the components, using drive motors with higher operating frequencies, and the like, so that the physical model moves more quickly; the physical model may be rebuilt using stronger or additional components, so that the physical model can handle greater stresses, handle greater loads, or move more quickly; the components of the physical model may be repositioned, so that the physical model moves more quickly by exploiting greater leverage; and the like. The rebuilding may depend upon the real-world experience of the builders of the physical model, and the like.
In other embodiments of the present invention, if the arm of the animatronic character "bounces" after reaching the desired position at the desired time, the speed of the moving portion of the arm may be increased during most of the movement but decreased as it approaches the desired position (e.g., changing from a linear drive signal to a non-linear drive signal); lighter-weight components may be used in the arm (to reduce the inertia of the arm); or the like. In other embodiments, a combination of changes to the drive signals and to the physical construction may be proposed in the design.
In still other embodiments of the present invention, the user may revise the animation data used to determine the drive signals in step 260. For example, the user may determine that, using the techniques described above, the physical model cannot exactly meet the desired spatial/temporal positions. In such cases, the user may specify a new set of animation data to be used to determine the drive signals in step 260. For example, if the physical model cannot move along the desired path so as to reach a certain position at a specific point in time, the user may specify a simplified motion path via the animation data so that the physical model can reach that position at the desired time.
In various embodiments, steps 250 to 340 may be repeated until the physical model of the object reaches the desired spatial/temporal positions to the user's satisfaction. In various embodiments, the process may be automated and/or include manual input so as to reduce the spatio-temporal error of the physical model of the object (driven using the animation data of step 200).
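Viewed as a whole, the repetition of steps 250 to 340 is a correction loop: drive the unit, measure, compare, and nudge the drive data by the error signal. The sketch below assumes a simple proportional update of the weight factors and hypothetical predict/measure helpers operating in the same per-control space; it is not the patent's prescribed algorithm.

```python
import numpy as np

def tune_drive_data(weights, predict, measure, max_iter=20, step=0.5,
                    accept=lambda err: np.abs(err).max() < 0.05):
    """Iteratively revise weight factors until the unit tracks the animation.

    weights: initial (T, K) weight factors / drive data ("initial guess").
    predict: callable returning the desired per-control trajectories (T, K)
             derived from the animation data.
    measure: callable driving the unit (or its software simulation) with the
             given weights and returning the achieved trajectories (T, K).
    """
    desired = predict()
    for _ in range(max_iter):
        observed = measure(weights)
        err = desired - observed          # spatio-temporal error signal
        if accept(err):
            break
        # Proportional correction of the drive data (could also be manual).
        weights = weights + step * err
    return weights
```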
Fig. 6 illustrates a block diagram of a high-level illustration of additional embodiments for modifying the movements of the physical model of the object.
In various embodiments, a software simulation of the physical model of the object may be determined. In some embodiments, the software simulation may be performed in conjunction with a computer-aided design (CAD) simulation system, or the like. In other embodiments, the physical model of the object may already have been built, and the software model constructed in this step may be based upon the real-world performance of that physical model. For example, the software simulation of the object may be based upon measured performance of the physical object executing specific motions.
In various embodiments, the software simulation may give the user an idea of how the poses of the physical model of the object may be set and how the physical model may move, without a physical model (animatronics unit) having to be readily available. Such embodiments may be useful if the animation team is located remotely from the animatronics unit.
Similar to step 260 above, in various embodiments, in step 510, appropriate drive signals may be determined and applied to the software simulation. As above, the inventors recognize that the spatio-temporal behavior of the software simulation should be studied in order to improve the behavior of the animatronics unit.
In various embodiments, similar to step 300 above, in step 520, the poses and motions of the software simulation of the animatronics unit are recorded and/or observed by the user. Next, in various embodiments, in step 530, based upon the animation data, the computer system may predict or determine the desired motions in space and time of the various portions of the animated object. In some embodiments, the desired poses of the animatronics unit may simply be the poses and motions of the animated object driven with the animation data, for example from the source animated feature.
In embodiments of the present invention, in step 540, it is determined whether the actual spatial/temporal positions of the software model of the physical object are within an acceptable deviation from the desired spatial/temporal positions for the time period. As discussed above, in some embodiments, the deviation may depend upon what is important for the animation of the physical model. For example, what may be important is that the physical model performs a particular action at a specific moment, rather than how the physical model performs the action (e.g., the movement); or what may be important is that the physical model follows a particular motion path fairly strictly, rather than the speed at which the physical model performs the motion; or any other combination. Accordingly, the amount and type of acceptable deviation are highly dependent upon the application of the physical model of the object. As specific examples, the comparison data may indicate that the physical model of the object does not jump high enough, clasps hands too early or too late, does not catch a dropped object, or exhibits other motion-related deviations.
In various embodiments, similar to step 330 above, the deviation may be an absolute deviation, an accumulated deviation, or any other type of deviation. For example, if the physical model deviates from the predicted spatial/temporal positions in space by an average of more than 20% at any time, the deviation may be unacceptable. In other examples, if the physical model does not reach a certain position at a specific time, the physical model exceeds the acceptance threshold.
In various embodiments, in step 560, if the deviation is unacceptable, the deviation (e.g., an error signal) may be used to automatically or manually revise (e.g., via the weight factors) the drive data for the animatronics unit. In some examples, if the software simulation of the object "hits," or reaches, all of the desired spatial positions, but too early, a hydraulic pump may be turned on at a later time; the oil pressure of the hydraulic system may be reduced; or the like. In other embodiments, if the software simulation of the physical model does not "hit," or reach, the desired spatial positions, the air pressure of a pneumatic system may be increased; the hydraulic system may be activated earlier in time; or the like. In other embodiments, many other ways of revising the drive data are contemplated.
In various embodiments, steps 500 to 560 may be repeated until the software simulation of the animatronics unit reaches the desired spatial/temporal positions to the user's satisfaction. In various embodiments, the process may be automated and/or include manual input so as to reduce the spatio-temporal error of the physical model of the object (driven using the animation data of step 200).
In some embodiments, when the software simulation satisfies the spatio-temporal constraints desired by the user, the process may return to Fig. 2B or Fig. 3, as illustrated. In such cases, if the animatronics unit has not yet been built, the physical model may then be constructed and adjusted in step 250 according to the corresponding process. In other cases, if the animatronics unit has already been built, step 250 may include making changes to the animatronics unit as determined in step 560. In still other cases, step 250 may already have been performed and the animatronics unit may already have been built.
Figs. 4A-4B illustrate examples according to various embodiments of the present invention. Fig. 4A illustrates an example of a global principal mode 370 determined for a sample data set associated with an object surface. In this example, the + and - portions represent changes in relative depth. As can be seen, the + and - portions in global principal mode 370 extend over more than a small region.
In contrast, Fig. 4B illustrates an example of a local mode 380 determined from the global principal modes after a factor rotation. In this example, the + and - portions also represent changes in relative depth. As can be seen, the + and - portions in the local mode affect smaller regions. In accordance with the embodiments described above, the local modes correspond to relatively separated regions on the surface of the object, and can accordingly be associated with corresponding physical control structures.
Figs. 5A-5C illustrate examples of physical control structures according to various embodiments of the present invention. Fig. 5A illustrates an example of a motor 400 used to obtain a localized bulge 410, for example the cheek of a person smiling. Fig. 5B illustrates an example of a motor 420, connected to a wire 430, used to obtain a localized depression 440, for example the lips of a person frowning. Fig. 5C illustrates an example of a pressurized system 450 (e.g., hydraulic, pneumatic), connected to a piston 460, used to obtain a stretching effect 470 of an arm.
In light of the above disclosure, one of ordinary skill in the art will understand that many other types of mode analysis may be performed to determine the global and localized movement modes. Further, in light of the above disclosure, one of ordinary skill in the art will understand that many other ways of implementing the physical control structures may be used and combined to give "life" to physical representations of animated characters.
Further embodiments can be envisioned by one of ordinary skill in the art after reading this disclosure. In other embodiments, combinations or sub-combinations of the above-disclosed embodiments can be advantageously made. The block diagrams of the architecture and the flow charts are grouped for ease of understanding. It should be understood, however, that combinations of blocks, additions of new blocks, re-arrangements of blocks, and the like are contemplated in alternative embodiments of the present invention.
The specification and drawings are, accordingly, to be regarded in an illustrative rather than a restrictive sense. It will, however, be evident that various modifications and changes may be made thereto without departing from the broader spirit and scope of the invention as set forth in the claims.

Claims (24)

1. A method for determining a behavior of an animatronics unit, comprising:
receiving animation data, the animation data comprising artistically determined motions of at least a portion of an animated character;
determining, in response to the animation data, a plurality of control signals to be applied to the portion of the animatronics unit;
estimating a behavior of the at least a portion of the animatronics unit in response to the plurality of control signals by driving a software simulation of the at least a portion of the animatronics unit with the plurality of control signals; and
outputting to a user a representation of the behavior of the at least a portion of the animatronics unit.
2. The method of claim 1,
wherein the animation data describes a specified pose or a specified motion of the at least a portion of the animatronics unit;
wherein the behavior comprises a predicted pose or a predicted motion of the at least a portion of the animatronics unit; and
wherein the method further comprises: revising the software simulation of the at least a portion of the animatronics unit or the animation data in response to the specified pose or the specified motion and the predicted pose or the predicted motion.
3. The method of any of claims 1-2,
wherein the animation data describes the specified motion of the at least a portion of the animatronics unit; and
wherein determining the plurality of control signals comprises:
determining, in response to the specified motion, an enhanced spatio-temporal specified motion of the at least a portion of the animatronics unit; and
determining, in response to the enhanced spatio-temporal specified motion, the plurality of control signals to be applied to the at least a portion of the animatronics unit.
4. A computer system for determining a behavior of an animatronics unit, comprising:
a memory configured to store animation data, the animation data comprising artistically determined motions of at least a portion of an animated character; and
a processor coupled to the memory, wherein the processor is configured to determine, in response to the animation data, a plurality of control signals to be applied to the portion of the animatronics unit; wherein the processor is configured to estimate a behavior of the at least a portion of the animatronics unit in response to the plurality of control signals by driving a software simulation of the at least a portion of the animatronics unit with the plurality of control signals; and wherein the processor is configured to output to a user a representation of the behavior of the at least a portion of the animatronics unit.
5. The computer system of claim 4,
wherein the animation data describes a specified pose or a specified motion of the at least a portion of the animatronics unit;
wherein the behavior comprises a predicted pose or a predicted motion of the at least a portion of the animatronics unit; and
wherein the processor is configured to revise the software simulation of the at least a portion of the animatronics unit or the animation data in response to the specified pose or the specified motion and the predicted pose or the predicted motion.
6. The computer system of any of claims 4-5,
wherein the animation data describes the specified motion of the at least a portion of the animatronics unit;
wherein the processor is configured to determine, in response to the specified motion, an enhanced spatio-temporal specified motion of the at least a portion of the animatronics unit; and
wherein the processor is configured to determine, in response to the enhanced spatio-temporal specified motion, the plurality of control signals to be applied to the at least a portion of the animatronics unit.
7. A computer program product residing on a tangible medium and comprising executable code executable on a computer system, wherein the computer system includes a processor and a memory, the computer program product comprising:
code configured to direct the processor to receive animation data, the animation data comprising artistically determined motions of at least a portion of an animated character;
code configured to direct the processor to determine, in response to the animation data, a plurality of control signals to be applied to the portion of an animatronics unit;
code configured to direct the processor to estimate a behavior of the at least a portion of the animatronics unit in response to the plurality of control signals by driving a software simulation of the at least a portion of the animatronics unit with the plurality of control signals; and
code configured to direct the processor to output to a user a representation of the behavior of the at least a portion of the animatronics unit.
8. The computer program product of claim 7,
wherein the animation data describes a specified pose or a specified motion of the at least a portion of the animatronics unit;
wherein the behavior comprises a predicted pose or a predicted motion of the at least a portion of the animatronics unit; and
wherein the computer program product further comprises: code configured to direct the processor to revise the software simulation of the at least a portion of the animatronics unit or the animation data in response to the specified pose or the specified motion and the predicted pose or the predicted motion.
9. The computer program product of any of claims 7-8,
wherein the animation data describes the specified motion of the at least a portion of the animatronics unit; and
wherein the computer program product further comprises:
code configured to direct the processor to determine, in response to the specified motion, an enhanced spatio-temporal specified motion of the at least a portion of the animatronics unit; and
code configured to direct the processor to determine, in response to the enhanced spatio-temporal specified motion, the plurality of control signals to be applied to the at least a portion of the animatronics unit.
10. A method for specifying a design for an animatronics unit, comprising:
receiving motion data, the motion data comprising artistically determined motions;
determining, in response to the motion data, a design for construction of at least a portion of the animatronics unit; and
outputting the design for construction of the animatronics unit.
11. The method of claim 10, wherein determining the design comprises:
determining a plurality of global characteristic poses in response to the motion data;
determining a plurality of local characteristic poses in response to the plurality of global characteristic poses; and
determining the design for construction of the animatronics unit in response to the plurality of local characteristic poses.
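As an illustration of claim 11 only, the sketch below extracts a few global characteristic poses from motion data with a small k-means pass and then slices them into per-region local characteristic poses. The clustering, the region layout, and all names are assumptions; the claim does not prescribe a particular algorithm.

```python
import numpy as np

def global_characteristic_poses(frames, k=4, iters=25, seed=0):
    # Tiny k-means over whole-body pose vectors: one plausible way to pick
    # "global characteristic poses" that summarize the motion data.
    rng = np.random.default_rng(seed)
    frames = np.asarray(frames, dtype=float)
    centers = frames[rng.choice(len(frames), size=k, replace=False)].copy()
    for _ in range(iters):
        dists = ((frames[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
        labels = dists.argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = frames[labels == j].mean(axis=0)
    return centers

def local_characteristic_poses(global_poses, regions):
    # Slice each global pose into per-region poses (e.g. jaw, brow, neck);
    # the range of travel per region would then inform the mechanical design.
    return {name: global_poses[:, idx] for name, idx in regions.items()}

# Hypothetical motion data: 200 frames x 6 degrees of freedom.
motion = np.random.default_rng(1).normal(size=(200, 6))
regions = {"jaw": [0, 1], "brow": [2, 3], "neck": [4, 5]}
global_poses = global_characteristic_poses(motion)
local_poses = local_characteristic_poses(global_poses, regions)
for name, poses in local_poses.items():
    print(name, "required travel:", poses.max(axis=0) - poses.min(axis=0))
```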
12. The method of any of claims 10-11, further comprising:
predicting a motion of the portion of the animatronics unit in response to the motion data;
driving the portion of the animatronics unit with drive signals determined in response to the motion data, to determine a motion of the portion of the animatronics unit;
determining a difference between the motion of the portion of the animatronics unit in response to the drive signals and the predicted motion of the portion of the animatronics unit in response to the motion data; and
determining, in response to the difference, a control mapping for the portion of the animatronics unit.
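The difference-and-mapping step of claim 12 can be pictured with the hedged sketch below: a stand-in unit produces a motion that deviates from the predicted motion, and a least-squares linear fit of that response is inverted to obtain a control mapping. The linear gain-and-offset model of the unit is an assumption made purely for illustration.

```python
import numpy as np

def drive_unit(drive_signals, gain=0.8, offset=0.05):
    # Stand-in for the physical unit: the motion it actually produces differs
    # from the drive signal by an unknown gain and offset.
    return gain * drive_signals + offset

# Drive signals derived from the motion data, and the motion predicted from it.
drive = np.linspace(-1.0, 1.0, 50)
predicted = drive.copy()                 # predicted motion of the part
achieved = drive_unit(drive)             # motion measured while driving the unit

# Difference between the achieved motion and the predicted motion.
difference = achieved - predicted
print("max deviation before mapping:", float(np.abs(difference).max()))

# Control mapping: fit achieved ~ g*drive + c by least squares, then invert it
# so that mapped drive signals reproduce the predicted (desired) motion.
g, c = np.polyfit(drive, achieved, deg=1)
mapped_drive = (predicted - c) / g
corrected = drive_unit(mapped_drive)
print("max deviation after mapping: ", float(np.abs(corrected - predicted).max()))
```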
13. The method of any of claims 10-11, wherein the design for construction of the portion of the animatronics unit is determined further in response to mechanical constraints.
14. A computer system comprising:
a memory configured to store motion data, the motion data comprising artistically determined motions; and
a processor coupled to the memory, wherein the processor is configured to determine, in response to the motion data, a design for construction of at least a portion of an animatronics unit; and wherein the processor is configured to output the design for construction of the animatronics unit.
15. The computer system of claim 14,
wherein the processor is configured to predict a position of the portion of the animatronics unit in response to the motion data;
wherein the processor is configured to drive the portion of the animatronics unit with drive signals determined in response to the motion data, to determine a position of the portion of the animatronics unit;
wherein the processor is configured to determine a difference between the position of the portion of the animatronics unit in response to the drive signals and the predicted position of the portion of the animatronics unit in response to the motion data; and
wherein the processor is configured to determine, in response to the difference, a control mapping for the animatronics unit.
16. The computer system of any of claims 14-15, wherein the motion data is selected from the group consisting of: animation data associated with an animated character; physical performance motion capture data; animation data associated with an animated face; and facial performance motion capture data.
17. The computer system of any of claims 14-16,
wherein the processor is configured to determine, in response to the control mapping, a software model for the animatronics unit.
18. A computer program product residing on a tangible medium and comprising executable code that can be executed on a computer system, wherein the computer system comprises a processor and a memory, the computer program product comprising:
code configured to direct the processor to retrieve motion data from the memory, the motion data comprising artistically determined motions;
code configured to direct the processor to determine, in response to the motion data, a design for construction of at least a portion of an animatronics unit; and
code configured to direct the processor to output the design for construction of the animatronics unit.
19. The computer program product of claim 18, further comprising:
code configured to direct the processor to determine a plurality of global characteristic poses in response to the motion data;
code configured to direct the processor to determine a plurality of local characteristic poses in response to the plurality of global characteristic poses; and
code configured to direct the processor to determine, in response to the plurality of local characteristic poses, the design for construction of the animatronics unit.
20. The computer program product of any of claims 18-19, wherein the artistically determined motions comprise a plurality of poses with respect to time.
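To make the phrase "a plurality of poses with respect to time" in claim 20 concrete, a tiny sketch of motion data as time-stamped poses follows; the joint names and sample times are hypothetical and chosen only for illustration.

```python
from dataclasses import dataclass
from typing import Dict, List

@dataclass
class TimedPose:
    # One artistically determined pose: joint values sampled at a point in time.
    time: float                  # seconds into the performance
    joints: Dict[str, float]     # joint name -> angle or displacement

# Motion data as a plurality of poses with respect to time (hypothetical rig).
motion_data: List[TimedPose] = [
    TimedPose(0.00, {"jaw": 0.00, "brow_left": 0.10}),
    TimedPose(0.04, {"jaw": 0.20, "brow_left": 0.10}),
    TimedPose(0.08, {"jaw": 0.50, "brow_left": 0.00}),
]
span = motion_data[-1].time - motion_data[0].time
print(f"{len(motion_data)} poses spanning {span:.2f} s")
```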
21. A method for constructing an animatronics unit, comprising:
receiving a design for construction of a portion of the animatronics unit; and
constructing the portion of the animatronics unit in response to the design for construction of the portion of the animatronics unit;
wherein the design for construction of the portion of the animatronics unit is determined in response to motion data, the motion data comprising artistically determined motions.
22. The method of claim 21, wherein the design for construction of the portion of the animatronics unit comprises a specification of a region, associated with the portion of the animatronics unit, having a local characteristic pose.
23. The method of any of claims 21-22, wherein constructing the portion of the animatronics unit comprises coupling a control structure to a region of the animatronics unit associated with the portion;
wherein the control structure is selected from the group consisting of: a hydraulic drive, a pneumatic drive, and a motor.
24. An animatronics unit constructed according to the method of any of claims 20-23.
CN2008801267633A 2007-12-17 2008-12-16 Methods and apparatus for designing animatronics units from articulated computer generated characters Pending CN101952879A (en)

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
US11/958,239 US8232998B2 (en) 2007-12-17 2007-12-17 Methods and apparatus for estimating and controlling behavior of animatronics units
US11/958,233 US8390629B2 (en) 2007-12-17 2007-12-17 Methods and apparatus for designing animatronics units from articulated computer generated characters
US11/958,233 2007-12-17
US11/958,239 2007-12-17
PCT/US2008/087025 WO2009079514A1 (en) 2007-12-17 2008-12-16 Methods and apparatus for designing animatronics units from articulated computer generated characters

Publications (1)

Publication Number Publication Date
CN101952879A true CN101952879A (en) 2011-01-19

Family

ID=40795901

Family Applications (1)

Application Number Title Priority Date Filing Date
CN2008801267633A Pending CN101952879A (en) 2007-12-17 2008-12-16 Methods and apparatus for designing animatronics units from articulated computer generated characters

Country Status (4)

Country Link
EP (1) EP2227802A4 (en)
JP (1) JP2011511713A (en)
CN (1) CN101952879A (en)
WO (1) WO2009079514A1 (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9427868B1 (en) * 2015-02-24 2016-08-30 Disney Enterprises, Inc. Method for developing and controlling a robot to have movements matching an animation character
US11631295B2 (en) 2020-08-11 2023-04-18 ScooterBug, Inc. Wireless network, mobile systems and methods for controlling access to lockers, strollers, wheel chairs and electronic convenience vehicles provided with machine-readable codes scanned by mobile phones and computing devices
US11995943B2 (en) 2020-08-11 2024-05-28 ScooterBug, Inc. Methods of and systems for controlling access to networked devices provided with machine-readable codes scanned by mobile phones and computing devices
US11790722B2 (en) 2020-08-11 2023-10-17 Best Lockers, Llc Single-sided storage locker systems accessed and controlled using machine-readable codes scanned by mobile phones and computing devices

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020149622A1 (en) * 2001-04-12 2002-10-17 Akira Uesaki Animation data generation apparatus, animation data generation method, animated video generation apparatus, and animated video generation method
US20040210426A1 (en) * 2003-04-16 2004-10-21 Wood Giles D. Simulation of constrained systems
US20050153624A1 (en) * 2004-01-14 2005-07-14 Wieland Alexis P. Computing environment that produces realistic motions for an animatronic figure
CN1764890A (en) * 2003-03-25 2006-04-26 英国电讯有限公司 Apparatus for generating behaviour in an object
US20070255454A1 (en) * 2006-04-27 2007-11-01 Honda Motor Co., Ltd. Control Of Robots From Human Motion Descriptors

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3614824B2 (en) * 2002-03-18 2005-01-26 ソニー株式会社 Motion editing apparatus and motion editing method for legged mobile robot
FR2839176A1 (en) * 2002-04-30 2003-10-31 Koninkl Philips Electronics Nv ROBOT ANIMATION SYSTEM COMPRISING A SET OF MOVING PARTS
US20070143679A1 (en) * 2002-09-19 2007-06-21 Ambient Devices, Inc. Virtual character with realtime content input
WO2004066124A2 (en) * 2003-01-14 2004-08-05 Disney Enterprises, Inc. Animatronic supported walking system
CN1929894A (en) * 2004-03-12 2007-03-14 皇家飞利浦电子股份有限公司 Electronic device and method of enabling to animate an object

Also Published As

Publication number Publication date
EP2227802A4 (en) 2012-04-25
EP2227802A1 (en) 2010-09-15
WO2009079514A1 (en) 2009-06-25
JP2011511713A (en) 2011-04-14

Similar Documents

Publication Publication Date Title
CN110930483B (en) Role control method, model training method and related device
Brubaker et al. Physics-based person tracking using the anthropomorphic walker
Xia et al. A survey on human performance capture and animation
Ye et al. Synthesis of detailed hand manipulations using contact sampling
Zhao et al. Robust realtime physics-based motion control for human grasping
CN111292401B (en) Animation processing method and device, computer storage medium and electronic equipment
Liu et al. Synthesis of complex dynamic character motion from simple animations
US6552729B1 (en) Automatic generation of animation of synthetic characters
US20110293144A1 (en) Method and System for Rendering an Entertainment Animation
Ishigaki et al. Performance-based control interface for character animation
US20170091976A1 (en) Coordinated gesture and locomotion for virtual pedestrians
US20130173242A1 (en) Methods and apparatus for estimating and controlling behavior of animatronics units
WO2011034963A2 (en) Combining multi-sensory inputs for digital animation
US11721056B2 (en) Motion model refinement based on contact analysis and optimization
Kenwright Watch your step: Real-time adaptive character stepping
CN101952879A (en) Methods and apparatus for designing animatronics units from articulated computer generated characters
US8390629B2 (en) Methods and apparatus for designing animatronics units from articulated computer generated characters
CN102426709B (en) Real-time motion synthesis method based on fast inverse kinematics
Oore et al. Local physical models for interactive character animation
Kim et al. Realtime performance animation using sparse 3D motion sensors
Lin et al. Temporal IK: Data-Driven Pose Estimation for Virtual Reality
Mezger et al. Trajectory synthesis by hierarchical spatio-temporal correspondence: comparison of different methods
Kim et al. Reconstructing whole-body motions with wrist trajectories
JP3973995B2 (en) Animation creation system
Multon et al. From motion capture to real-time character animation

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C02 Deemed withdrawal of patent application after publication (patent law 2001)
WD01 Invention patent application deemed withdrawn after publication
Application publication date: 20110119