WO2014152313A2 - Method and system for recording information about rendered assets

Method and system for recording information about rendered assets

Info

Publication number
WO2014152313A2
WO2014152313A2 (PCT/US2014/027198)
Authority
WO
WIPO (PCT)
Prior art keywords
file
assets
version
model file
composite product
Prior art date
Application number
PCT/US2014/027198
Other languages
English (en)
Other versions
WO2014152313A3 (fr)
Inventor
Alan L. Davidson
Steve Lavietes
Blair J. Zajac, Jr.
Robert B. Engle
Original Assignee
Sony Corporation
Sony Pictures Technologies Inc.
Priority date
Filing date
Publication date
Application filed by Sony Corporation and Sony Pictures Technologies Inc.
Priority to CN201911351410.XA (published as CN111125402B)
Priority to CN201480012544.8A (published as CN105027207B)
Publication of WO2014152313A2
Publication of WO2014152313A3

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 16/00: Information retrieval; Database structures therefor; File system structures therefor
    • G06F 16/20: Information retrieval of structured data, e.g. relational data
    • G06F 16/21: Design, administration or maintenance of databases
    • G06F 16/219: Managing data history or versioning
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 16/00: Information retrieval; Database structures therefor; File system structures therefor
    • G06F 16/50: Information retrieval of still image data
    • G06F 16/56: Information retrieval of still image data having vectorial format

Definitions

  • Fig. 1 illustrates a prior art versioning and publishing system 10.
  • the system 10 is of a type similar to that described in US Patent Number 6,947,958, entitled “SYSTEM AND METHOD FOR DOCUMENTING COMPOSITE DATA PRODUCTS", filed September 19, 2001, and issued September 20, 2005, owned by the assignee of the present application and incorporated by reference herein in its entirety.
  • Such systems and methods keep track of assets in a rendered scene, e.g., images, models, or the like. They document the contents of each of a number of composite media products in order to be able to determine the version of each media product or asset used to create the composite media product.
  • An API may be associated with the versioning and publishing system, allowing an artist to request a particular version or representation of an asset.
  • an artist may construct a SPREF which can refer to a particular version of an asset; the same is used in lieu of a path and filename.
  • Version indicators or tags may indicate a particular version number or may provide an indicator such as "current" or "approved".
  • FIG. 1 an exemplary SPREF is illustrated for “character 1", and various versions shown, e.g., version 3 and a version "approved by director".
  • the SPREF may then get resolved and a desired asset obtained. If the desired asset is updated to a new version, the SPREF may get linked to the new version.
  • the SPREF itself will stay the same, but will point to the updated file. Versioning and publishing systems such as these allow a convenient way for assets to be catalogued and passed back and forth between artists.
  • VP-VCR systems and methods allow query of a database to obtain data for backup and retrieval, e.g., information about shots made at prior points in time, where a "shot" refers to a scene or associated set of sequential frames (a single image may also be considered a shot in some instances), rendered or not depending on context.
  • information may be obtained about rendered scenes which allow knowledge and subsequent use of each asset employed in the scene, including its proper version and representation at the time of the rendering. Such may be employed not only to obtain prior versions of shots, but also to allow modification of assets in prior shots to obtain new effects.
  • the systems and methods according to present principles may also be employed to, e.g., following approval of a 2-D image, create a complementary stereo image for 3-D viewing.
  • a difference mode may be employed to conveniently visualize differences between versions of shots.
  • Systems and methods according to present principles collect input data, dependencies, and process attributes from an executing process without the need for special instrumentation.
  • the data collected may be employed to backup and restore data as well as to lock data employed to faithfully render elements for feature animation and visual effects productions.
  • the same may also be employed to faithfully render a missing stereo element for the same purposes.
  • Systems and methods according to present principles may provide processes that run at the system level and may collect all opened files and versioning and publishing lookups, as well as any attributes that call an API associated with the VP-VCR system during processing.
  • Custom processes may be run after the VP-VCR method is executed to finalize and inject data that is not directly controlled by the versioning and publishing system during the executed VP-VCR process.
  • the invention is directed towards a method of creating a composite product, the composite product including a plurality of assets, at least one of the plurality having associated version information therewith, including: receiving an indication of a model file indicating a desired composite product, the model file indicating one or more assets and respective version indicators constituting the composite product; and creating the composite product using the one or more assets, the one or more assets chosen from among a plurality at least according to the respective version indicators.
  • Implementations of the invention may include one or more of the following.
  • the indication may be received from a GUI including a viewer, the indication associated with a rendered file played back on the GUI.
  • the indication may be associated with a rendering session for the rendered file, where the session may be associated with a path for the rendered file.
  • the method may further include restoring the model file from a queue archive.
  • the composite product or asset may be indicated by a SPREF.
  • the version indicator may be a number or may be associated with a "published", “latest”, “current”, or “approved” reference indicator.
  • the method may further include creating the composite product using a free entry file, where the free entry file is not associated with a version indicator.
  • the method may further include using the model file to populate a product lock table.
  • the method may further include, after the receiving and before the creating, a step of modifying the product lock table.
  • the modifying may include creating a view for a stereo image.
  • the invention is directed towards a non-transitory computer-readable medium, including instructions for causing a computing environment to perform the above method.
  • the invention is directed towards a method of storing data about a composite product, the composite product including a plurality of assets, at least one of the plurality having associated version information therewith, including, upon receipt of a command to render CG assets constructed in a model file within an application, storing in a database a record of each asset called by the model file, including a version indicator of at least one asset called by the model file.
  • Implementations of the invention may include one or more of the following.
  • the method may further include storing a path of a render file in a database.
  • the method may further include saving the model file associated with the CG assets constructed within the application in a queue archive database.
  • the method may further include storing a path of each asset called by the model file.
  • the method may further include saving a record of a version of the application.
  • the version indicator may be a number or may be associated with a "published", “latest”, “current”, or “approved” indicator.
  • the method may further include storing data in the database about free entry files referenced by the file.
  • the method may further include locking each asset referenced by the model file against modification.
  • the method may further include populating a product lock table with data about each asset called by the file, including a version indicator of each product if available.
  • the invention is directed towards a non-transitory computer-readable medium, including instructions for causing a computing environment to perform the above method.
  • the invention is directed towards a module implemented on a non-transitory computer-readable medium for storing data about a composite product, the composite product including a plurality of assets, at least one of the plurality having associated version information therewith, including: a pre-processing module, the preprocessing module for, for CG assets constructed in an application, upon receipt of a render command, saving a model file associated with the CG assets constructed within the application; a locking module to lock each asset called by the file against modification; and a recording module, the recording module storing in a database a record of each asset called by the file, including a version number or a file path of each asset called by the model file.
  • Advantages of the invention may include one or more of the following.
  • Certain implementations of the systems and methods may provide convenient ways to re-create past or prior shots, as well as to modify the same to create new images.
  • the disclosed systems and methods may also provide a way to catalog assets used in a render, including their version information, and thus what assets are required to be brought online to re-perform a render.
  • Systems and methods may also provide ways to know, for a given asset, which renders used the same.
  • Systems and methods may allow convenient creation of a complementary stereo image for 3-D viewing, as well as convenient visualization of differences between versions of shots.
  • FIG. 1 illustrates a prior art system employed for versioning and publishing, e.g., for keeping track of versions of assets.
  • Fig. 2 illustrates a schematic layout of a number of graphics workstations interacting through an API with a versioning and publishing system and its associated databases.
  • the databases may be distributed as shown or may form different portions of a common database.
  • FIG. 3 is a flowchart of a VP-VCR method according to principles disclosed here, for recording assets and their associated versions, as well as free entries, during a rendering process.
  • FIG. 4 is a flowchart of a VP-VCR method according to principles disclosed here, for restoring a prior rendering session, including restoring assets employed according to their respective versions at the time of a prior render.
  • Fig. 5 is a flowchart of a VP-VCR method according to principles disclosed here, for restoring a prior rendering session, including modifying assets employed in a rendering process.
  • Fig. 6 is a flowchart of a VP-VCR method according to principles disclosed here, for employing a difference mode to compare two shots rendered using different assets.
  • FIG. 7 is a flowchart of a VP-VCR method according to principles disclosed here, for restoring a prior rendering session, including creation of a complementary stereo image to obtain a 3-D view.
  • FIG. 8 illustrates an exemplary computing environment in which systems according to principles disclosed here may be embodied and using which methods according to principles disclosed here may be carried out.
  • FIG. 9 illustrates another exemplary computing environment in which systems according to principles disclosed here may be embodied and using which methods according to principles disclosed here may be carried out.
  • a number of graphics workstations 25 are illustrated accessing a VP-VCR system 21 through a versioning and publishing API 24.
  • the versioning and publishing API 24 resolves SPREFs delivered to it from the graphics workstations 25 and evaluates the same to return a path of a product or asset to the graphics workstation 25.
  • a SPREF 34, in the form of a string that references a product or asset, is presented by the graphics workstation 25 to the versioning and publishing API 24, which resolves the SPREF, determines in particular what asset the SPREF is pointing to, and retrieves the same from a database 26.
  • the SPREF may have one or more tags that indicate a number of different versions or representations, including by version number, "highest”, “current”, “approved”, “published”, and many more. If one indicator does not exist, e.g. "current”, then the versioning and publishing API 24 will check the successive indicator, stopping when the first existing indicator is found. In any case, following resolution, a path 36 to a referenced product or asset is provided back to the graphics workstation 25.
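The tag-fallback resolution just described can be sketched as follows. This is an illustrative model only: the chain order, the in-memory lookup table, and the name `resolve_spref` are assumptions for exposition, not the actual versioning and publishing API.

```python
# Illustrative sketch of SPREF resolution with indicator fallback.
FALLBACK_CHAIN = ["current", "approved", "published", "highest"]

# Hypothetical asset table mapping (asset, indicator) -> file path.
ASSET_DB = {
    ("character1", "3"): "/prod/assets/character1/v003/character1.model",
    ("character1", "approved"): "/prod/assets/character1/v002/character1.model",
    ("character1", "highest"): "/prod/assets/character1/v004/character1.model",
}

def resolve_spref(name: str, tag: str) -> str:
    """Resolve an asset reference to a file path, checking successive
    indicators when the requested one does not exist and stopping at the
    first indicator that does."""
    if (name, tag) in ASSET_DB:
        return ASSET_DB[(name, tag)]
    start = FALLBACK_CHAIN.index(tag) if tag in FALLBACK_CHAIN else 0
    for candidate in FALLBACK_CHAIN[start:]:
        if (name, candidate) in ASSET_DB:
            return ASSET_DB[(name, candidate)]
    raise KeyError(f"no resolvable version for {name!r}")
```

Here a request for "current" falls through to "approved" because no "current" entry exists, mirroring the successive-indicator behavior described above.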
  • the graphics workstation 25 runs a modeling/lighting application 16 and at a given point in time may be employed by a user to operate on a model file 17 in which various SPREFs 18 and 22 are referenced.
  • a typical modeling/lighting application 16 may include Katana®, a node-based lighting system, available from The Foundry in London, UK, and Los Angeles, California.
  • Other types of applications may also benefit from the systems and methods according to present principles, including, e.g., Nuke® compositing products, also available from The Foundry.
  • any application which uses composite products may benefit from the systems and methods disclosed here.
  • a command to render the model file 17 performs a rendering, using the models and objects in the file, as well as the selected lighting and cameras.
  • a command to render the model file 17 sends the model file 17 with the referenced SPREFs to a rendering queue 32, and the rendering may subsequently be performed in a rendering farm, with, e.g., each processor rendering a different frame of the desired scene.
  • the command to render may also lock down product or asset lookups as of that time, e.g., at a VNP APPLICATION TIME, so that assets which may be modified during the rendering are only included as they were at the time of the rendering command, and not as later modified.
  • a unique session ID 35 is created and assigned to the render, and a reference of the session ID and the source file is saved in a queue archive 29.
  • the session ID 35 is a token that is employed at least in part to recombine the rendered frames post-rendering.
  • the session ID 35 may include information about the time the rendering was begun, and users may add arbitrary metadata to the session ID. To avoid situations where users write over model files, a copy of the model file 17 itself may also be saved in the queue archive 29.
  • the VCR database 28, which may be, e.g., an Oracle® or an Apache Cassandra® database, records these API calls, including information about what product or asset was evaluated, what version it resolved to, and what file it pointed to. Lookups using the versioning and publishing system are registered directly with the VP-VCR, and the same may be SPREFs or other representations.
  • the database 26 of assets has been shown as a separate database from the queue archive 29 and the VCR database 28. Generally, however, the three databases may be merged or may be spread over a number of different databases, according to the requirements of the system.
  • the system and method may also record any files opened by the application or applications used by the render process during the render.
  • a shim library layer (or equivalent) may be employed to intercept calls to the operating system to open files, or alternatively a patch may be employed at the operating system level to perform this function. Many of these recorded files may be duplicative of those referenced by calls to the API 24, but may nevertheless provide useful information regarding which files were employed in the render.
  • Free entries are generally file paths that were referred to during the render, as opposed to versioning and publishing entries which are product or asset lookups.
  • the result of this monitoring of file openings at the operating system level is a list of file paths, referencing files accessed through the versioning and publishing API 24 as well as free entries. These processes generally collect everything that was used during the render. It is noted that the versioning and publishing API 24 may include information that a directory was accessed, but without details as to the files within the directory. The file path information from the shim library layer intercept system or the like allows the retrieval of this more specific information. The information collected may be added to the information gained from the versioning and publishing lookup and the combination may be stored together.
  • the information gained may itself provide useful information for debugging. For example, if the versioning and publishing API 24 makes a lookup but the file was never actually opened, this may indicate an error. In the same way, a user familiar with the latest version of a product or asset may quickly review the contents of the stored file in the VCR database 28 to determine if the latest version was in fact used; if not, an error may be inferred.
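The debugging cross-check described above amounts to comparing the set of versioning and publishing lookups against the set of files the operating system actually saw opened; a hypothetical sketch:

```python
def audit_render(api_lookups, opened_paths):
    """Cross-check resolved lookup paths against actually-opened files.
    A path that was looked up but never opened may indicate an error;
    opened paths with no lookup correspond to free entries."""
    looked_up = set(api_lookups)
    opened = set(opened_paths)
    return {
        "never_opened": sorted(looked_up - opened),   # possible errors
        "free_entries": sorted(opened - looked_up),   # plain file paths
    }
```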
  • Fig. 3 is a flowchart 30 illustrating a method according to present principles.
  • a model is constructed and aspects such as lighting and camera positions are designed (step 42). Such construction may be performed using the applications noted above, including node-based lighting systems, compositing applications, and the like.
  • a command to render the model file may then be received, and the render started (step 44). In some cases a rendering is performed at the local workstation, and in other cases the render is sent to a rendering queue (step 46). In some cases, a time is noted as to when the render began, e.g., termed VNP APPLICATION TIME (step 47).
  • VNP APPLICATION TIME may be useful in some contexts where it is known at what time a desired image was created. In many cases, however, a director may view a series of prior shots and simply determine on an artistic basis which is preferable.
  • the various assets may be locked (step 54), i.e., so that modifications to the assets, made subsequent to the VNP APPLICATION TIME, are not recorded by the VP-VCR, and so that subsequent modifications to the assets are not included in the rendered image.
  • the model file may be saved into the queue archive (step 56), so that the same is safe from inadvertent overwrites by users.
  • the model file may then be restored by the modeling application later if it is desired to obtain a prior version of a rendering.
  • the system may then record what happens during the render (step 48).
  • the VP-VCR creates a listing of everything that was opened by the render application during the render (step 52), including an indication of the version of the assets opened.
  • the listing (which may be within a file 64 in a database) may include entries from calls to versioning and publishing API 24 (step 58) as well as files opened by the application or applications used by the render process during the render (step 62).
  • Attributes may also be stored in the file 64, and the attributes may include arbitrary output variables (AOVs), i.e., user-defined variables containing data calculated by the system, or other data such as frame range, command used to start the job, camera used, or other such data in the nature of metadata. Generally any data may be stored that may be helpful to re-create the rendering job.
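The per-render listing (the file 64) might be organized as sketched below; the record layout and field names are assumptions for illustration, not the patent's actual schema.

```python
def build_session_record(session_id, vp_entries, opened_files, attributes):
    """Sketch of a per-render record: versioning-and-publishing entries
    (asset, version, resolved path), free entries (paths opened but not
    looked up through the API), and attributes such as AOVs, frame
    range, and camera."""
    vp_paths = {e["path"] for e in vp_entries}
    free_entries = sorted(set(opened_files) - vp_paths)
    return {
        "session_id": session_id,
        "vp_entries": vp_entries,        # API lookups with versions
        "free_entries": free_entries,    # opened outside the VP API
        "attributes": attributes,        # AOVs, frame range, camera, ...
    }
```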
  • Fig. 4 is a flowchart 40 illustrating steps of a method for restoring a prior shot.
  • a requester, e.g., a director, may review prior rendered shots, e.g., using a GUI.
  • a GUI may provide information about the views imaged from metadata stored therein, e.g., including time of rendering, products or assets included (including thumbnails), and the like.
  • once a requester finds a desired shot, the same may be selected.
  • Each rendered shot may have one or more associated session IDs, and the same may then be looked up, along with a filename and path for the model file associated with the session ID (step 68).
  • the GUI may indicate, e.g., a file in a database containing entries that happened during the render, including a notation of all of the prior versions of products or assets used.
  • a select shot may be identified by a filename and path exclusively, with no session ID recorded.
  • a GUI specific to the VP-VCR is not required either, as shots may be viewed using a number of types of viewers.
  • Other variations will also be seen.
  • the model file for the shot, i.e., a composite product with various subproducts or assets, may then be restored.
  • a model file may be restored from a queue archive using a command such as "— restore queue archive" in the modeling/lighting application, e.g., Katana®, Nuke®, or the like.
  • Attributes stored in the queue archive may include the version of the modeling/lighting application used during the render, and thus the restoration may include opening the modeling/lighting application to the correct version.
  • the requester may be presented with a list of assets opened during the render. These assets may be brought online or otherwise obtained access to (step 76). The re-rendering may then occur.
  • the re-rendering occurs by exporting a "product lock table" and using the same in the modeling/lighting application (step 82).
  • a product lock table provides a list of referenced assets for an associated render.
  • the product lock table allows essentially an override of products or assets that would otherwise be referenced (e.g., the latest version of all products or the version at the VNP APPLICATION TIME), with the desired set of products or assets, i.e., prior assets using versions recorded by the VP-VCR.
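The override behavior of a product lock table can be sketched as a lookup that consults the lock table before falling back to normal resolution; the names and table layout are illustrative assumptions:

```python
def resolve_with_lock_table(name, tag, lock_table, resolver):
    """Resolve an asset, letting a product lock table pin it to the
    exact path used in a prior render; anything not in the table falls
    through to normal resolution (e.g., 'latest')."""
    if name in lock_table:
        return lock_table[name]      # locked prior version wins
    return resolver(name, tag)       # otherwise resolve normally
```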
  • the desired render, i.e., the composite product, may then be modified.
  • a first step may be to import a product lock table into a modeling/lighting application (step 86).
  • the product lock table may then be modified (step 88).
  • the product lock table is no longer "all or nothing".
  • Certain objects may be locked but others may be modified.
  • the same scene from a prior rendering may be rendered but with a different color texture on a selected object.
  • an object may be changed to another object, e.g., for purposes of creating a different backdrop with a different building.
  • the files that the free entries point to may also be modified but doing so may be more complicated since the same do not appear in the product lock table. For example, a path of a particular free entry may be modified to point to a different file. Of course, to faithfully recreate a desired scene, neither the free entry nor the file it points to should be modified.
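The "not all or nothing" modification described above, i.e., keeping most entries locked to their prior versions while overriding a chosen few, can be sketched as a selective update; note that free entries live outside the table and would need separate handling. All names here are illustrative.

```python
def modify_lock_table(lock_table, overrides):
    """Return a copy of the product lock table with only the selected
    entries overridden (e.g., a new color texture on one object); all
    other assets stay locked to their prior versions."""
    modified = dict(lock_table)      # leave the original table intact
    modified.update(overrides)
    return modified
```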
  • the model file may be rendered (step 92), e.g., by sending the file to a rendering queue or performing local rendering.
  • Fig. 6 is a flowchart 60 of an alternative implementation according to principles described here.
  • a director or other user may view two shots or images and may further view differences between the shots.
  • the requester reviews and/or selects two shots using an interface, e.g., a GUI associated with the VP-VCR (step 94).
  • Sessions and/or file paths are determined for the shots (step 96).
  • Sets of corresponding assets used for the different shots are then determined (step 98).
  • a difference mode may then be employed to find the differences between the shots (step 102).
  • the difference mode may take a number of different forms. For example, both shots may be portrayed side-by-side (step 97) and a viewer may visually determine differences.
  • in another mode (step 99), objects for which no differences exist may be portrayed in black, and only objects for which differences exist may be visible in the GUI. An indication may be made of the differences between the objects.
  • a difference mode may be employed where two files 64 are illustrated, with the differences between the files indicated textually, e.g., in a "redline" type mode. Other implementations will also be apparent to one of ordinary skill in the art given this disclosure.
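An asset-level difference between two session records, of the kind a textual "redline" mode might display, can be sketched as below; the record layout is the same hypothetical one assumed earlier.

```python
def diff_sessions(record_a, record_b):
    """Compare the asset listings of two renders: which assets changed
    version between the shots, and which appear in only one of them."""
    a = {e["asset"]: e["version"] for e in record_a["vp_entries"]}
    b = {e["asset"]: e["version"] for e in record_b["vp_entries"]}
    return {
        "changed": {k: (a[k], b[k]) for k in a.keys() & b.keys() if a[k] != b[k]},
        "only_in_a": sorted(a.keys() - b.keys()),
        "only_in_b": sorted(b.keys() - a.keys()),
    }
```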
  • Fig. 7 is a flowchart 70 indicating use of systems and methods according to present principles in the creation of a complementary image, e.g., a stereo image.
  • implementations may be particularly pertinent where a 2-D image has been finalized in the past, and a studio has decided to create a 3-D version.
  • a session ID is determined (step 106), and a model file is restored, including bringing assets online if needed (step 108).
  • the model file is restored from the queue archive into a modeling/lighting application, compositing application, or other such application which can construct a scene and render the same using referenced assets. In so doing the prior stored values may be employed to seed a product lock table within the application.
  • a camera product may then be inserted into the constructed scene to create a stereo view (step 112).
  • the camera product is a virtual camera that can construct an alternate view, e.g., one from a viewpoint spaced a few inches away from the initial imaging point, to construct a stereo complementary image.
  • the additional camera product will be identical in its parameters to the camera product which provided the initial view, and the only difference will be the spacing between their imaging planes.
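Creating the complementary camera can be sketched as duplicating the original camera's parameters and offsetting only the horizontal position of the imaging plane; the interocular value and the dict-based camera layout are illustrative assumptions.

```python
def make_stereo_pair(camera, interocular=0.065):
    """Duplicate a camera for the complementary stereo view: identical
    parameters, with the second camera's position offset horizontally
    by the interocular spacing (value here is an illustrative default)."""
    left = dict(camera)                      # original view unchanged
    right = dict(camera)                     # identical parameters...
    x, y, z = camera["position"]
    right["position"] = (x + interocular, y, z)  # ...offset imaging point
    return left, right
```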
  • an entirely different set of cameras may be employed to realize the new view, with the use of stored VP-VCR data solely to provide an initial seed of the product lock table. Use cases for this include situations where the previously-rendered version is selected but an undesired feature exists in the background. In this case, the new cameras may be employed to visualize views lacking that undesired feature.
  • the stereo image desired, e.g., that corresponding to the initial view, may then be created (step 114).
  • Fig. 8 illustrates a computing environment 80 which may be employed to perform steps noted above, or other steps as appropriate to the VP-VCR system.
  • the computing environment 80 includes a processor 116 and various modules.
  • the modules may be contained within a local computing device or may be distributed according to principles of distributed computing.
  • a pre-processing module 118 may perform various functions, e.g., saving the model file in a queue archive.
  • a locking module 122 may perform functions including, once a render is running, locking assets modified subsequent to the start of the render from being included in the render.
  • a recording module 124 may perform the majority of the VP-VCR functions, including recording versioning and publishing API calls as well as file openings.
  • modules may also be employed to perform standard functionality for such a computing environment, these other modules not shown in Fig. 8 for simplicity.
  • Such modules include those appropriate for input and output files, running a modeling/lighting application, interacting with the versioning and publishing API, and the like.
  • One implementation includes one or more programmable processors and corresponding computer system components to store and execute computer instructions, such as to provide the tools for storing information about assets and restoring and modifying sessions to create new rendered shots and scenes.
  • One such computing environment is disclosed below.
  • the computing environment 90 includes a controller 126, a memory 132, storage 136, a media device 142, a user interface 148, an input/output (I/O) interface 152, and a network interface 154.
  • the components are interconnected by a common bus 156.
  • connection configurations can be used, such as a star pattern with the controller at the center.
  • the controller 126 includes a programmable processor and controls the operation of a content creation system 128.
  • the controller 126 loads instructions from the memory 132 or an embedded controller memory (not shown) and executes these instructions to control the system.
  • Memory 132, which may include non-transitory computer-readable memory 134, stores data temporarily for use by the other components of the system.
  • the memory 132 is implemented as DRAM.
  • the memory 132 also includes long-term or permanent memory, such as flash memory and/or ROM.
  • Storage 136, which may include non-transitory computer-readable memory 138, stores data temporarily or long-term for use by other components of the system, such as for storing data or instructions.
  • the storage 136 is a hard disc drive or a solid state drive.
  • the media device 142, which may include non-transitory computer-readable memory 144, receives removable media and reads and/or writes data to the inserted media.
  • the media device 142 is an optical disc drive or disc burner, e.g., a writable Blu-ray® disc drive 146.
  • the user interface 148 includes components for accepting user input, e.g., the user indication of a desired shot or other aspects discussed above, and presenting a display, e.g., of rendered images, to the user.
  • the user interface 148 includes a keyboard, a mouse, audio speakers, and a display.
  • the controller 126 uses input from the user to adjust the operation of the computing environment.
  • the I/O interface 152 includes one or more I/O ports to connect to corresponding I/O devices.
  • the ports of the I/O interface 152 include ports such as: USB ports, PCMCIA ports, serial ports, and/or parallel ports.
  • the I/O interface 152 includes a wireless interface for wireless communication with external devices. These I/O interfaces may be employed to connect to one or more content playback devices.
  • the network interface 154 allows connections with the local network and includes a wired and/or wireless network connection, such as an RJ-45 or Ethernet connection or "Wi-Fi" interface (802.11). Numerous other types of network connections will be understood to be possible, including WiMax, 3G or 4G, 802.15 protocols, 802.16 protocols, satellite, Bluetooth®, or the like.
  • the system may include additional hardware and software typical of such devices, e.g., power and operating systems, though these components are not specifically shown in the figure for simplicity.
  • different configurations of the devices can be used, e.g., different bus or storage configurations or a multi-processor configuration.
  • Exemplary computing environments which may be employed include those pertaining to personal computers, laptop computers, notebook computers, netbook computers, handheld computers, personal digital assistants, mobile phones, smart phones, tablet computers, hand-held gaming devices, gaming consoles, Internet appliances, and also on devices specifically designed for these purposes, in which case the special device would include at least a processor and sufficient resources and networking capability to run the content creation application.

Abstract

The invention relates to systems and methods that allow prior scenes to be recreated, even where assets used in the scenes have evolved over time. The systems and methods query a database to obtain data for backup and retrieval, e.g., information about shots made at prior times, a "shot" referring to a scene or a related set of sequential images (a single image may also be considered a shot in some cases), rendered or not depending on context. In the VP-VCR systems and methods, information may be obtained about rendered scenes that allows each asset used in the scene, including its particular version and representation at the time of rendering, to be known and subsequently used. This may be used not only to obtain prior versions of shots, but also to allow modification of assets in prior shots to achieve new effects.
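The core idea of the abstract — persisting, for each rendered shot, the exact version and representation of every asset used, then querying that record to recreate or modify the shot later — can be illustrated with a minimal sketch. All class, table, and asset names below are illustrative assumptions, not the application's actual implementation:

```python
import sqlite3

class RenderLog:
    """Minimal sketch of a render log: records, per shot and render time,
    every asset version used, and can later return the manifest needed
    to recreate that shot exactly as rendered."""

    def __init__(self, path=":memory:"):
        self.db = sqlite3.connect(path)
        self.db.execute(
            "CREATE TABLE IF NOT EXISTS rendered_assets ("
            " shot TEXT, render_time TEXT,"
            " asset TEXT, version INTEGER, representation TEXT)"
        )

    def record_render(self, shot, render_time, assets):
        # assets: iterable of (name, version, representation) tuples,
        # captured at render time while the versions are still known.
        self.db.executemany(
            "INSERT INTO rendered_assets VALUES (?, ?, ?, ?, ?)",
            [(shot, render_time, a, v, r) for (a, v, r) in assets],
        )
        self.db.commit()

    def manifest(self, shot, render_time):
        # Returns {asset: (version, representation)} for the given render,
        # even if the assets have since evolved to newer versions.
        rows = self.db.execute(
            "SELECT asset, version, representation FROM rendered_assets"
            " WHERE shot = ? AND render_time = ?",
            (shot, render_time),
        )
        return {a: (v, r) for (a, v, r) in rows}


log = RenderLog()
log.record_render("seq10_shot042", "2014-03-14T12:00:00",
                  [("hero_model", 7, "subdiv_mesh"),
                   ("hero_shader", 3, "osl")])
print(log.manifest("seq10_shot042", "2014-03-14T12:00:00"))
```

The same manifest could drive either an exact re-render of the prior shot or a modified render in which selected asset versions are substituted for new effects, as the abstract describes.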
PCT/US2014/027198 2013-03-15 2014-03-14 Method and system for recording information about rendered assets WO2014152313A2 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201911351410.XA CN111125402B (zh) 2013-03-15 2014-03-14 Method and system for recording information about rendered assets
CN201480012544.8A CN105027207B (zh) 2013-03-15 2014-03-14 Method and system for recording information about rendered assets

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US13/842,552 US10339120B2 (en) 2013-03-15 2013-03-15 Method and system for recording information about rendered assets
US13/842,552 2013-03-15

Publications (2)

Publication Number Publication Date
WO2014152313A2 true WO2014152313A2 (fr) 2014-09-25
WO2014152313A3 WO2014152313A3 (fr) 2014-12-04

Family

ID=51533024

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2014/027198 WO2014152313A2 (fr) 2013-03-15 2014-03-14 Procédé et système d'enregistrement d'informations concernant des actifs rendus

Country Status (3)

Country Link
US (1) US10339120B2 (fr)
CN (2) CN105027207B (fr)
WO (1) WO2014152313A2 (fr)

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10303651B2 (en) * 2016-02-25 2019-05-28 Sap Se Load back of archived data
US10086289B2 (en) * 2016-11-22 2018-10-02 Sony Interactive Entertainment America Llc Remastering by emulation
EP3571558A1 (fr) * 2017-02-20 2019-11-27 Siemens Aktiengesellschaft Produits bouclant la boucle
CN109963205A * 2017-12-25 2019-07-02 Shanghai Quan Tudou Culture Communication Co., Ltd. Multimedia clipping method and device
US10732940B2 (en) * 2018-04-27 2020-08-04 EMC IP Holding Company LLC Enterprise services framework for presentation layer management
US20220101619A1 (en) * 2018-08-10 2022-03-31 Nvidia Corporation Cloud-centric platform for collaboration and connectivity on 3d virtual environments
US10740537B2 (en) 2018-11-01 2020-08-11 Dell Products L.P. Enterprise form dependency visualization and management
US20220134222A1 (en) * 2020-11-03 2022-05-05 Nvidia Corporation Delta propagation in cloud-centric platforms for collaboration and connectivity

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070240072A1 (en) * 2006-04-10 2007-10-11 Yahoo! Inc. User interface for editing media assests
US20100281383A1 (en) * 2009-04-30 2010-11-04 Brian Meaney Segmented Timeline for a Media-Editing Application
US20120120054A1 (en) * 2001-05-04 2012-05-17 Jared Sandrew System and method for minimal iteration workflow for image sequence depth enhancement

Family Cites Families (23)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5603018A (en) * 1991-07-15 1997-02-11 Mitsubishi Denki Kabushiki Kaisha Program developing system allowing a specification definition to be represented by a plurality of different graphical, non-procedural representation formats
US6208348B1 (en) 1998-05-27 2001-03-27 In-Three, Inc. System and method for dimensionalization processing of images in consideration of a pedetermined image projection format
US7178106B2 (en) * 1999-04-21 2007-02-13 Sonic Solutions, A California Corporation Presentation of media content from multiple media sources
US6947958B2 (en) * 2001-09-19 2005-09-20 Sony Corporation System and method for documenting composite data products
US20050163462A1 (en) * 2004-01-28 2005-07-28 Pratt Buell A. Motion picture asset archive having reduced physical volume and method
US7873685B2 (en) * 2004-05-13 2011-01-18 Pixar System and method for flexible path handling
US9047915B2 (en) * 2004-04-09 2015-06-02 Sony Corporation Asset revision management in media production
US8219637B2 (en) * 2004-05-14 2012-07-10 Pixar Storage management for renderfarm
US7580986B2 (en) * 2004-05-17 2009-08-25 Pixar Dependency graph-based aggregate asset status reporting methods and apparatus
US7683904B2 (en) * 2004-05-17 2010-03-23 Pixar Manual component asset change isolation methods and apparatus
US7821516B2 (en) * 2004-05-17 2010-10-26 Pixar Automatic pre-render pinning of change isolated assets methods and apparatus
CN100459500C (zh) * 2006-01-18 2009-02-04 Tencent Technology (Shenzhen) Co., Ltd. Method for client software to load function-extension files
US8024356B2 (en) * 2006-02-03 2011-09-20 Autodesk, Inc. Database-managed image processing
JP2010524125A (ja) * 2007-04-12 2010-07-15 Thomson Licensing Operations management solution for media generation and distribution
US8605081B2 (en) 2008-10-26 2013-12-10 Zebra Imaging, Inc. Converting 3D data to hogel data
US8624898B1 (en) * 2009-03-09 2014-01-07 Pixar Typed dependency graphs
US10419722B2 (en) * 2009-04-28 2019-09-17 Whp Workflow Solutions, Inc. Correlated media source management and response control
US8311983B2 (en) * 2009-04-28 2012-11-13 Whp Workflow Solutions, Llc Correlated media for distributed sources
US9477667B2 (en) * 2010-01-14 2016-10-25 Mobdub, Llc Crowdsourced multi-media data relationships
US20120001906A1 (en) 2010-06-30 2012-01-05 Blue Sky Studios, Inc. Methods and systems for 3d animation
US9196074B1 (en) * 2010-10-29 2015-11-24 Lucasfilm Entertainment Company Ltd. Refining facial animation models
US10445398B2 (en) * 2012-03-01 2019-10-15 Sony Corporation Asset management during production of media
CN102929600B (zh) * 2012-06-13 2016-06-29 XJ Electric Co., Ltd. ELF-based version identification method for a monitoring system

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120120054A1 (en) * 2001-05-04 2012-05-17 Jared Sandrew System and method for minimal iteration workflow for image sequence depth enhancement
US20070240072A1 (en) * 2006-04-10 2007-10-11 Yahoo! Inc. User interface for editing media assests
US20100281383A1 (en) * 2009-04-30 2010-11-04 Brian Meaney Segmented Timeline for a Media-Editing Application

Also Published As

Publication number Publication date
CN111125402B (zh) 2024-03-08
US10339120B2 (en) 2019-07-02
CN105027207B (zh) 2020-01-17
CN111125402A (zh) 2020-05-08
WO2014152313A3 (fr) 2014-12-04
CN105027207A (zh) 2015-11-04
US20140279976A1 (en) 2014-09-18

Similar Documents

Publication Publication Date Title
US10339120B2 (en) Method and system for recording information about rendered assets
US8610713B1 (en) Reconstituting 3D scenes for retakes
US9230294B2 (en) Preserving and reusing intermediate data
JP2022553766A (ja) System and method for creating a 2D film from immersive content
US9811936B2 (en) Level-based data sharing for digital content production
US11328470B2 (en) Distributed multi-context interactive rendering
US10990505B2 (en) Stipulated overrides with violation resolution
CN114463104B (zh) Method, device, and computer-readable storage medium for processing VR scenes
US11900545B2 (en) Creating effects based on facial features
US20230215465A1 (en) Visual effect design using multiple preview windows
US9729863B2 (en) Generating content based on shot aggregation
US10445398B2 (en) Asset management during production of media
US11610349B2 (en) Filling empty pixels
US11928078B2 (en) Creating effect assets while avoiding size inflation
US20230229443A1 (en) Synchronizing multiple instances of projects
US20240029381A1 (en) Editing mixed-reality recordings
Lieng et al. Interactive Multi‐perspective Imagery from Photos and Videos
US20240029351A1 (en) Scene tracks for representing media assets
Yudin et al. Millefiori: a USD-based sequence editor
US20130156399A1 (en) Embedding content in rich media
CN116993870A (zh) Project management method, device, apparatus, and computer-readable storage medium
CN114510370A (zh) Panorama-editor-based backup method and device, electronic apparatus, and storage medium
CN116991513A (zh) Configuration file generation method, device, electronic apparatus, medium, and program product
Spadaro et al. Maya 4.5 Bible
US20140129608A1 (en) Distributed production pipeline

Legal Events

Date Code Title Description
WWE Wipo information: entry into national phase

Ref document number: 201480012544.8

Country of ref document: CN

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 14767679

Country of ref document: EP

Kind code of ref document: A2

122 Ep: pct application non-entry in european phase

Ref document number: 14767679

Country of ref document: EP

Kind code of ref document: A2