EP3824440A1 - Computer-implemented method for creating content including computer generated images - Google Patents
Computer-implemented method for creating content including computer generated images
- Publication number
- EP3824440A1 (application EP19759643.0A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- project
- server
- data
- real
- result
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q10/00—Administration; Management
- G06Q10/10—Office automation; Time management
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/20—Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T13/00—Animation
- G06T13/20—3D [Three Dimensional] animation
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L67/00—Network arrangements or protocols for supporting network services or applications
- H04L67/01—Protocols
- H04L67/10—Protocols in which an application is distributed across nodes in the network
- H04L67/1095—Replication or mirroring of data, e.g. scheduling or transport for data synchronisation between network nodes
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2213/00—Indexing scheme for animation
- G06T2213/08—Animation software package
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2219/00—Indexing scheme for manipulating 3D models or images for computer graphics
- G06T2219/024—Multi-user, collaborative environment
Definitions
- Computer-implemented method for creating content including computer generated images
- the present invention relates to a computer-implemented method for the creation, in a collaborative manner and within a unified, real-time process (also called a pipeline), of animated and sound digital sequences comprising computer-generated images.
- the term 2D, generally associated with traditional animation (for example, produced as a succession of hand-drawn images) or with video, is usually contrasted with the term 3D, which corresponds to the creation of animated sequences or still images resulting from calculations generally performed by a computer. It is for this reason that images produced in 3D are called synthetic (computer-generated) images.
- 3D animated sequences can be of two types: precomputed or real-time.
- in pre-calculated sequences, the 3D animation is computed in advance and the content created is then saved in a video file or as a sequence of digital images. Once calculated, the content of the images can no longer be modified.
- real-time sequences are calculated at display time, generally by dedicated processors known as GPUs, or graphics cards, specially designed to compute computer-generated images at very high speed.
- these 2D or 3D animated sequences, whether pre-calculated or real-time, are generally generated at a rate of at least 24 frames per second, regardless of the image size, the number of outputs, and the sound quality produced.
- a stage of creation of 3D models (a human, an animal, a possibly articulated object), a stage also called modeling.
- the appearance of the model, such as the color of its surface or a matt or shiny finish for example, is also defined during this step; this task is known in English as surfacing.
- these elements are called assets.
- a so-called layout stage (a term having no French equivalent for those skilled in the art).
- a so-called animation stage, which consists in animating the elements put in place during the layout using different methods.
- a lighting stage: to be visible, the elements making up the scenes from the layout, filmed from the viewing angles chosen during the layout stage, must be lit.
- a usual process for producing animation content in computer graphics, called a production pipeline, also generally comprises, before the editing step E5, a rendering step during which material and texture effects are applied to the elements of the scenes represented, in order to give them a visual appearance conforming to the desired artistic criteria.
- producing linear 3D animation content requires that it be possible, at any time during the creation process, to make modifications at the modeling, layout, animation, lighting or editing stages, because the results produced at each of these stages have an impact on the final result, in other words on the content produced at the end of the chain.
- a 3D animated feature film project for cinema is the result of tens or even hundreds of thousands of modifications made by all the people contributing to its realization.
- micro-modifications, such as small adjustments to the color of an object or to the length of a shot in an edit;
- macro modifications such as for example deciding to modify the appearance of the main character of the story.
- Macro-modifications are less frequent than micro-modifications but they are more visible and have a greater impact on the creation process.
- the production pipelines of the prior art pose many problems, including:
- a modification made upstream of the production chain requires going through all the stages downstream of the stage where the modification is made before the result can be judged in the context of the final content (at the end of the editing stage).
- the process is similar to that of a domino or cascade effect: a modification upstream triggers a whole series of events downstream until reaching the last stage of the chain.
- the tools used at the various stages of production do not produce results in real time and do not communicate with each other (due to the fragmented nature of the production pipeline), so the changes made cause significant losses of production time. It is not uncommon for a change to take several minutes, several hours, or sometimes several days before it can produce a visible result, depending in particular on its position in the production chain.
- one of the well-known problems is to optimize the process of creating 3D animation content, mainly linear, whether real-time or pre-calculated, by ensuring in particular that a large number of people can work simultaneously on the production of this content.
- a unified, collaborative and real-time pipeline process implemented by computer for the collaborative creation of animation content, characterized in that it comprises, on the one hand, steps of production and broadcasting of animation content in computer-generated images intended to be implemented by a plurality of terminals in cooperation with a central server, and, on the other hand, steps of management of this animation content adapted to allow the central server to centralize and manage all the data produced during the production steps;
- stages of production of said unified real-time process comprising:
- said management steps comprising:
- a step of managing a production history, adapted to ensure the transmission and recording of the result of the implementation of production steps by a terminal to the central server;
- a step of updating the project stored on said server based on said results of the implementation of production steps by a terminal, transmitted during the step of managing the production history;
- a conflict detection step adapted to be implemented on the server so as to detect when at least two production steps have simultaneously created, modified or deleted, directly or via another linked data item, at least one and the same data item stored on the central server;
- a conflict resolution step, implemented when a conflict is detected in the previous step, capable of determining the creation or creations, modifications or deletions to be applied to said at least one data item for which a conflict is detected.
- the method comprises a step of real-time synchronization of the project between the central server and said terminals, so that each terminal implementing the production steps of the process receives all or part of the project data updated according to all the modifications and creations made by all of the terminals and the server, said synchronization step being adapted to be implemented by the server during operation in collaborative work mode and/or by said terminals when they connect to the server.
- the method comprises, for said steps of updating and synchronizing the project between the central server and said terminals, a plurality of data synchronization modules, said plurality of modules comprising:
- a real-time update module adapted to implement a cryptographic encoding function generating a hash key as a function of said project data, said real-time update module being adapted to determine whether imported project data must be saved by said terminals and the server;
- a real-time optimization module able to detect changes in transient states of project data, and adapted to compress the project creation history so as to reduce the amount of data transferred and stored by said terminals and the server;
- a real-time learning module capable of analyzing data from the project creation history and defining an order of priority according to which said server transmits and updates data to said terminals;
- a real-time versioning module capable of preserving the project creation history in the form of a series of backups of the total state of the project and of intermediate revisions relating to these states, the frequency of the backups of the total states being a function of learning data from said real-time learning module;
- a real-time marking module capable of authorizing a user of a terminal to mark with at least one tag a key stage in the development of the project, said marking module making it possible to restore said project to its state at the moment of marking.
- the method further comprises steps for managing access control to prohibit or authorize the implementation of all or part of the steps of production and management at a terminal connected to the server.
- the rights to implement the process can be segmented in order to limit interactions during collaborative work involving many people.
- access control also makes it possible to limit the risk of, for example, accidental modification or deletion of content.
- the conflict resolution step includes the exclusion from the project of a first result of the implementation of production steps by a first terminal, when a second result of the implementation of production steps by a second terminal generated the detection of a conflict, the earlier event being excluded if one of the following criteria is met (a sketch of these rules is given after the list):
- the first result deletes an object that has been deleted, modified, added or referenced by the second result;
- the first result adds an object that has been deleted, added or modified by the second result;
- the first result modifies a property of an object which has been deleted by the second result;
- the first result modifies a single-valued property of an object which has also been modified by the second result;
- the first result adds a reference to an object which has been deleted by the second result;
- the first result adds a reference to an object, or a value for a property of an object that can have several values, which has been added, deleted or changed by the second result;
- the first result deletes a reference to an object, or a value of a property that can receive several values, which has been added, deleted or changed by the second result;
- the first result moves a reference to an object, or a value of a property that can receive several values, which has been added, deleted or moved within the same property by the second result.
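A minimal sketch of how these exclusion criteria might be evaluated, assuming a simplified representation where each result is a list of elementary operations on objects and properties; the data model and all names are hypothetical, not taken from the patent:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Op:
    """One elementary operation contained in a result (illustrative model)."""
    kind: str       # "delete", "add", "modify", "add_ref", "del_ref", "move_ref"
    obj: str        # identifier of the object concerned
    prop: str = ""  # property concerned, if any

def first_excluded(first_ops: list[Op], second_ops: list[Op]) -> bool:
    """Return True if the first result must be excluded from the project.
    Only the first four criteria are shown; the reference and multi-valued
    cases follow the same pattern."""
    for f in first_ops:
        for s in second_ops:
            if f.obj != s.obj:
                continue
            # 1) the first result deletes an object the second touched in any way
            if f.kind == "delete":
                return True
            # 2) the first result adds an object the second deleted, added or modified
            if f.kind == "add" and s.kind in ("delete", "add", "modify"):
                return True
            # 3) the first result modifies a property of an object deleted by the second
            if f.kind == "modify" and s.kind == "delete":
                return True
            # 4) both results modify the same single-valued property
            if f.kind == "modify" and s.kind == "modify" and f.prop == s.prop:
                return True
    return False
```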
- the method comprises an automatic learning module adapted to optimize the sequence of loading data into the memory of said terminals in order to render the content of the project as animated and sound images in real time on said terminals, based on the data from the project creation history, the project data, and metadata generated by said terminals.
- said steps of production and broadcasting of animation content comprise a step of real-time display of said animation content on an augmented reality device, such as a smartphone or a tablet, connected to said server.
- the method implements a step of creating a virtual camera adapted for an augmented reality device, said step of creating a virtual camera being implemented after said step of opening and editing at least one 3D scene.
- the invention also relates to a server device comprising a network interface, a storage memory and a processor for implementing at least the management steps and/or the steps of producing and broadcasting the animation content of the method as described previously.
- the invention also relates to an augmented reality assembly, comprising a server device as described above and an augmented reality device, such as a smartphone or a tablet, said server device implementing the steps of producing and broadcasting the animation content of the method as described above.
- the invention also relates to a computer terminal controlling a man-machine interface adapted to execute and/or carry out at least the production steps of the method described above, and comprising a network interface for communicating with said server device described above.
- the invention also relates to a computer system comprising a server device as described above and one or more computer terminals as described above.
- the invention also relates to a storage medium readable by a computer, for example a hard disk, a mass storage medium, an optical disk, or any other suitable means, having recorded on it instructions which control a server device and / or a computer terminal to execute a method as described above.
- FIGS. 1 and 2 are schematic views of a production pipeline of the prior art;
- FIG. 3 is a schematic view of the interactions between stages of production of a process according to an embodiment of the invention.
- FIG. 4 is a graphical representation of an animation content project in computer graphics
- Figure 5 is a representation of a 3D scene known from the prior art, represented in its most conventional form by a series of objects or 3D models, called assets, each comprising properties allowing its appearance to be modified;
- Figure 6 is a schematic view of the organization of the data of a project content of a method according to an embodiment of the invention;
- FIGS. 7 to 16 are simplified views of user interfaces of an implementation of the method on a computer according to an embodiment of the invention;
- FIG. 17 is a schematic view of a synchronization step according to an embodiment of the invention.
- FIG. 18 is a schematic view of a group of broadcasting and distribution steps according to one embodiment of the method.
- FIG. 19 is a schematic view of a group of broadcasting and distribution steps according to another embodiment of the method.
- the invention relates to the design of a process dedicated to the creation, production, broadcasting and distribution of linear animation content or, more generally, to the creation and distribution of animated and sound sequences using a variety of sound and graphic sources which can be combined together, such as, mainly, computer graphics (also called 3D content), but also digital images and videos (so-called 2D content), in a process, or pipeline, which is at once unified, real-time, collaborative, and connected to other methods of creating real-time animation content, especially for augmented reality.
- the animated and sound sequences generated by this invention can either be pre-calculated and saved in video files, for example, or calculated on the fly, which allows them to be used on augmented or virtual reality systems or on any other existing or future display or broadcast system (such as streaming) for which sound and animated sequences in computer graphics must be calculated on the fly, in real time (real-time 3D visualization).
- the method implemented by computer according to the invention comprises a plurality of production steps leading to the production of content, which can be implemented in parallel and independently of each other.
- the method according to the invention is also called a Collaborative Unified Pipeline, referred to in the remainder of this document by the acronym CUP.
- the term user of the process refers to any person, or any group of people, acting on the process implemented by computer according to the invention, via a computer or any device able to communicate with the computer implementing all or part of the process.
- the method according to the invention comprises two groups of main production steps: creation and editing steps, and broadcasting steps.
- the first group of production steps is generally implemented on user terminals, while the second group of steps is in this embodiment implemented on the server.
- the method further comprises a third group of steps called management steps, these steps being jointly implemented by the terminals and the server, and notably comprising the history and conflict resolution steps, which will be described later.
- the first group of production steps, called creation and editing functions, includes all of the steps E1-E5 of the production of 3D animation content, with reference to FIG. 1, from the modeling step E1 to the final editing step E5, as described in the prior art.
- creation of new 3D scenes, which may however contain a mix of other sources (such as digital images, videos, etc.), and creation of new sequences;
- opening and editing of 3D scenes created in the previous step.
- the user of the process can model directly in the tool or import 3D models created with other solutions, choose viewing angles (via virtual cameras placed in the scene), add lights, and animate all the objects in the scene (models, cameras, light sources, etc.) and all the properties of these objects (such as the color of a 3D object).
- a scene can also contain several versions of the animation (or animation takes, in the terminology of those skilled in the art);
- opening and editing of sequences created in step 1.2: the user of the process can edit the content of a sequence.
- the process involves putting a set of clips end to end as in any video editing solution.
- the invention uses as shots not videos but the 3D scenes created in the previous step 1.3. For each scene used in the edit of the sequence, the user must at least specify the camera, i.e. the angle of view from which this scene will be calculated.
- a shot in this system is defined at least by a 3D scene, the version of the animation that must be used for this scene when it is played, a viewing angle from which this scene is filmed when it is played in the edit, and usual editing information such as the position of the shot in the edit, its duration, and its in and out points (see the sketch below).
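A minimal sketch of such a shot record, in Python with hypothetical field names and units (the patent does not specify them):

```python
from dataclasses import dataclass

@dataclass
class Shot:
    """A shot in a sequence edit, as defined above (illustrative only)."""
    scene_id: str    # the 3D scene played by this shot
    take_id: str     # version of the animation (take) used when playing it
    camera_id: str   # viewing angle from which the scene is filmed
    position: int    # position of the shot in the edit, e.g. in frames
    duration: int    # duration of the shot, in frames
    in_point: int    # entry point within the scene
    out_point: int   # exit point within the scene
```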
- the content of the project is calculated in 2D to be projected on the screen of the computer on which the system is running (it may also be the screen of a tablet or smartphone or any projection device connected to the computer).
- this content is calculated to be broadcast on virtual and augmented reality systems.
- the content can be calculated on the fly on one or more processors and the result, the video and audio output, broadcast (or streamed according to a neologism from English frequently used by those skilled in the art) to another electronic / computer device.
- this screen or navigation system was designed to bring together all the stages of the production of 3D animation content in a single process, in other words in a single solution, in order to allow the user to work on any aspect of the film (layout, editing, lighting, animation, etc.) in parallel and with real-time feedback (next point).
- the process according to the invention is a real-time process. To do this, the process relies on two elements:
- the unified pipeline solution described above (1) takes advantage of the capabilities of graphics processors (GPUs), which have been designed to accelerate tasks such as the calculation of computer-generated images or the calculation of deformations of animated 3D objects, whose processing lends itself well to massively parallel computing architectures.
- the IT processing resources required by the solution are much greater than those required by a software solution designed for text editing (the solution falls into the category called "data-intensive computing"). It is therefore preferable to be able to exploit all the available resources of the IT/electronic device on which the solution is executed, which implies combining the computing capacities of the CPU(s) and GPU(s) available on the IT device where the solution is put into practice.
- the method according to the invention is adapted so as to allow a collaborative mode of operation.
- the method according to the invention allows several users to work at the same time on the same content / project / 3D animation film and to see in real time the changes made by all of these users.
- the process therefore allows simultaneous collaborative work to be done remotely.
- the method is partially implemented on a server which centralizes the data, while another part of the method is implemented on terminals, for example desktop computers, tablets or smartphones.
- the centralized part of the process is common to the implementation of the process by all of the terminals working on the same 3D animation content project.
- the users have access to the project data thanks to a software solution (hereinafter called the client application) executed on their terminal, in other words on their computer device which the user uses for work.
- the terminal is a complete computer processing unit which has one or more central and graphics processing units as well as one or more video and audio output devices, which make it possible to display images and play sounds on a variety of devices (computer screen, virtual reality headset, speaker, headphones, etc.).
- a remote application, hereinafter called the server application, is executed on the server S.
- the project data is encapsulated according to a protocol specific to the invention.
- any standard protocol can be used (such as TCP/IP, the protocol used for data exchange on the Internet).
- the terminal and the server form a local network (called LAN for Local Area Network).
- the terminal and the server belong to different networks but can nevertheless communicate using an Internet type connection for example.
- they form a so-called wide area network (WAN).
- a work session is created as soon as at least one client is connected to the server; the number of client applications connecting to the server in a work session has no upper limit.
- this division allows the client application C1 to send to the server application all the modifications made to a project P from terminal T1.
- when it receives a modification, the server application S performs two tasks: 1) it applies the modification to its own version of the project; 2) it broadcasts the modification to all of the clients with which it shares a connection (C2, C3, ..., CN), with the exception of the client from which the modification comes, which allows these client applications to apply it to their own version of the project.
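A minimal sketch of this relay behavior, assuming a simple in-memory server holding its own copy of the project and a list of client connections; all names are hypothetical:

```python
class ServerApplication:
    def __init__(self, project):
        self.project = project  # the server's own version of project P
        self.clients = []       # connected client applications C1..CN

    def on_modification(self, modification, origin):
        # 1) apply the modification to the server's own version of the project
        self.project.apply(modification)
        # 2) broadcast it to every connected client except the one it came from
        for client in self.clients:
            if client is not origin:
                client.send(modification)
```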
- the term client application refers to all of the steps of the method implemented on the user's terminal, as opposed to the server application, corresponding to the steps implemented on the central server S.
- when a new client application is launched and the version Pc of the project on the terminal disk is not the same as the version Ps on the server, the server application proceeds to a synchronization step, during which it sends to the client application all the modifications necessary for updating the project Pc, so that at the end of the process the project Pc is identical to Ps.
- the method implemented further comprises a history function, a connected function and distribution functions.
- the history function is implemented so that all the modifications made to the project by all the users acting on their remote terminals, simultaneously or not, since its creation are saved on the server S: each time a user makes a modification, whether minor or major, this modification is sent to the server, which saves it locally before in turn broadcasting it according to the process which has just been explained. The user of the process therefore does not technically need to save the modifications made in order to preserve his changes.
- the data of the history of changes of a project are saved in the memory of the server application but also on the storage space, for example in a hard disk, which is associated with it.
- Historical data can be divided into two main categories.
- the first type of data is represented by all the assets imported or created in the project by users.
- asset is an English term well known to those skilled in the art, covering in particular 3D models, images or textures, sounds, videos, materials, etc.
- this data is described as blobs (Binary Large Objects), which can range from a few kilobytes to more than a thousand megabytes (gigabytes).
- These blobs are stored on the server storage space in the form of binary data.
- the second type of data is represented by objects to which properties are attached.
- an object is defined as a set of key-value pairs called properties.
- each property contains either a value, a reference (to a blob or another object), or is multivalued (a list of values or references).
- an object of type "3D object" references a blob containing the mesh information of a 3D model and has properties such as the position of this object in a 3D scene, as well as references to the materials assigned to it.
- a material is another example of an object: it contains information about the appearance of a 3D model, such as its color or its glossiness; each of these properties can contain either a constant value or a reference to a texture-type blob.
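A minimal sketch of this data model, with hypothetical names: objects are sets of key-value properties, and each property holds a constant value, a reference (to a blob or another object), or a list of values or references:

```python
from dataclasses import dataclass, field

@dataclass(frozen=True)
class Ref:
    target: str  # hash of a blob, or identifier of another object

@dataclass
class Obj:
    """An object: a set of key-value pairs called properties."""
    properties: dict = field(default_factory=dict)

# illustrative "3D object": a mesh blob reference plus scene properties
cube = Obj(properties={
    "mesh": Ref("sha-of-mesh-blob"),            # reference to binary mesh data
    "position": [0.0, 1.0, 0.0],                # constant value
    "materials": [Ref("mat-1"), Ref("mat-2")],  # multi-valued property
})
```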
- the amount of memory required to store this type of information on the server disk is relatively negligible compared to the size of the blobs.
- Blob data is more rarely added to the project than data of the second type, the editing of which is much more frequent.
- Changing the color of an object can be done in an iterative process in real time, leading to tens or even hundreds of changes per second. This data can be seen as representing the creation history of the project.
- the project therefore consists of an editing history comprising steps 1-8 and two blobs, saved both on the server's hard disk and on that of the client application, in what we call in the invention a blob store.
- the 3D object is imported from the server blob store, saved in the client application blob store and added to the scene.
- the texture is imported from the server blob store, saved in the client application blob store and added to the scene.
- the method according to the invention comprises modules adapted to the collaborative function and to the generation of the history of the solution which results therefrom. These modules are:
- update management: a distinction is made between 1) updates in the context of a real-time collaborative work session (i.e. when several client applications are already connected to the server) and 2) updates made when an application connects to the server after a period of disconnection:
- when a client application connects to the server after a period of disconnection (or for the first time), it obtains from the server a project status (via a module described below) which allows it to establish the list of all blobs that have been added to this project since its last connection. The client application is then able to determine the missing blobs by comparing this list with the list of blobs already present in its local blob store. Only the missing blobs are thus sent by the server application to the client application.
- pull: it is the client that requests the list of data it needs to update itself, as sketched below.
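A minimal sketch of this pull-type update, under the assumption that the project status exposes the list of blob hashes; the API names are hypothetical:

```python
def pull_update(server, local_store: dict) -> None:
    """Client-side update on (re)connection: fetch only the missing blobs."""
    status = server.get_project_status()  # lists hashes of all project blobs
    missing = [h for h in status.blob_hashes if h not in local_store]
    for h in missing:                     # only missing blobs are transferred
        local_store[h] = server.fetch_blob(h)
```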
- blob management: when binary-type data is added to the project, it sometimes happens that a user adds it several times. Binary data is sometimes grouped in the form of "bundles" or packets.
- a bundle B1 comprising mesh information describing a 3D model and three textures (T1, T2 and T3).
- the user may import into the project a new version of the “bundle” B1 which we will call B2.
- the method according to the invention therefore comprises a module making it possible to calculate, on the basis of the information that the blob contains, a unique signature.
- a SHA-type key or hash value, that is to say a key obtained by a cryptographic hash function, which generates a unique fingerprint for each blob in the project.
- the function that generates this hash value uses the binary data of the blobs as input.
- the method compares the hash value obtained with those of each of the blobs contained in the blob store: if an identical hash value is already present, the imported blob is a duplicate and does not need to be stored or transferred again.
- the storage capacity and Internet traffic on the server side are limited. The user must therefore make good use of this storage capacity and bandwidth, and the system must inform him of the impact that an import operation can have on the use of the disk-space and traffic quota reserved for him.
- when the user imports blobs into the project, the client application proceeds in two steps, as sketched below:
- This two-step import module was created in the context of the development of the collaborative functions of the method according to the invention and is specific to it.
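The individual steps of this import are not spelled out at this point in the text; the following sketch is a reconstruction from the surrounding description (hash-based duplicate detection and quota information before any transfer), with hypothetical names:

```python
import hashlib

def import_blob(data: bytes, server, local_store: dict) -> str:
    """Two-step import (reconstruction; the exact steps are assumptions).
    Step 1: compute the blob's signature and check for duplicates.
    Step 2: transfer the binary data only if the blob is new."""
    h = hashlib.sha256(data).hexdigest()  # unique fingerprint of the blob
    if h not in local_store:
        local_store[h] = data             # save locally only once
    if server.has_blob(h):                # duplicate on the server side:
        return h                          # nothing to transfer
    # inform the user of the impact on the disk-space and traffic quota
    print(f"import will consume {len(data)} bytes of the reserved quota")
    server.send_blob(h, data)
    return h
```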
- management of the creation history: in the example given above, several stages of the history represent successive and potentially very rapid modifications, for example of the order of a few milliseconds, of a property of the project or of the client application.
- stages 5, 6 and 7, the three consecutive changes made to the color of an object, are very rapid.
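A minimal sketch of the kind of compaction the optimization module could apply to such transient states, assuming revisions are records carrying the object, the property, and the new value (a hypothetical model): runs of consecutive revisions of the same property are coalesced so that only the final value of the run is kept:

```python
def compress_history(revisions: list) -> list:
    """Coalesce consecutive revisions touching the same property of the same
    object, keeping only the last value of each run (illustrative sketch)."""
    compressed = []
    for rev in revisions:
        last = compressed[-1] if compressed else None
        if last and last.obj == rev.obj and last.prop == rev.prop:
            compressed[-1] = rev  # drop the transient intermediate state
        else:
            compressed.append(rev)
    return compressed
```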
- Step 1: the client application calculates the hash value HB of a blob B and sends a request to the server application telling it that it is about to send it a blob with hash value HB. Immediately afterwards, the client application begins to send the data of blob B in relation to this request.
- Step 2: when the server stops receiving data associated with blob B, it considers that the data has been transmitted in full. It then calculates the hash value H'B of the server-side blob using the same algorithm as the client application. If the hash value H'B is equal to that sent by the client application (HB), the server is guaranteed that the data in blob B is complete and intact, and the process stops at this stage. If the value is different, the server goes to step 3.
- Step 3: as soon as the server application re-establishes a connection with the client application, it sends a request asking it to return the data for the blob whose data is incomplete.
- the server application sends the same request to all client applications sharing the same project, in case one of the client applications possesses the same blob and the client application that initially sent the blob is unavailable.
- the server application repeats step 3 until it obtains a complete copy of the blob data. Additionally, the server does not send blob B data to other client applications until the server-side backup is complete.
- this module, which incorporates the hash value used for managing the blob store, is important because it guarantees the reliability of the collaborative function of the process according to the invention (a sketch of the server side is given below). Furthermore, the previous case describes the module when the data passes from the client application to the server application, but the same module is used when data is transmitted (in the case of an update, for example) from the server application to any client application.
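A minimal sketch of the server side of steps 1 to 3 above, assuming the blob arrives as chunks tied to the initial request; the names are hypothetical:

```python
import hashlib
from typing import Iterable, Optional

def receive_blob(declared_hash: str, chunks: Iterable[bytes]) -> Optional[bytes]:
    """Recompute the hash of the received data (step 2) and accept the blob
    only if it matches the hash declared by the client in step 1."""
    data = b"".join(chunks)  # reception considered complete
    if hashlib.sha256(data).hexdigest() == declared_hash:
        return data          # complete and intact: save and redistribute
    return None              # mismatch: re-request the blob (step 3)
```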
- the blobs are transmitted from the server application to the client applications according to criteria determined by the client applications.
- the client application detects the part of the project on which the user first works, for example a particular scene of the project and transmits this information to the server application which will send the blobs contained in this scene before the others.
- this prioritization process sends the "useful" information to the user first, which reduces waiting time and improves their experience.
- the server application is able to know, and therefore to learn, on the basis of the creation history of the project, the working habits of each user working on the project, and more generally the involvement of each user of the system in different projects.
- this machine-based learning process provides a decision matrix allowing the server application to adopt, in real time, a strategy for prioritizing the delivery of blobs that is best suited to each user and each client application.
- the server application therefore maintains a list of priorities for sending blobs per user for each client application (desktop computer, augmented reality application on a smartphone, etc.). If the result of the machine-learning process indicates, for example, that a user is working more particularly on one of the project assets than on another, all the data related to this asset will have its priority increased in the blob-sending list.
- the server sends them to the client application in descending order of priority (see the sketch below). This process takes place in real time (the priorities for each client application and each user of the system are continuously updated) and the priority list may change during the sending of blobs (due to new information communicated to the server by the client).
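A minimal sketch of this priority-driven sending, using a max-priority queue per client; the priority values would come from the learning module described above, and all names are hypothetical:

```python
import heapq

def send_pending_blobs(server, client, priorities: dict) -> None:
    """Send the blobs pending for a client in descending order of priority.
    In the real process the queue is re-ranked continuously; this sketch
    ranks once for simplicity."""
    queue = [(-priorities.get(h, 0), h) for h in server.pending_blobs(client)]
    heapq.heapify(queue)  # negated keys: highest priority pops first
    while queue:
        _, h = heapq.heappop(queue)
        client.send(h, server.blob(h))
```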
- a group of users works on a project for a relatively long period, for example several weeks or several months.
- the project is complex: the history includes tens or even hundreds of thousands of state changes called revisions and the blob store includes several tens of gigabytes of data.
- the server will have to send all the blobs (including those which are potentially no longer used in the most recent version of the project) and the entire content of the project history to the new user; in order to bring the project on the user's system to the state Es, all revisions since the creation of the project would then have to be executed on the client application, which can take a long time for a project with a large history, as is the case in our example.
- the server application automatically and regularly saves a total state of the project.
- This total state can be seen as a version or image (snapshot in English) of the project at time t.
- the fact that there is a backup of the total state of the project at time t is also preserved in the creation history of the project. Thanks to this method, the server only has to send to the new user U the data relating to the last backup of the total state of the project, ETotal, then all the modifications carried out on the project since the moment when ETotal was generated.
- the modifications made to a project property are always defined in the creation history, as relating to the state in which this property is in the last backup of the total state of the project.
- the server performs a full backup of the project when the number of revisions since the last backup of the total state exceeds the value QR set by the system.
- to restore the project to a given moment, the server application sends the client application update information using the total project state backup most immediately prior to the desired restoration time, then sends the additional modifications made from the moment this backup was performed until the desired restoration time TRestore.
- the project is then restored to the state in which it was at the time TRestore.
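A minimal sketch of this restoration, assuming the history exposes time-stamped total-state backups (snapshots) and intermediate revisions; all names are hypothetical:

```python
def restore(history, t_restore: float):
    """Restore the project to its state at time t_restore: load the most
    recent total-state backup made before that time, then replay the
    revisions recorded between the backup and t_restore."""
    snapshot = max((s for s in history.snapshots if s.time <= t_restore),
                   key=lambda s: s.time)
    project = snapshot.load()   # total state ETotal at snapshot time
    for rev in history.revisions:
        if snapshot.time < rev.time <= t_restore:
            project.apply(rev)  # replay intermediate modifications
    return project
```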
- the user can leave markers, or tags, in the creation history, allowing him to mark stages considered key to the project's evolution. These tags make it possible to quickly restore versions of the project and in particular to establish visual or other comparisons between different states of the project in a simple and fast way.
- this special tag is added when the project has not been modified by any of the client applications after a period QT. Indeed, we consider in this case that an absence of change for a long period of time indicates that the project is potentially in a satisfactory or stable state for the users of the method according to the invention, and that this state therefore deserves to be preserved.
- the method according to the invention therefore comprises a set of modules collaborating with each other, in the main embodiment according to an interdependent operation, built on common concepts such as for example the calculation of the hash value and exploiting the collaborative function of the invention.
- the method includes in particular:
- a real-time module A for synchronizing project data on client applications, implemented either by the server in the case of a collaborative work session (push) or by client applications when they connect to the server (pull),
- a real-time module B which, based on a hash value calculated on the binary data of the blobs by means of a cryptographic encoding function, makes it possible to decide whether imported blobs should be added or not to the blob store,
- a real-time module F preserving the history of project creation in the form of a series of backups of the total state of the project and of intermediate revisions relating to these states,
- the frequency of the backups of the total states being determined by an algorithm based on a real-time machine learning module exploiting the historical data,
- a real-time module G allowing the user of the process, through a user interface of the client application, to mark with tags the key moments in the evolution of the project and to go back in time, using these tags or not, to easily restore the project to previous states, from which comparisons between different project states can be presented to the user.
- an optimization module for the content management and viewing process, based on a learning system using, among other things, historical data.
- One of the functions of the method according to the invention is to view, in real time or delayed, 3D animation content constituted by a set of plans or clips organized in the form of sequences, as described above.
- the objective is to allow users of the process, on the client application side, to create the most complex content possible while maintaining the best viewing performance, that is to say with an image and sound display frequency greater than or equal to 30 frames per second, for an image definition greater than or equal to the image format standard known as Full HD.
- the device implements a module making it possible to optimize the process by which the data forming the content to be viewed are transformed into animated and sound images.
- the collaborative use of the unified device generates large amounts of information on 1) the use of the device itself and 2) the content produced by the users of the device.
- this information collected by the server application includes 1) the creation history of the project, 2) metadata related to the use of the device and to the editing of the project, and 3) all the data of which the project consists.
- the creation history has already been presented above.
- Metadata can include metrics such as the memory size required to load a given 3D scene, the time it took to load this scene, the frequency of use of a scene or a given asset (how many times has it been edited by system users, how often does the scene appear in the sequences of the final animation content), etc.
- users of the device also frequently view the different scenes and sequences making up the project; during these viewings, it is possible to collect information on the calculation time of each image for each scene, on the memory usage, which may vary depending on what is visible on the screen, etc.
- this data can vary depending on the client application used (that for a smartphone or a desktop computer, for example) and the characteristics of the GPU and CPU of the system on which the client application is executed (memory quantity, number of cores, etc.).
- the process by which this content is calculated to be rendered is predictable since, in the most common use case of the process, it is linear (in contrast to so-called interactive content, whose content changes as it is played, as for example in the case of a video game).
- a machine learning algorithm uses all of the information available to it on the project (creation history, metadata collected over time per hardware configuration used, as well as the project data) to plan (schedule) and optimize the use of the resources of the system on which the client application is executed, in order to best guarantee that the content of the project is played in the required conditions (image resolution, number of frames per second, etc.).
- this resource optimization planning module defines the part of the method according to the invention here called the film engine, which will be referred to below under the term movie engine.
- the method, during an information-gathering step, detected that a scene S3 requires 3 seconds to be loaded and 5 GB of memory on the GPU; this scene appears 10 seconds after the start of the animation sequence (a sequence Q1 made up of 10 shots referring to 3 different scenes S1, S2 and S3): the process is therefore able to plan the loading of the scene at least 3 seconds before it is necessary to display it on the screen, and to ensure that at least 5 GB of memory on the GPU is free by the time the loading process begins, even if this means removing from memory, if necessary, data that the method does not immediately need (see the sketch below).
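A minimal sketch of the kind of preloading plan described in this example, assuming per-scene metrics (load time, GPU memory) collected by the server; names and units are hypothetical:

```python
def schedule_preloads(sequence, metrics: dict) -> list:
    """Plan scene loading so each scene is resident in GPU memory before it
    must be displayed (illustrative sketch of the movie engine's planning)."""
    plan = []
    for shot in sequence.shots:
        m = metrics[shot.scene_id]  # measured load time and memory need
        # start loading at least m.load_time seconds before the scene appears,
        # after freeing m.gpu_memory bytes of GPU memory if necessary
        start = shot.start_time - m.load_time
        plan.append((start, "load", shot.scene_id, m.gpu_memory))
    return sorted(plan)  # executed in chronological order
```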
- the method therefore comprises a movie engine, the operation of which according to the main embodiment of the invention is as follows:
- the server application collects the following information:
- the server processes and reorganizes this information into information packets relevant to the movie engine which are then sent to client applications (on the same principle as sending blobs),
- the movie engine refines, substantially permanently and preferably at a frequency higher than the frequency of user events on the terminals, in real time, its answer to this problem based on the information sent to it continuously by the server application.
- the movie engine explores different strategies by adapting the parameters of the problem in order to optimize its response in a process of continuous self-learning. So, while a configuration S1 may seem more efficient to a user of the system than a configuration S2, the movie engine can discover, within the framework of this learning process, that S2, by means of modifications to specific parameters of the system, is actually more efficient than S1. For example, one strategy may be to prioritize loading and unloading data between the client storage space and GPU memory as quickly as possible, rather than preserving as much data as possible in GPU memory for as long as possible.
- the movie engine has alternative strategies to work around system limitations if the planning process is not sufficient to guarantee uninterrupted viewing. According to one embodiment of the method according to the invention, it can:
- the method according to the invention is also a connected method. Indeed, the collaborative nature of the project requires that the version of the project located on the server S, to which the client applications are connected, be the reference version of the project. Having all the project data (including its history) on the server makes it possible to develop a set of satellite applications that can either access the project data from the server or interface directly with the client application.
- the method according to the invention comprises a step of connecting a satellite application to the server.
- a software solution is executed on a smartphone or a tablet equipped with augmented reality functionality.
- a satellite application connected to the server S reads the data of the project P on the server S and displays the different scenes of the project in the smartphone application.
- the user of the application can then select one of the scenes of the project, then, using the augmented reality system, deposit the content of this virtual scene Sv on any surface of the real world whose position and orientation in 3D space the phone is able to determine (in the simplest case this is often a horizontal or vertical surface, such as the surface of a table or that of a wall).
- the 3D virtual scene Sv is then superimposed on the video stream coming from the phone's camera.
- the purpose of this device is to allow the user to play the 3D Sv scene and to be able to film it in augmented reality.
- the application offers a camera recording function.
- the movements of the camera in 3D space (which are provided by the augmented reality system) and the images of the video stream created are stored in the memory of the smartphone.
- the camera movements, the video images and all the other auxiliary data created by the augmented reality system (such as the so-called tracking points, which are points of the real scene used by the augmented reality system to calculate the position and rotation of the smartphone in space) are saved in the project P on the server.
- This acquisition method notably allows the user to create animated virtual cameras for a 3D animation content project using a consumer device (smartphones or tablets).
- the computer-implemented method also includes, according to a particular embodiment, a step of connecting a real-time application to the client application.
- a motion capture system making it possible to record the positions and rotations of objects or of the limbs of living beings (body and face), in order to control a virtual counterpart on the computer (camera, 3D model, or avatar).
- This method makes it possible to directly record the data captured on the server S in the project P; they are then immediately accessible to all client applications connected to the server.
- the method according to the invention further comprises a second group of production steps, called broadcasting steps.
- broadcasting steps include in one embodiment a local calculation and broadcasting step.
- the user of the client application C, which runs on a computer device of the desktop or laptop type equipped with one or more central processing units (CPUs) and one or more graphics processing units (GPUs), can calculate animated sound sequences on the fly (in real time), locally, using the resources of these different processing units.
- the video and audio outputs created can be redirected to any viewing device connected to the computer such as a 2D screen and speakers or a virtual reality headset.
- the information concerning the position and rotation of the camera provided by the viewing system is taken into account in the creation of the video and audio streams, creating a feedback loop: the augmented or virtual reality system, for example, provides 3D information on the position and rotation of the real camera (smartphone or virtual reality headset) which allows the corresponding video and audio streams to be calculated on the computer to which it is connected, these streams being themselves connected to the respective video and audio inputs of the virtual or augmented reality system.
- the calculation and streaming are carried out remotely, from the server S.
- audio and video streams can be saved in a video file.
- video and audio streams are calculated on the fly, dynamically, and are broadcast (streamed) to another computer or electronic device connected to the server by a LAN (local) or WAN (Internet, for example) type network.
- in this version of the invention, it is possible to calculate a finalized version of the project, whether offline or in real time, on as many processing units (CPU and GPU) as the infrastructure of the data center to which the server is locally connected allows (note that insofar as the data of the project P on the server S is on the local network to which the processing units are also connected, access to this data is fast). It is also possible to use any 3D synthetic image computation solution to create the finalized images of this content.
- a user can adapt the speed with which a finalized version of the film is calculated by increasing the number of processing units dedicated to this task.
- a user of the process can access remote computing resources much greater than those available on his local computer (or on the electronic device he uses to view content, such as a smartphone or a tablet) in order to obtain a finalized version of the real-time content of much higher quality than the version he could obtain by exploiting the resources of his own computer.
- This latter embodiment of the invention allows the computing resources (the GPUs and the CPUs used in the calculation of the content) to be physically separated from the device for viewing this content.
- This device is different from the case of streaming video from a server to, for example, a smartphone via the Internet, where the content of the video is pre-calculated.
- it is a matter of calculating the content of the animation project on the fly, at the very moment when it is broadcast (streamed) to this smartphone.
- a protocol suitable for streaming real-time content, such as RTSP (Real-Time Streaming Protocol), is required.
- the content of the project is calculated live, dynamically on demand.
- the content of the project can be changed at the very time of its broadcast (which is of course not the case for a video).
- This is particularly necessary when the user of the process controls the camera as in the case of virtual and augmented reality.
- the virtual or augmented reality system and the server S are connected by a LAN or WAN network (Internet for example).
- the data on the position and rotation of the camera created by the virtual or augmented reality device are therefore sent to the server S via this network (input); this information is then used to calculate live, on the fly, a video and audio stream on the server S, which is returned (output) to the device over the network from which the camera information came (see the sketch below).
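A minimal sketch of this server-side feedback loop, with hypothetical network and renderer interfaces:

```python
def remote_render_loop(connection, renderer) -> None:
    """Receive the real camera's position and rotation from the AR/VR device
    (input), compute the matching frame live on the server, and stream the
    video and audio back over the same network connection (output)."""
    while True:
        pose = connection.receive_camera_pose()  # position + rotation
        frame, audio = renderer.render(pose)     # computed on the fly
        connection.stream(frame, audio)          # returned to the device
```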
- one or more processing units are assigned to each person watching a version of the animation content, so that each calculation group can create a different version of this content from the same project on the server S.
- This solution is similar to the concept of asynchronous multi-casting in the broadcasting or streaming industry except that in the case described here, each video and audio stream generated and broadcast to each client connected to the streaming server is unique.
- this information can be saved at the pixel level in the form of metadata, thus transforming the pixels into "intelligent pixels".
- the method according to the invention thus responds to the technical problems linked to the fragmented nature of non-real-time production pipelines, the absence of a collaborative solution, and the disconnection of the production process from the process of broadcasting 3D animation content (whether real-time or not).
- the method according to the invention solves all of these technical problems in a global process dedicated more particularly to the creation of real-time 3D linear animation content for traditional media (television and cinema) and new media (virtual and augmented reality).
- the present method according to the invention is designed specifically for the creation of linear animation content or more generally 3D narrative content which may contain elements of interactivity but which in any case are clearly distinguished from a video game.
- one or more users can, in a unified process, work in real time and in parallel on any aspect of 3D animation content (and/or content mixing a variety of visual and sound sources such as videos, digital images, prerecorded or procedurally generated audio sources, etc.), in simultaneous and remote collaborative mode, using a variety of electronic and/or computer devices such as a smartphone, a tablet, a laptop or desktop computer, a headset, or any other virtual or augmented reality system, and broadcast that content through a variety of methods, such as posting a video of the content to a video distribution platform or streaming a dynamic live video stream (i.e. generated on the fly, or created dynamically on demand) from a server to any display device, interactive or not (virtual reality headset, smartphone or tablet screen, computer screen, etc.).
- the method according to the invention, in other words the collaborative unified pipeline (CUP), can be implemented on a computer as described below.
- a user U1 launches the client application on a desktop computer (also called a terminal) equipped with a CPU and a GPU.
- This computer is connected by a remote WAN type network to a server located in a data center or cloud platform, as shown in Figure 16.
- to identify himself on the server S (on which the server application is located), U1 must provide a user identifier (username) and a password, figure 7, validated by the server application.
- Creation/opening of a project: once connected, another screen, figure 8, is offered to U1, which allows him either to create a new content project or to open an existing project.
- the user U1 has the possibility of customizing the icon of the project P by dragging and dropping (drag-and-drop) an image stored on the local disk of the computer onto the icon of the project 23.
- the user U1 can click once on the icon of the project P for a preview of the content of the project in the form of a small video or teaser 24, for example of thirty seconds maximum, precomputed or computed on the fly. Double-clicking on the icon of the project P opens it.
- opening the project causes the data on the local disk of the terminal to be synchronized from the server, so that the local version of the project is up to date (identical in all respects to the version saved on the server).
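As a purely illustrative sketch of this synchronization step, the function below assumes the project is described by a per-item revision index on both sides; sync_on_open and fetch_blob are hypothetical names, and a real client would of course transfer the data over the network.

```python
# Minimal sketch of the synchronization performed when a project is opened:
# the terminal fetches only the data items whose server-side state is newer
# than its local copy.
def sync_on_open(local_index: dict, server_index: dict, fetch_blob) -> dict:
    """Bring the local revision index up to date with the server's."""
    for item_id, server_rev in server_index.items():
        if local_index.get(item_id) != server_rev:
            local_index[item_id] = server_rev
            fetch_blob(item_id, server_rev)   # download the missing data
    # Items deleted on the server disappear locally as well.
    for item_id in list(local_index):
        if item_id not in server_index:
            del local_index[item_id]
    return local_index

local = {"scene-1": 3}
server = {"scene-1": 4, "scene-2": 1}
print(sync_on_open(local, server, lambda i, r: None))
# -> {'scene-1': 4, 'scene-2': 1}: the local project is now identical
```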
- another screen, hereafter called the project editor, is then displayed, figure 11. It is divided vertically into two large spaces: on the left, a list of virtual 3D scenes 51, and on the right, a list of the sequences making up the film 52.
- Each space has an icon which makes it possible to create either a new scene 54 or a new sequence 57.
- the scenes and sequences are displayed as a series of cards 55 comprising an image of the scene or sequence (thumbnail or snapshot), chosen by the user U1 or picked at random, and the name of the scene or sequence, which can be edited.
- Scenes can be duplicated and deleted. Double-clicking on the card of a scene opens the scene for editing. Editing the animated scene: when the user U1 opens a scene, a new screen is presented to him, Figure 12.
- This screen offers a workspace from which the user can edit all aspects of a 3D virtual scene, such as importing 3D models or modeling them in place, importing or creating animations, importing cameras or creating new viewing angles from which the scene can be filmed (by placing virtual cameras in the 3D scene), and importing or creating light sources, videos, digital images, sounds, etc.
- the scene editor presents a time bar which allows the user of the method to move through the time of the scene.
- the content of the scene is visible through a kind of window opening onto the 3D scene, computed in real time on the fly, known as the viewport 71.
- a window appears on the screen comprising the list, presented in the form of cards, of all the cameras already created in the scene 82.
- Each card includes the name of the camera and a small image which represents a view of the scene taken from this camera. Double-clicking on one of these cards displays, in the viewport, the 3D scene filmed from the selected camera.
- the camera can be animated. To create a new camera, the user moves freely in the 3D scene, then, when a point of view is suitable, clicks on the 'new' button of the cameras window 82, which has the effect of 1) adding a card to the camera list, and 2) creating a new camera positioned in the 3D space of the scene at the desired location. Editing a sound and visual sequence: once the user has created one or more scenes, he can return to the project editor, figure 11.
- Sequence editor: by double-clicking on the card representing a sequence 58, the user opens a new screen hereinafter called the sequence editor, FIG. 13.
- This screen allows the user to edit the content of a sequence.
- This screen is separated into three main sections: - a 3D window, or viewport, which is the usual terminology for those skilled in the art, allows the user to see the result of his editing 91,
- A shot is created on the timeline by a drag-and-drop operation of a card representing a 3D scene (hereinafter SCN) from the section listing all the 3D scenes of the project onto the timeline.
- the shot thus created on the timeline has the same duration as the SCN scene. However, this duration can be adjusted as in any editing tool.
- This window includes the list of cameras and animations of the 3D SCN scene.
- the user just has to choose the camera and the version of the animation he wants to use for this shot by clicking on the camera and animation icons representing his choice.
- the viewport includes certain controls 99 which allow the user to play the sequence to check the result of his editing.
- Watching the sequence in virtual reality: by activating the virtual reality function 100 in the sequence editor, it is possible to watch the sequence not only on the computer screen but also on a virtual reality headset connected to the user's computer.
- Playing the film in its entirety: by activating the film viewing function, figure 14, from the project editor 53, the user has the possibility of playing the entire film, i.e. the sequences placed end to end.
- the sequences are played in the same order as the order in which they are arranged in the project editor 52.
- Creation of an augmented reality virtual camera: the user launches a satellite application, with reference to FIG. 19, on a smartphone equipped with the augmented reality function, or any other suitable device.
- the application connects to the server S to read the data of the project P 1207, 1208. The list of scenes then appears on the screen of the smartphone in the form of cards identical to those used in the project editor 1205.
- Via the touch screen of the smartphone, the user selects an SCN scene, the content of which he can then drop onto a surface of the real world to fix it there.
- the 3D virtual scene is then filmed by the phone as if it were part of the real world.
- the smartphone saves the information of the camera movement that has just been created on the server S, by adding a new animated camera to the SCN scene of the project.
- This camera movement is then available in project P in the client application from the scene editor or the sequence editor.
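A minimal sketch of this capture step is given below, assuming the satellite application can sample the device pose each frame and upload the resulting keyframes to the server S; CameraKeyframe, record_ar_camera and post_camera are hypothetical names, not part of the patent.

```python
# Illustrative sketch: record the smartphone's pose each frame and package
# it as a new animated camera for the SCN scene on the server S.
import json
import time
from dataclasses import dataclass, asdict

@dataclass
class CameraKeyframe:
    t: float               # timestamp in the shot, seconds
    position: tuple        # camera position in the real-world anchor frame
    rotation: tuple        # camera orientation (quaternion)

def record_ar_camera(sample_pose, duration_s: float, fps: int = 30) -> list:
    """Sample the AR device pose at a fixed rate while the user films."""
    keyframes, t0 = [], time.monotonic()
    while (t := time.monotonic() - t0) < duration_s:
        pos, rot = sample_pose()
        keyframes.append(CameraKeyframe(t, pos, rot))
        time.sleep(1 / fps)
    return keyframes

def post_camera(scene_id: str, keyframes: list) -> str:
    # Stand-in for the upload to the server S; a real client would POST this
    # payload so the camera appears in the scene and sequence editors.
    return json.dumps({"scene": scene_id,
                       "camera": [asdict(k) for k in keyframes]})

fake_pose = lambda: ((0.0, 1.5, 0.3), (0.0, 0.0, 0.0, 1.0))
print(post_camera("SCN", record_ar_camera(fake_pose, duration_s=0.1))[:80])
```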
- the user U1 can give the second user U2 access to the project P.
- the user U2 therefore has access to all the data of this project and can modify the content as desired.
- when user U1 or user U2 edits the content of this project, the changes are immediately reflected and visible on the screen of the other user.
- user U1 creates a new scene from the project editor. This scene appears as a new card in the interface of both users, although it was only user U1 who modified the project.
- the second user U2, who now also has access to this new scene, decides to open it to import a 3D model. The model is then visible and available both in the project of the second user U2, who has just imported it, and in the project of the first user U1, who, however, has done nothing.
- the first user U1 decides to change the color of this object. This color change is also applied to the model in the project of the second user U2. This principle applies to all aspects of the project, regardless of the number of users connected to the server S.
- the second user U2 can create a second version P' of the project, hereinafter called a branch 111. All the modifications carried out by the first user U1 in the project P will no longer be visible to the second user U2 as long as he is working on the project P'.
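The branching behaviour can be illustrated by the following toy sketch, in which a branch is simply a snapshot of the project data that then evolves independently; a real server would branch the full revision history rather than copy a dictionary, and the Project class is a hypothetical reduction.

```python
# Minimal sketch of project branching: P' starts as a copy of P, after
# which the two evolve independently, so U1's changes to P are no longer
# visible to U2 while U2 works on P'.
class Project:
    def __init__(self, data: dict):
        self.data = data

    def branch(self) -> "Project":
        # Shallow snapshot: a real server would branch the revision history.
        return Project(dict(self.data))

P = Project({"cube.color": "red"})
P_prime = P.branch()                 # U2 creates branch P'
P.data["cube.color"] = "blue"        # U1 keeps editing P
print(P_prime.data["cube.color"])    # -> 'red': P' is unaffected
```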
- Playing the film in augmented reality: a person launches a satellite application on a smartphone equipped with the augmented reality function. This person does not want to edit the content of a project but to watch it, as a spectator.
- This application connects to the server S and provides this viewer with a list of all the projects available. Using the touch screen of the smartphone, he selects a project, then selects, using the graphic interface provided by the augmented reality application, a real-world surface on which the computer-generated film will be played, such as the surface of a table.
- a streaming server is launched on the server S. It is an application that computes a finalized version of the project on as many processing units (GPU and CPU) as desired by the user, as a service parameter, then broadcasts/streams the result of this just-in-time computation to the user's tablet. The content of this incoming stream is then displayed on the tablet screen using the process launched in the previous step (a).
- the user can interact with the content of the film being watched. For example, it is possible to select an object on the screen using the mouse or the touch screen. The object is represented as a set of pixels on the screen, but the streaming application knows which 3D model or models those pixels represent. The selected 3D model can therefore appear on the screen surrounded by a contour (to indicate that it is selected), and a whole set of services associated with this object can then be presented to the user.
- the user can order a 3D print of the selected model.
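For illustration, the sketch below shows one plausible way to implement such pixel-to-object resolution: the renderer writes an object identifier per pixel alongside the colour image, and a click is resolved by reading that identifier back. The buffer layout, the object table and the pick function are hypothetical examples, not the patent's implementation.

```python
# Illustrative sketch of "intelligent pixels": alongside the colour image,
# the streaming application keeps a per-pixel object identifier, so a click
# at (x, y) can be resolved to the 3D model shown there.
WIDTH, HEIGHT = 8, 4

# One object id per pixel, produced by the renderer (0 = background).
id_buffer = [[0] * WIDTH for _ in range(HEIGHT)]
for y in range(1, 3):
    for x in range(2, 6):
        id_buffer[y][x] = 42          # pixels covered by model #42

objects = {42: {"name": "teapot", "services": ["3D print", "buy"]}}

def pick(x: int, y: int):
    """Return the 3D model (and its services) behind a clicked pixel."""
    oid = id_buffer[y][x]
    return objects.get(oid)           # None if the background was clicked

print(pick(3, 2))   # -> the teapot and the services attached to it
print(pick(0, 0))   # -> None
```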
- the broadcaster, that is to say the service in charge of broadcasting the content to the tablet, smartphone, etc. of one or more users of the service, can also modify the content of the project while it is being computed and broadcast.
- the modifications are then made not by the service user but by the service operator (the broadcaster). It is possible to create as many personalized versions as there are users connected to the service.
- the method also includes access rights management steps.
- the server includes user authentication steps.
- When a client application is launched from a terminal, this application first connects to the server, which implements the prior authentication step.
- This authentication step then issues a digital authentication token, which includes hashed authentication data, encrypted or encoded using any suitable technique known to those skilled in the art, and access rights data, which will have been previously defined in a server database.
- authorizations of the administrator type can be provided, giving the right to implement both production steps and management steps; an authorization of the producer type, giving for example rights to all of the production stages; and targeted authorizations, for example an animator-type authorization allowing access only for modification and creation at the animation stage of the produced content.
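Purely as an illustration of such a token, the sketch below issues an HMAC-signed token carrying hashed credentials and a list of rights, and checks a right before allowing an operation; HMAC/SHA-256 is one possible choice among the suitable techniques the text leaves open, and all names are hypothetical.

```python
# Minimal sketch of a digital authentication token carrying hashed
# authentication data and access-right data, signed with HMAC.
import hashlib
import hmac
import json

SERVER_SECRET = b"server-side secret"          # never leaves the server

def issue_token(username: str, password: str, rights: list) -> str:
    body = json.dumps({
        "user": username,
        "auth": hashlib.sha256(password.encode()).hexdigest(),  # hashed
        "rights": rights,              # e.g. ["producer"] or ["animator"]
    })
    sig = hmac.new(SERVER_SECRET, body.encode(), hashlib.sha256).hexdigest()
    return body + "." + sig

def check_token(token: str, needed_right: str) -> bool:
    body, sig = token.rsplit(".", 1)
    expected = hmac.new(SERVER_SECRET, body.encode(),
                        hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sig, expected):
        return False                   # tampered or forged token
    return needed_right in json.loads(body)["rights"]

t = issue_token("U1", "secret", ["animator"])
print(check_token(t, "animator"))      # -> True
print(check_token(t, "administrator")) # -> False
```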
- the data of the animation content produced are all stored on the central server.
- Asset-type animation content data is stored in the form of large binary objects, known as Binary Large Objects, generally abbreviated by the acronym BLOB.
- This stored data is organized in the form of data groups, known in the technical field under the name of Data Pool.
- the data storage mode is not limited to this storage and referencing mode. Any other server storage technique can be adapted to the invention.
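One possible concrete realisation, given here only as a sketch, stores each asset BLOB in a table keyed by data pool, asset identifier and revision; SQLite and the table layout are illustrative assumptions, since the text explicitly leaves the storage technique open.

```python
# Illustrative sketch of asset storage as BLOBs organised into data pools,
# backed here by SQLite purely as an example.
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE assets (
    pool TEXT,          -- the data pool the asset belongs to
    asset_id TEXT,      -- identifier referenced by scenes and sequences
    revision INTEGER,   -- revision number of this data item
    blob BLOB,          -- the binary payload (mesh, texture, audio...)
    PRIMARY KEY (pool, asset_id, revision))""")

db.execute("INSERT INTO assets VALUES (?, ?, ?, ?)",
           ("project-P/models", "teapot", 1, b"\x00\x01binary-mesh"))

row = db.execute("""SELECT blob FROM assets
                    WHERE pool=? AND asset_id=?
                    ORDER BY revision DESC LIMIT 1""",
                 ("project-P/models", "teapot")).fetchone()
print(row[0][:8])      # latest revision of the teapot BLOB
```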
- Each data item is associated with a state Rn on the server. This state is associated with modifications, so that the same data item was previously in a state Rn-1 which, following a recorded modification known as Cn, was brought to the state Rn.
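The following toy sketch illustrates this bookkeeping: applying a modification Cn moves a data item from state Rn-1 to state Rn while the pair (Rn-1, Cn) is appended to its history. The classes are hypothetical illustrations, not structures defined by the patent.

```python
# Minimal sketch of the per-item state chain Rn-1 --Cn--> Rn.
from dataclasses import dataclass, field

@dataclass
class Modification:            # a recorded modification Cn
    description: str

@dataclass
class DataItem:
    value: object
    revision: int = 0          # current state Rn
    history: list = field(default_factory=list)

    def apply(self, change: Modification, new_value: object) -> None:
        self.history.append((self.revision, change))  # remember Rn-1 and Cn
        self.revision += 1
        self.value = new_value

cube = DataItem(value={"color": "red"})
cube.apply(Modification("change color"), {"color": "blue"})
print(cube.revision, cube.value)   # -> 1 {'color': 'blue'}
```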
- the method according to the invention implements a step of managing editing and creation conflicts.
- This management step is subdivided into two sub-steps: a conflict detection step and a conflict resolution step.
- the conflict detection step is linked to the history step in that it detects which concomitant actions of the history act on similar data stored in the central server.
- This conflict resolution step aims to give priority to the latest modifications, creations or deletions.
- the conflict detection method detects that two concurrent states are mutually exclusive.
- the method then implements the conflict resolution step to determine which state the server should adopt: Rp, Rf or a different state.
- event p is recorded as prior to event f.
- By event p or f is meant the result of the implementation of a production step as described above.
- the method implements a step of determining the exclusion of the event p.
- the event p is excluded if it meets one of the following criteria:
- event p deletes an object which has been deleted, modified, added or referenced by the event f;
- event p adds an object which has been deleted, added or modified by event f;
- the event p modifies a property of an object which has been deleted by the event f;
- the event p modifies a unique property of an object which has also been modified by the event f;
- event p adds a reference to an object which was deleted by event f;
- the event p adds a reference to an object or a value for a property of an object which can have several values, which was added, deleted or changed by the event f;
- the event p deletes a reference to an object, or a value of an object property which can receive several values, that was added, deleted or changed by the event f;
- the event p moves a reference to an object, or a value of a property which can receive several values, that was added, deleted or moved within the same property by the event f.
- If event p falls into one of these cases, it is ignored, and the project is updated according to the last event f. Otherwise the event p is kept along with the event f.
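As an illustration, the predicate below condenses the criteria above into a single exclusion test over a deliberately simplified event model (one object and at most one property per event); the Event class and action names are hypothetical reductions of the patent's richer event types.

```python
# Illustrative sketch of the conflict-resolution rule: the earlier event p
# is discarded when it collides with the later event f according to the
# criteria listed above (simplified to object- and property-level checks).
from dataclasses import dataclass

@dataclass(frozen=True)
class Event:
    action: str       # 'delete' | 'add' | 'modify' | 'add_ref' | 'del_ref' | 'move_ref'
    obj: str          # identifier of the object acted upon
    prop: str = None  # property involved, if any

def excluded(p: Event, f: Event) -> bool:
    if p.obj != f.obj:
        return False                   # different objects: no conflict
    if p.action == "delete":
        return True                    # f touched the same object
    if p.action == "add" and f.action in ("delete", "add", "modify"):
        return True
    if p.action == "modify":
        if f.action == "delete":
            return True
        if f.action == "modify" and p.prop == f.prop:
            return True                # same unique property
    if p.action in ("add_ref", "del_ref", "move_ref"):
        return f.action in ("delete", "add_ref", "del_ref", "move_ref") \
               and (f.prop is None or f.prop == p.prop)
    return False

p = Event("modify", "cube", "color")
f = Event("modify", "cube", "color")
print(excluded(p, f))   # -> True: p is ignored, f wins
```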
- the terminals then receive from the server an indication of updating the project, synchronizing the local data on the terminals with the state of the server according to the resolution of the conflict.
- the invention also relates to a computer system, as shown in FIG. 16, comprising a server 1105 and one or more terminals 1100.
- a terminal 1100 includes a computer device composed of display systems, processing units of the CPU and GPU type or the like, and storage capacity to locally save a version of the project P 1102.
- the client application 1101 according to the invention, which allows the content of this project to be edited, is executed on this device.
- the terminal is connected to the server S 1105 by a local or remote network of LAN or WAN type 1106.
- In the case of a remote connection of WAN type, the server is said to be in the cloud.
- the server S is itself composed of processing units (of CPU and GPU type) and of storage capacity 1103 used to save on the server a version of the project P.
- the server application described in this invention is executed on this server 1104.
- Several terminals (user 1, user 2, ..., user N) are connected to the server via the network.
- Figure 17 shows schematically the way in which a plurality of different content projects is synchronized with a plurality of different terminals.
- a modification made to the project P by a user on the terminal TA is first saved locally 1', then sent to the server 1.
- the server saves this change in its own version of the project 2.
- the server distributes the modification to all terminals except the one from which it came 3, and these apply it in turn and save it locally 3'.
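The numbered steps of figure 17 can be sketched as follows, with hypothetical Terminal and Server classes standing in for the client and server applications; the comments map each statement to the labels 1, 1', 2, 3 and 3' used above.

```python
# Minimal sketch of the synchronization of figure 17: a change made on
# terminal TA is saved locally (1'), sent to the server (1), saved there (2),
# then rebroadcast to every other terminal (3), which saves it locally (3').
class Terminal:
    def __init__(self, name: str, server: "Server"):
        self.name, self.server, self.project = name, server, {}
        server.terminals.append(self)

    def edit(self, key: str, value) -> None:
        self.project[key] = value              # 1': save locally
        self.server.receive(self, key, value)  # 1: send to the server

    def apply_remote(self, key: str, value) -> None:
        self.project[key] = value              # 3': apply and save locally

class Server:
    def __init__(self):
        self.project, self.terminals = {}, []

    def receive(self, origin: Terminal, key: str, value) -> None:
        self.project[key] = value              # 2: save server-side
        for t in self.terminals:
            if t is not origin:                # 3: everyone but the origin
                t.apply_remote(key, value)

S = Server()
TA, TB = Terminal("TA", S), Terminal("TB", S)
TA.edit("cube.color", "blue")
print(TB.project)   # -> {'cube.color': 'blue'}: TB sees TA's change
```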
- FIG. 18 is a representation of the module by which the content of a project can be computed on the fly and streamed just-in-time to as many display devices (interactive or not) as necessary.
- a virtual reality headset 1110 and its associated controllers, a tablet or a smartphone equipped with augmented reality functions and a touch screen 1111, and a computer with a keyboard and a mouse 1112.
- the information generated by these various devices (such as the position of the virtual reality headset or the smartphone in the 3D space of the real world) is sent to the server to which they are connected 1113.
- These servers are different from the project server S: they are so-called streaming servers 1115.
- LAN: local area network.
- each streaming server is equipped with its own CPU and GPU processing units to compute a single video and audio stream 1114 that responds to the inputs from the viewing system. Each stream is therefore potentially unique.
- FIG. 19 represents the module allowing a system equipped with augmented reality functions, such as a smartphone or a tablet, to connect to the server S by means of a software solution executed on this system in order to access the project data, such as, in this case, the 3D scenes of the project P.
- in step 1, the 3D scenes are displayed on the screen of the smartphone in the form of cards 1205.
- in step 2, the application then allows the user to play and film these 3D scenes in augmented reality.
- in step 3, once the scene has been filmed, all the data captured by the augmented reality system, such as the movement of the camera or the video, are then saved on the server S 1105.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
FR1856631A FR3084190B1 (fr) | 2018-07-18 | 2018-07-18 | Procede mis en œuvre par ordinateur pour la creation de contenus comprenant des images de synthese |
PCT/FR2019/051796 WO2020016526A1 (fr) | 2018-07-18 | 2019-07-17 | Procédé mis en oeuvre par ordinateur pour la création de contenus comprenant des images de synthèse |
Publications (1)
Publication Number | Publication Date |
---|---|
EP3824440A1 true EP3824440A1 (fr) | 2021-05-26 |
Family
ID=63579458
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP19759643.0A Withdrawn EP3824440A1 (fr) | 2018-07-18 | 2019-07-17 | Procédé mis en oeuvre par ordinateur pour la création de contenus comprenant des images de synthèse |
Country Status (6)
Country | Link |
---|---|
US (1) | US20210264686A1 (fr) |
EP (1) | EP3824440A1 (fr) |
CN (1) | CN112449707A (fr) |
CA (1) | CA3102192A1 (fr) |
FR (1) | FR3084190B1 (fr) |
WO (1) | WO2020016526A1 (fr) |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2020181152A1 (fr) * | 2019-03-05 | 2020-09-10 | Farrokh Shokooh | Gestion et modélisation de projet de réseau de services publics |
US20220165024A1 (en) * | 2020-11-24 | 2022-05-26 | At&T Intellectual Property I, L.P. | Transforming static two-dimensional images into immersive computer-generated content |
US11620797B2 (en) * | 2021-08-05 | 2023-04-04 | Bank Of America Corporation | Electronic user interface with augmented detail display for resource location |
CN115314499B (zh) * | 2022-10-10 | 2023-01-24 | 国网浙江省电力有限公司嵊州市供电公司 | 适用于电力领域的多终端协同工作方法及系统 |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10872322B2 (en) * | 2008-03-21 | 2020-12-22 | Dressbot, Inc. | System and method for collaborative shopping, business and entertainment |
CN102332174B (zh) * | 2011-09-06 | 2013-10-16 | 中国科学院软件研究所 | 一种协同草图动画生成方法和系统 |
CN102866886B (zh) * | 2012-09-04 | 2015-04-29 | 北京航空航天大学 | 一种基于Web的算法动画可视化开发系统 |
US20160171740A1 (en) * | 2014-12-15 | 2016-06-16 | Calay Venture S.à r.l. | Real-time method for collaborative animation |
- 2018-07-18: FR FR1856631A patent/FR3084190B1/fr not_active Expired - Fee Related
- 2019-07-17: US US17/255,551 patent/US20210264686A1/en not_active Abandoned
- 2019-07-17: WO PCT/FR2019/051796 patent/WO2020016526A1/fr active Application Filing
- 2019-07-17: EP EP19759643.0A patent/EP3824440A1/fr not_active Withdrawn
- 2019-07-17: CA CA3102192A patent/CA3102192A1/fr not_active Abandoned
- 2019-07-17: CN CN201980048030.0A patent/CN112449707A/zh active Pending
Also Published As
Publication number | Publication date |
---|---|
CN112449707A (zh) | 2021-03-05 |
WO2020016526A4 (fr) | 2020-03-19 |
FR3084190B1 (fr) | 2020-07-10 |
FR3084190A1 (fr) | 2020-01-24 |
CA3102192A1 (fr) | 2020-01-23 |
US20210264686A1 (en) | 2021-08-26 |
WO2020016526A1 (fr) | 2020-01-23 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
 | STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: UNKNOWN |
 | STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE |
 | PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase | Free format text: ORIGINAL CODE: 0009012 |
 | STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE |
 | 17P | Request for examination filed | Effective date: 20201224 |
 | AK | Designated contracting states | Kind code of ref document: A1; Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
 | DAV | Request for validation of the european patent (deleted) | |
 | DAX | Request for extension of the european patent (deleted) | |
 | STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN |
 | 18D | Application deemed to be withdrawn | Effective date: 20230201 |