WO2018071562A1 - Virtual/augmented reality content management system - Google Patents

Virtual/augmented reality content management system

Info

Publication number
WO2018071562A1
Authority
WO
WIPO (PCT)
Prior art keywords
media
content
hotspot
spatiotemporal
media content
Application number
PCT/US2017/056173
Other languages
English (en)
Inventor
Marko MUNDA
Jernej MIRT
Blaz ZAFOSNIK
Original Assignee
Viar Inc
Application filed by Viar Inc
Publication of WO2018071562A1

Classifications

    • G: PHYSICS
      • G06: COMPUTING; CALCULATING OR COUNTING
        • G06F: ELECTRIC DIGITAL DATA PROCESSING
          • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
            • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
              • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
                • G06F 3/0484: Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
                  • G06F 3/04842: Selection of displayed objects or displayed text elements
                  • G06F 3/04845: Interaction techniques for image manipulation, e.g. dragging, rotation, expansion or change of colour
        • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
          • G06T 19/00: Manipulating 3D models or images for computer graphics
            • G06T 19/006: Mixed reality
      • G11: INFORMATION STORAGE
        • G11B: INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
          • G11B 27/00: Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
            • G11B 27/02: Editing, e.g. varying the order of information signals recorded on, or reproduced from, record carriers
              • G11B 27/031: Electronic editing of digitised analogue information signals, e.g. audio or video signals
            • G11B 27/10: Indexing; Addressing; Timing or synchronising; Measuring tape travel
              • G11B 27/34: Indicating arrangements
    • H: ELECTRICITY
      • H04: ELECTRIC COMMUNICATION TECHNIQUE
        • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
          • H04N 21/00: Selective content distribution, e.g. interactive television or video on demand [VOD]
            • H04N 21/80: Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
              • H04N 21/81: Monomedia components thereof
                • H04N 21/816: Monomedia components involving special video data, e.g. 3D video
              • H04N 21/85: Assembly of content; Generation of multimedia applications
                • H04N 21/854: Content authoring
                • H04N 21/858: Linking data to content, e.g. by linking an URL to a video object, by creating a hotspot
                  • H04N 21/8586: Linking data to content by using a URL

Definitions

  • Three-hundred-sixty-degree (360-degree) photos and videos are an emerging format gaining popularity due to its ability to give users an omnidirectional view of the media, creating an immersive and interactive experience.
  • The 360-degree photos and videos are created using a combination of multiple panoramic images, photospheric images, or videos captured using an omnidirectional camera. These images are then stitched together to form a contiguous spherical or semispherical image orthogonally positioned to an interactive viewport within a user interface.
  • Although 360-degree photos and videos are able to provide an immersive and interactive experience, they suffer from several shortcomings hindering adoption.
  • One of these shortcomings is a technical barrier.
  • In order to offer the level of interactivity found in 360-degree videos, some coding and a degree of familiarity with the software are required to install and run it. This may be difficult, as some users may lack the background or familiarity to deploy the technology. Due to this shortcoming, many 360-degree projects employ additional developers, take existing developers off of other projects, or require staff to take time away from their projects to deploy this technology.
  • 360-degree videos have six sides (up, down, left, right, back, front). This complexity is compounded by the fact that an area of interest may be dependent on a particular time point; thus both the timeframe (seconds) and the location (x, y, z coordinates) are needed to identify the spatiotemporal location. Due to these issues, referencing a specific point of interest within a 360-degree video is difficult.
  • Custom coded mobile VR applications utilize 360-degree videos that may be larger than 1 GB in size. Although storage may be allocated for the 360-degree video, a viewer may be required to download and install additional applications in order to view or interact with each video file they download. As a result, storage of 360-degree video may be an issue for users with limited storage availability.
  • Another issue arises in anticipating a user's point of interest when watching a 360-degree video, because VR viewers have more freedom in what to look at and which storyline to follow. As a consequence, it becomes difficult to determine a user's point of interest while the video is playing.
  • FIG. 1 illustrates a system environment 100 for operating a virtual/augmented reality content management system.
  • FIG. 2 illustrates an embodiment of a virtual/augmented reality content management system 200.
  • FIG. 3 illustrates an embodiment of a meshed content player 300.
  • FIG. 4 illustrates an embodiment of a method 400 of operating a virtual/augmented reality content management system for generating a 360° media project presentation.
  • FIG. 5 illustrates an embodiment of a method 500 of operating a virtual/augmented reality content management system.
  • FIG. 6 illustrates an embodiment of a method 600 of operating a virtual/augmented reality content management system for generating a 360° media project presentation.
  • FIG. 7 illustrates an embodiment of a method 700 for operating a loader.
  • FIG. 8 illustrates an embodiment of method 800 for operating an editor.
  • FIG. 9 illustrates an embodiment of a method 900 of operating an editor.
  • FIG. 10 illustrates an embodiment of a method 1000 of operating an editor.
  • FIG. 11 illustrates an embodiment of a project editor interface 1100.
  • FIG. 12 illustrates an embodiment of a media uploader interface 1200.
  • FIG. 13 illustrates an embodiment of the media uploader interface 1200 displaying the progress of the media content manipulation.
  • FIG. 14 illustrates an embodiment of a story editor interface 1400.
  • FIG. 15 illustrates an embodiment of a storyboard interface 1500.
  • FIG. 16 illustrates an embodiment of a storyboard interface 1600.
  • FIG. 17 illustrates an embodiment of a meshed content player 1700.
  • FIG. 18 illustrates an embodiment of a meshed content player 1800.
  • FIG. 19 illustrates an embodiment of a meshed content player 1900.
  • FIG. 20 illustrates a meshed content player 2000 in accordance with one embodiment.
  • FIG. 21 illustrates an embodiment of a project sharing interface 2100.
  • FIG. 22 is an example block diagram of a computing device 2200 that may incorporate embodiments of the present invention.
  • 360° media content in this context refers to image or video content that is an equirectangular projection of a 360° image or video.
  • 360° media project presentation in this context refers to a collection of photosphere images and/or videos, displayed through an interactive interface and organized as scenes in a presentation, where progression between scenes is controlled by user interactions.
  • 360° media slices in this context refers to sectional images or regions of a 360° media (image or video) projection utilized in the assembly of photosphere image or video.
  • activation triggers in this context refers to inputs or combinations of inputs occurring in a sequence and/or frequency triggering an action signal.
  • Content player in this context refers to logic or a combination of logic for assembling slices from an equirectangular projection into a photospherical image or video in a virtualized environment around a rotatable viewport.
  • Correlator in this context refers to a logic element that identifies a configured association between its inputs.
  • a correlator is a lookup table (LUT) configured in software or firmware.
  • Correlators may be implemented as relational databases.
  • An example LUT correlator maps input conditions, such as a safe_condition, to corresponding output signals.
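  • A minimal sketch of such a LUT correlator, written in TypeScript (a language chosen for this example, not named by the patent); apart from safe_condition, the condition and output names are hypothetical:

```typescript
// Illustrative LUT correlator. Apart from "safe_condition", all
// condition and output names are hypothetical examples.
type Condition = "safe_condition" | "warn_condition" | "alarm_condition";

const lut: Record<Condition, Record<Condition, string>> = {
  safe_condition:  { safe_condition: "no_alarm",  warn_condition: "low_alert", alarm_condition: "alert" },
  warn_condition:  { safe_condition: "low_alert", warn_condition: "alert",     alarm_condition: "alarm" },
  alarm_condition: { safe_condition: "alert",     warn_condition: "alarm",     alarm_condition: "alarm" },
};

// A correlator receives two or more inputs and produces an output
// indicative of a configured relationship between them.
function correlate(a: Condition, b: Condition): string {
  return lut[a][b];
}

console.log(correlate("safe_condition", "safe_condition")); // "no_alarm"
```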
  • a correlator receives two or more inputs and produces an output indicative of a mutual relationship or connection between the inputs.
  • Examples of correlators that do not use LUTs include any of a broad class of statistical correlators that identify dependence between input variables, often the extent to which two input variables have a linear relationship with each other.
  • One commonly used statistical correlator is one that computes Pearson's product-moment coefficient for two input variables (e.g., two digital or analog input signals).
  • Other well-known correlators compute a distance correlation, Spearman's rank correlation, a randomized dependence correlation, and Kendall's rank correlation. Many other examples of correlators will be evident to those of skill in the art, without undue experimentation.
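  • For concreteness, a sketch of a statistical correlator computing Pearson's product-moment coefficient for two equal-length input signals (illustrative TypeScript, not part of the patent):

```typescript
// Pearson's product-moment coefficient for two equal-length signals.
// Returns a value in [-1, 1]; 1 indicates a perfect linear relationship.
function pearson(x: number[], y: number[]): number {
  const n = x.length;
  const meanX = x.reduce((s, v) => s + v, 0) / n;
  const meanY = y.reduce((s, v) => s + v, 0) / n;
  let cov = 0, varX = 0, varY = 0;
  for (let i = 0; i < n; i++) {
    const dx = x[i] - meanX;
    const dy = y[i] - meanY;
    cov += dx * dy;
    varX += dx * dx;
    varY += dy * dy;
  }
  return cov / Math.sqrt(varX * varY);
}

console.log(pearson([1, 2, 3, 4], [2, 4, 6, 8])); // 1
```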
  • Editor in this context refers to logic or a combination of logic for creating and manipulating stored values and associations between stored data in response to defined event signals received through a user interface.
  • Engine in this context refers to logic or a collection of logic modules working together to perform fixed operations on a set of inputs to generate a defined output. For example: engine.logic { get.data(), process.data(), store.data() }: get.data(input1) -> data.input1; process.data(data.input1) -> formatted.data1; store.data(formatted.data1).
  • a characteristic of some logic engines is the use of metadata that provides models of the real data that the engine processes, logic modules pass data to the engine, and the engine uses its metadata models to transform the data into a different state.
  • Loader in this context refers to logic or a combination of logic for retrieving information and initiating operations for starting other processes. For example, in an operating system, the loader places executable instructions into memory and prepares them for execution. The loader reads the contents of the executable file containing the program instructions into memory and then carries out other required preparatory tasks to prepare the executable for running.
  • Transcoder in this context refers to logic to transform data from one format to another. Examples of transformations carried out by a transcoder include mathematical transformations, scaling operations, offset processing, data value mapping, and/or structure member mapping.
  • a process of operating a virtual/augmented reality content management system involves displaying a project management window including a project editor interface, a media content uploader interface, a storyboard interface, and a project sharing interface.
  • the project editor interface enables editing of the title and the description of the 360° media project.
  • a 360° media project is loaded through the project management window.
  • the 360° media project may be a previously stored 360° media project or a new project, depending on the selection input received.
  • a new project may be generated in a controlled memory data structure. Alternatively, the selection input may retrieve an existing project in the controlled memory data structure.
  • When media content is uploaded through the media content uploader interface, a transcoder is operated to generate a low-level version of the media content, and slices from both the low-level version and the original version of the media content.
  • the media content, the low-level version of the media content, and the slices of the low-level version and original version of the media content are stored as uploaded media content in a 360° media project allocation of memory in the controlled memory data structure.
  • the storyboard interface may be operated to display a project map of the 360° media project including media tile icons corresponding to uploaded media content in the 360° media project allocation of memory.
  • the media tile icons displayed on the project map comprise a plurality of media tile icons that may be organized hierarchically to show the progression of a 360° media project presentation between pieces of the 360° media content.
  • the storyboard interface may receive a selection input to connect an exit node of a media tile icon to another media tile icon, creating a start node and storing the connector linking the pair of uploaded media in the 360° media project allocation of memory.
  • A loader may detect a selection input for a media tile icon through the storyboard interface, launching a meshed content player for the associated 360° media content.
  • the loader may be operated to configure a content player with display parameters, from the user interface, to select the slices of the low-level version or the original version of the uploaded media, and configure the content player to generate a content stream by geometrically positioning the slices orthogonal to a rotatable camera and assigning a coordinate system for vectors of the rotatable camera.
  • The loader may be operated to control a correlator to map spatiotemporal hotspot locations from the media project allocation of memory to the coordinate system.
  • The loader may be operated to control a meshed content player to overlay mapped spatiotemporal hotspot locations over the content stream, such that the mapped spatiotemporal hotspot locations are positioned between the rotatable camera and the content stream, generating an interactive interface layer for the mapped spatiotemporal hotspot locations and a media content layer for the content stream.
  • the loader may also be configured to display the meshed content player through a media display window, and to load the uploaded media content associated with the exit node of the connector. The loader may perform this action in response to detecting a selection input for the mapped spatiotemporal hotspot locations associated with the start node.
  • the meshed content player may control the orientation of the rotatable camera in the content stream in response to receiving selection inputs through the user interface. Operation of the loader may include launching an editor to create and modify the spatiotemporal hotspot locations in the interactive interface layer.
  • the system may utilize browser and cloud technology to enable storytellers to create virtual reality experiences with no coding involved.
  • the output is device and operating system agnostic.
  • users upload their content they can place interactive elements (hotspots) directly on the canvas or connect different media files into a branched storyline.
  • the viewer can control how the story unfolds.
  • Finished stories can be delivered through traditional digital marketing channels.
  • the analytical components of the virtual/augmented reality content management system platform measures how the stories are being viewed and how engaging they are.
  • the system thus provides a streamlined cinematic storytelling tool for creating interactive virtual reality (VR) and augmented reality without any coding experience.
  • A user may start by creating a project, for which the user may add a title and description. After creating the project, the user drags and drops media files such as 360° images and 360° videos. After the user has finished uploading all their files, they may proceed to the processing steps.
  • the processing steps take time as images and videos are copied and transcoded onto a server. Due to the size of some videos and images used in a project, processing may be prolonged, but the virtual/augmented reality content management system incorporates a notification system to contact the user when the processing of the images and videos has completed.
  • A user may utilize the virtual/augmented reality content management system's storyboard editor.
  • the storyboard editor offers a user interface to rearrange the uploaded media elements of a story.
  • the user selects the starting media story, then selects an edit mode by clicking an edit button at the bottom center position of the preferred media file.
  • the virtual/augmented reality content management system launches VR media.
  • After selecting a starting media, the user is able to make the content interactive. By double clicking/tapping anywhere on the screen, the user is able to add point of view (POV) hotspots, info hotspots, or transitional hotspots inside the media file.
  • Hotspots are effective for creating virtual tours, branched VR stories, and other unique media presentations.
  • a user returns to the storyboard editor and selects a white box on the right side of a media file and drags it into an "entry" box of a different media file. This action creates a link joining the two media files.
  • the user may click a refresh button on the top right side to save a previous action.
  • the user may also create the link between two media files through the editor mode.
  • the user may select a drop down to choose the filename of the media file they would like to connect.
  • video media files can be looped and autoplayed if desired. Looping videos replay once they reach the end.
  • the story may be previewed by clicking a preview button found in the center of the starting media.
  • Users can share the story through social media channels or email using a universal resource locator (URL), or by simply embedding the media into their website using a provided iframe code.
  • the virtual/augmented reality content management system supports panorama images as well as 360° photosphere images.
  • The virtual/augmented reality content management system manipulates images according to their type.
  • For panoramic images, the system revolves the image around the viewport (rotatable camera) and meshes the lateral ends to form a cylindrical image.
  • An image or color scheme may be provided as a background for the panoramic image.
  • For photospherical images, the system warps the image into a spherical shape around the viewport (rotatable camera).
  • The virtual/augmented reality content management system differentiates between panoramic images and photospherical images by analyzing the height and width of the image. Unlike panoramic images, photospherical images are typically found in a 2:1 (width/height) proportion. As such, the system identifies images meeting the 2:1 proportion as photospherical images, while images having a different proportion are regarded as panoramic images.
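  • A minimal sketch of this width/height heuristic (illustrative TypeScript; the tolerance is an assumption, since real-world files may deviate slightly from exactly 2:1):

```typescript
type ImageKind = "photospherical" | "panoramic";

// Photospheres are typically 2:1 (width:height); anything else is
// treated as a panorama. The tolerance value is an assumption.
function classifyImage(width: number, height: number, tolerance = 0.01): ImageKind {
  return Math.abs(width / height - 2) <= tolerance ? "photospherical" : "panoramic";
}

console.log(classifyImage(4096, 2048)); // "photospherical"
console.log(classifyImage(6000, 1500)); // "panoramic"
```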
  • the collaboration feature allows sharing and editing of the interactive interface layer of the 360° video project.
  • the collaboration feature allows reviewers to access the 360° media project through a URL and add comments to spatiotemporal locations in the interactive interface layer.
  • the collaboration feature operates similarly to the editor mode with the exception that the project owner may limit the editing capabilities of the reviewers from making unwanted changes to the project while still being able to comment and reply to existing comments.
  • The project owner may be provided with a robust tool set to control editing permissions for different users or groups working on a project.
  • the project owner may set permissions allowing a user, or group of users, access to specific media elements of the project, add or modify a limited set of features, and/or prevent modification to edits made by specific user or groups of users.
  • The virtual/augmented reality content management system may be implemented as a white label publishing platform. Some professional content producers with an interest in creating interactive virtual reality content may lack the familiarity or technical background to create and maintain the required code.
  • users are able to add their branding and use their domains (such as WordPress or Squarespace for web pages) to deliver their virtual reality content. Because the virtual reality content is made available through a browser, the created content is made more accessible to a target audience. Because the virtual/augmented reality content management system may operate as an end-to-end product, analytics are included to help determine the performance of a particular VR story.
  • the virtual/augmented reality content management system offers an easy to use drag- and-drop interface allowing film-makers and photographers to collaborate with their clients to create VR stories with interactive elements that can be easily published to websites, email or social platforms. Once published, the virtual/augmented reality content management system tracks analytics around usage, user behavior and return on investment statistics.
  • the virtual/augmented reality content management system allows film-makers and photographers to differentiate their services and easily add virtual reality production
  • the virtual/augmented reality content management system eliminates the need for any external help.
  • the 360° video content created using the content management system may include features such as optional looping of the video, active pre-screen load, and higher quality of the video with reduced emphasis on peripheral distortion effects.
  • the interactive features found in content created using the content management system may include fully functional transitional hotspots, fully functional media (info) hotspots, and fully functional starting view (POV) hotspots.
  • Content created using the content management system may implement a VR mode that includes gaze-based control to allow the activation of buttons by looking at them, and an inter-pupillary distance (IPD) controller to allow a viewer to change the IPD in VR mode.
  • the virtual/augmented reality content management system may include a collaboration loop to allow clients to add a hotspot and write a comment that the creator can see and resolve.
  • the virtual/augmented reality content management system includes analytics to determine the number of views for a particular story, which may be defined as the number of instances the viewer activated at least one hotspot and was present in the experience for at least 30 seconds.
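  • A minimal sketch of this view definition (illustrative TypeScript; the session record shape is an assumption):

```typescript
// A session counts as a "view" when the viewer activated at least one
// hotspot and stayed in the experience for at least 30 seconds.
interface Session {
  hotspotActivations: number;
  durationSeconds: number;
}

function countViews(sessions: Session[]): number {
  return sessions.filter(
    (s) => s.hotspotActivations >= 1 && s.durationSeconds >= 30
  ).length;
}
```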
  • the virtual/augmented reality content management system may create a "WordPress-like plugins" marketplace where content creators can publish and develop add-ons for virtual reality storytelling such as plugins, that extend and expand the functionality of virtual/augmented reality content management system.
  • One example may be a radar plugin that allows users to suggest to the viewer where to look during a particular time in the story.
  • While the plugin could be developed internally, creating a marketplace for creative ideas may generate useful and unique functionality for the platform.
  • the virtual/augmented reality content management system may provide full 3D support through utilization of stereoscopic video.
  • the virtual/augmented reality content management system may provide full virtual reality support through the addition of cinematic and CGI elements.
  • the virtual/augmented reality content management system may include the ability to create virtual worlds like in Unity.
  • A system environment 100 comprises a user interface 108 communicably coupled to a server 104 through a network 102.
  • the server 104 comprises memory 106.
  • a virtual/augmented reality content management system as further illustrated by example in the following drawings may utilize this general system architecture.
  • a virtual/augmented reality content management system 200 comprises a user interface 206, a controlled memory data structure 212, a transcoder 214, a loader 246, a content player 230, a correlator 264, an editor 272, a project management window 232, analytics engine 258, and a media display window 254.
  • the user interface 206 displays the project management window 232 and the media display window 254.
  • the project management window 232 comprises a project editor interface 250, a media content uploader interface 248, a story editor interface 208, and a project sharing interface 252.
  • the story editor interface 208 comprises a project map 268 displaying a plurality of media tile icons 226 comprising a start node 274 and at least one exit node 234.
  • Tile connectors 238 directionally link at least one exit node 234 of one media tile icons 226 to the start node 274 of another media tile icons 226.
  • The project sharing interface 252 provides options for sharing the 360° media project presentation with others.
  • the media display window 254 comprises a meshed content player 210 comprising a meshed content editor interface 218, an interactive interface layer 222, and an environment content layer 220.
  • the interactive interface layer 222 displays mapped spatiotemporal hotspot locations 224.
  • the controlled memory data structure 212 comprises a media project allocation of memory 256.
  • The media project allocation of memory 256 comprises an interface layer content allocation of memory 216, spatiotemporal hotspot locations 244, and uploaded media 240 comprising media content slices 236.
  • the operation of the virtual/augmented reality content management system 200 involves the user interface 206 displaying a project management window 232 to work on creating or modifying a 360° media project presentation.
  • the project management window 232 displays a project editor interface 250 that allows a user to modify the name and description of the 360° media project presentation.
  • the user may add additional 360° media content to the 360° media project presentation through the media content uploader interface 248.
  • the media content uploader interface 248 receives 360° media content as image content 204 and video content 202.
  • a user can select files they want to upload through a list/drop down menu or may add the files utilizing a drag and drop method through the user interface 206.
  • the media content uploader interface 248 communicates the image content 204 and the video content 202 to the transcoder 214.
  • the transcoder 214 receives the 360° media content from the media content uploader interface 248 and modifies the image content 204 and the video content 202 to be assembled as 360° media content stream 242 displayed through the environment content layer 220 of a meshed content player 210.
  • The transcoder 214 takes the image content 204 and the video content 202 and slices the content to generate 360° media slices 270 that will be reassembled into the 360° media content stream 242.
  • the transcoder 214 stores media content slices 236 in the uploaded media 240 of the media project allocation of memory 256.
  • the transcoder 214 may additionally generate lower resolution slices 266 from the image content 204 and the video content 202.
  • the lower resolution slices 266 are stored within the uploaded media 240 of the media project allocation of memory 256 and may be utilized by the content player 230 to build the 360° media content stream 242 depending on the display parameters 262 of the user interface 206 displaying the media display window 254.
  • the project management window 232 displays a story editor interface 208 with media tile icons 226 through a project map 268.
  • the media tile icons 226 correspond to 360° media content pieces uploaded through the transcoder 214.
  • The project map 268 operates as a storyboard where the media tile icons 226 display relational attributes of a 360° media project presentation through tile connectors 238.
  • The tile connectors 238 provide insight into the progression from one media tile icon to another during a 360° media project presentation.
  • The at least one exit node 234 corresponds to the transitional spatiotemporal hotspot locations activated during playback of a 360° media project presentation through a meshed content player 210, while the 360° media content stream 242 generated following the activation of the transitional spatiotemporal hotspot location corresponds to the connected media tile icon with the start node 274 connected by the tile connector.
  • The tile connectors 238 may be directionally biased, limiting the transition from one media content stream to another, or may be configured to be bidirectional. Bidirectional movement between the current media content stream and the previous media content stream may be permitted with a navigation button allowing a user to return to the previous screen.
  • the current media content stream may be a 360° video that includes an automatic transitional hotspot that may load the previous media content stream once the 360° video completes its playback.
  • Each of the media tile icons 226 displays a thumbnail/preview image of the associated 360° media content.
  • The media tile icons 226 may comprise a preview content button, an edit content button, and a delete tile button.
  • the selection of the preview content button launches the 360° media content in the meshed content player 210 in a preview mode, allowing viewing of the media as a ready to publish project.
  • the selection of the edit content button launches the 360° media content in an edit mode through the meshed content player 210.
  • the edit mode allows for the creation and modification of spatiotemporal hotspot locations within the interactive interface layer 222 through user interactions within a meshed content editor interface 218.
  • The selection of the delete tile button removes the corresponding media tile icon from the project map 268.
  • the selection of the preview content button or the edit content button triggers activation of the loader 246.
  • the loader 246 detects input controls from the story editor interface 208 and the interactive interface layer 222 to generate a 360° media content stream 242. From the story editor interface 208, the loader 246 detects an input control for the selection of the preview content button or the edit content button and begins control of the content player 230. The input control identifies the 360° media content associated with the preview content button and the edit content button of the media tile icon in the story editor interface 208. In the interactive interface layer 222, the loader 246 detects the input control from selection of a transitional spatiotemporal hotspot location. The loader 246 operates the content player 230 to identify the media content slices 236 associated with the media tile icon in the uploaded media 240 of the media project allocation of memory 256.
  • the content player 230 utilizes the media content slices 236 for the selected 360° media content to generate a 360° media content stream 242 by geometrically positioning the 360° media slices orthogonal to a virtual rotatable viewport and assigning a coordinate system for vectors of the virtual rotatable viewport.
  • The content player 230 meshes the edges of the 360° media slices 270 and adds curvature, allowing the vectors of the virtual rotatable viewport to be equidistant and forming a spherical projection.
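  • One way to realize such a spherical projection in a browser is with a WebGL library such as three.js; the patent does not name an implementation, so the following is a hedged sketch with a placeholder media URL:

```typescript
import * as THREE from "three";

// Equirectangular media projected onto an inverted sphere around a
// rotatable camera placed at the sphere's center.
const scene = new THREE.Scene();
const camera = new THREE.PerspectiveCamera(75, window.innerWidth / window.innerHeight, 0.1, 1000);
camera.position.set(0, 0, 0);

const geometry = new THREE.SphereGeometry(500, 60, 40);
geometry.scale(-1, 1, 1); // invert the sphere so its inside faces the camera

const texture = new THREE.TextureLoader().load("media/photosphere.jpg"); // placeholder URL
scene.add(new THREE.Mesh(geometry, new THREE.MeshBasicMaterial({ map: texture })));

const renderer = new THREE.WebGLRenderer();
renderer.setSize(window.innerWidth, window.innerHeight);
document.body.appendChild(renderer.domElement);

function animate(): void {
  requestAnimationFrame(animate);
  renderer.render(scene, camera); // camera orientation is driven by user input
}
animate();
```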
  • the content player 230 may receive display parameters 262 from the user interface 206.
  • The display parameters 262 provide information regarding parameters affecting how content is being displayed through the user interface. This information may include, but is not limited to, aspect ratio, pixel density, refresh rate, connection speed (bandwidth), processing speed, memory timing, etc., of the display, display window, and/or computing device controlling the user interface.
  • the content player 230 may utilize the display parameters 262 to determine selection of the original resolution 360° media slices 270 or the lower resolution slices 266 to generate the 360° media content stream 242.
  • The content player 230 may be configured with a minimum threshold value for the aspect ratio and pixel density, where if the minimum values are not met, the content player 230 defaults to the lower resolution slices 266 to generate the 360° media content stream 242. Similarly, the content player 230 may have a minimum threshold value for the data transfer rate to the user interface 206 which, if not met, causes a default to the lower resolution slices 266 to prevent poor playback quality.
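  • A minimal sketch of this fallback (illustrative TypeScript; the threshold values are assumptions, as the patent does not specify them):

```typescript
interface DisplayParameters {
  pixelDensity: number;  // device pixels per CSS pixel
  bandwidthMbps: number; // measured connection speed
}

// Fall back to the lower resolution slices when the display or the
// connection is below the configured minimum thresholds (assumed values).
function selectSliceSet(params: DisplayParameters): "original" | "lowRes" {
  const MIN_PIXEL_DENSITY = 1.5; // assumption
  const MIN_BANDWIDTH_MBPS = 10; // assumption
  if (params.pixelDensity < MIN_PIXEL_DENSITY) return "lowRes";
  if (params.bandwidthMbps < MIN_BANDWIDTH_MBPS) return "lowRes";
  return "original";
}
```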
  • the loader 246 controls the display of the 360° media content stream 242 as an environment content layer 220 within a meshed content player 210. Once the 360° media content stream 242 has been constructed, the 360° media content stream 242 is displayed through an environment content layer 220 of the meshed content player 210.
  • the loader 246 controls operation of the meshed content player 210 to display the interactive interface layer 222 comprising mapped spatiotemporal hotspot locations 224 as an overlay to the environment content layer 220. Additionally, the loader 246 operates the meshed content player 210 to display the meshed content editor interface 218 as an overlay when the editor mode is selected.
  • the meshed content player 210 is the combination of environment content layer 220 with the interactive interface layer 222 as an overlay and, when the editor mode is selected, the meshed content editor interface 218 as an overlay to both the interactive interface layer 222 and the environment content layer 220.
  • the meshed content player 210 is displayed through a media display window 254.
  • the media display window 254 is an area of the user interface 206 displaying the meshed content player 210. In some instances, the media display window 254 may be accomplished by a browser window.
  • The editor 272 controls creation and modification of the 360° media project presentation.
  • the editor 272 controls creation of spatiotemporal hotspot locations 244 for display through the interactive interface layer 222.
  • the editor 272 receives input controls from the meshed content editor interface 218 to generate spatiotemporal hotspot locations 244 at a coordinate position on the environment content layer 220.
  • The editor 272 configures display settings and spatiotemporal activation settings for the spatiotemporal hotspot locations 244, controlling the appearance of the spatiotemporal hotspot location (e.g., graphical icons) as well as any activation triggers (e.g., playback time, mouse hover-over, other hotspot activation order, etc.).
  • the editor 272 configures the appearance of the spatiotemporal hotspot locations 244 through the interactive interface layer 222 by associating graphical icons with the displayed position of the mapped spatiotemporal hotspot locations 224.
  • the graphical icons displayed through the interactive interface layer 222 for the mapped spatiotemporal hotspot locations 224 may be user provided images uploaded through the meshed content editor interface 218. If the graphical icons are uploaded through the meshed content editor interface 218, the editor 272 stores the graphical icons in the interface layer content allocation of memory 216 within the media project allocation of memory 256.
  • the graphical icons may be generated through the meshed content editor interface 218 using a shape builder tool allowing for geometric shape, color modifications, and other customization options to create a graphical icon.
  • Mapped spatiotemporal hotspot locations 224 displayed through the interactive interface layer 222 may additionally include a text box displaying the name/title of the spatiotemporal hotspot location and/or a brief description of the spatiotemporal hotspot locations.
  • the name/title and the brief description of the spatiotemporal hotspot location may be generated at the same time as the graphical icons are created/uploaded.
  • the name/title of the spatiotemporal hotspot location may be utilized as a descriptive text displayed in the exit node of a media tile icon in the project map 268.
  • The editor 272 may configure non-360° media content to be displayed upon activation of the spatiotemporal hotspot location through input control received through the meshed content editor interface 218.
  • The non-360° media content may include non-360° visual media content (e.g., flat images, videos, three-dimensional models, etc.), audio content, and URLs.
  • The non-360° visual media content, audio content, and URLs are stored within the interface layer content allocation of memory 216.
  • The editor 272 may configure the activation triggers controlling the display of the non-360° visual media content, playback of the audio content, and opening of the URLs through input control received through the meshed content editor interface 218.
  • the activation triggers determine activation of the spatiotemporal hotspot content depending on interactions through the meshed content player 210.
  • Activation triggers may be configured with a playback time of the 360° media content stream 242 in the environment content layer 220, such that a hotspot may activate at a specific time during the playback of the 360° media content stream 242.
  • An activation trigger may be tied to a specific region of the environment content layer 220, such that viewing the specific region through the virtual rotatable viewport may activate a spatiotemporal hotspot location.
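  • A minimal sketch combining the time-based and region-based trigger cases (illustrative TypeScript; the field names and the angular threshold are assumptions):

```typescript
interface HotspotTrigger {
  activateAtSeconds?: number;        // time-based trigger
  regionYawPitch?: [number, number]; // region-based trigger, in degrees
}

function isTriggered(
  trigger: HotspotTrigger,
  playbackSeconds: number,
  viewYawDeg: number,
  viewPitchDeg: number
): boolean {
  // Activate at a specific time during playback of the content stream.
  if (trigger.activateAtSeconds !== undefined && playbackSeconds >= trigger.activateAtSeconds) {
    return true;
  }
  // Activate when the rotatable viewport faces the configured region
  // (10-degree tolerance assumed; yaw wraparound ignored for brevity).
  if (trigger.regionYawPitch) {
    const [yaw, pitch] = trigger.regionYawPitch;
    if (Math.abs(viewYawDeg - yaw) < 10 && Math.abs(viewPitchDeg - pitch) < 10) {
      return true;
    }
  }
  return false;
}
```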
  • the editor 272 operates a correlator 264 to generate mapped spatiotemporal hotspot locations 224.
  • the correlator 264 operates as an association table for matching coordinate positions (i.e., vectors) of 360° media content stream 242 from the virtual rotatable viewport to locations on the interactive interface layer 222.
  • the editor 272 operates the correlator to map a coordinate position on the interactive interface layer to a spatiotemporal hotspot location, in response to detecting a map control from a meshed content editor interface displayed in the meshed content player.
  • the spatiotemporal hotspot locations 244 created through the meshed content editor interface 218 are stored in the media project allocation of memory 256 and displayed as mapped spatiotemporal hotspot locations 224 through the interactive interface layer 222.
  • the editor 272 configures transitional spatiotemporal hotspot locations through the meshed content editor interface 218.
  • the transitional spatiotemporal hotspot locations generate a link between the current 360° media content displayed in the environment content layer 220 and another piece of 360° media content. This relationship is visualized through the project map 268 of the story editor interface 208 as connections between media tile icons 226.
  • the transitional spatiotemporal hotspot location is created similarly to other spatiotemporal hotspot locations through the meshed content editor interface 218 with the additional step of associating another piece of 360° media content to the spatiotemporal hotspot location.
  • the editor 272 maps the association under the media connectors 260 and adds tile connectors 238 to project map 268 connecting the corresponding pieces of the 360° media content.
  • the project sharing interface 252 allows the user to share their media project presentation to others.
  • The analytics engine 258 tracks user interactions in each piece of 360° media content.
  • The analytics engine tracks the position of the virtual rotatable viewport and the duration each user stays at that position.
  • The analytics engine utilizes this information to form a usage distribution for each piece of 360° media content, which can be provided as an overlay and viewed through the editor to assist in positioning of the hotspot locations.
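  • A minimal sketch of accumulating such a usage distribution from viewport samples (illustrative TypeScript; the bin size is an assumption):

```typescript
const BIN_DEGREES = 10; // assumed bin size for the distribution
const dwell = new Map<string, number>(); // "yawBin,pitchBin" -> seconds

// Accumulate how long the viewport stayed at each orientation; the
// resulting map can be rendered as an overlay in the editor.
function recordSample(yawDeg: number, pitchDeg: number, dtSeconds: number): void {
  const key = `${Math.floor(yawDeg / BIN_DEGREES)},${Math.floor(pitchDeg / BIN_DEGREES)}`;
  dwell.set(key, (dwell.get(key) ?? 0) + dtSeconds);
}
```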
  • the virtual/augmented reality content management system 200 may be operated in accordance with the processes described in Figure 4 - Figure 10.
  • a meshed content player 300 comprising an interactive interface layer 302 positioned between a virtual rotatable viewport 308 and an environment content layer 304.
  • the loader 246 operates the content player 230 to generate a 360° media content stream 242 comprising the 360° media slices orthogonally positioned to the virtual rotatable viewport 308 from the media content slices 236 of the uploaded media 240 in the media project allocation of memory 256.
  • A vector line is drawn from the virtual rotatable viewport 308 to the interactive interface layer 302 and corresponds to the coordinate position of a mapped spatiotemporal hotspot location 306.
  • the correlator 264 positions the spatiotemporal hotspot locations 244 on the interactive interface layer 302.
  • the position of spatiotemporal hotspot locations corresponds to the associated coordinate positions of the environment content layer 304 stored in the correlator 264.
  • the spatiotemporal hotspot locations 244 are displayed on the interactive interface layer 302 as the mapped spatiotemporal hotspot locations 306 and are displayed with any associated graphical icons stored in the interface layer content allocation of memory 216 in the media project allocation of memory 256.
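  • A sketch of the vector-line relationship (illustrative TypeScript using three.js, an assumed library): a hotspot's spherical coordinate is projected onto the interactive interface layer at a radius between the camera and the environment content layer:

```typescript
import * as THREE from "three";

// The interactive interface layer sits between the camera (at the
// origin) and the environment content layer; its radius is an assumption.
const INTERFACE_RADIUS = 400;

// Convert a hotspot's yaw/pitch (radians) into a point on the
// interactive interface layer along the vector line from the viewport.
function hotspotPosition(yaw: number, pitch: number): THREE.Vector3 {
  return new THREE.Vector3(
    Math.cos(pitch) * Math.sin(yaw),
    Math.sin(pitch),
    Math.cos(pitch) * Math.cos(yaw)
  ).multiplyScalar(INTERFACE_RADIUS);
}
```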
  • The meshed content player 300 may be operated in accordance with the processes described in Figure 4 - Figure 10.
  • A method 400 of operating a virtual/augmented reality content management system for generating a 360° media project presentation involves receiving media content through a media content uploader interface displayed through a user interface (block 402), and in block 404, slicing 360° media content into 360° media slices through operation of a transcoder.
  • the method 400 stores the 360° media slices as uploaded media content in a media project allocation of memory within a controlled memory data structure.
  • the method 400 displays media tile icons in a project map for the uploaded media content through a story editor interface.
  • the method 400 operates a loader.
  • In subroutine block 412, the loader displays a meshed content player in a media display window comprising an environment content layer displaying the 360° media slices geometrically positioned orthogonal to a virtual rotatable viewport, overlaid with an interactive interface layer.
  • the method 400 operates an editor.
  • the editor displays a meshed content editor interface in the meshed content player.
  • the editor configures the display of spatiotemporal hotspot locations through the interactive interface layer.
  • A method 500 of operating a virtual/augmented reality content management system involves displaying a project management window through a user interface comprising a project editor interface, a media uploader interface, a storyboard interface, and a project sharing interface (block 502).
  • the project management window loads a 360° media project presentation from a media project allocation of memory, in response to receiving input control through the project editor interface (block 504).
  • the input control may retrieve an existing 360° media project presentation stored in the media project allocation of memory or generate a new 360° media project presentation.
  • the method 500 receives 360° media content through media content uploader interface and uploads the 360° media content to a transcoder.
  • the transcoder is operated to generate 360° media slices and lower resolution slices of the 360° media content.
  • The method 500 stores the 360° media slices and the lower resolution slices to a media project allocation of memory within a controlled memory data structure.
  • The method 500 displays media tile icons in a project map within a story editor interface.
  • the method 500 selects lower resolution slices of the 360° media content from the media project allocation of memory for a 360° media content stream in response to detecting the user interface display parameters being below a playback settings threshold.
  • the method 500 operates a loader to display the lower resolution slices of the 360° media content as the 360° media content stream in an environment content layer of the meshed content player overlaid with an interactive interface layer.
  • A method 600 of operating a virtual/augmented reality content management system for generating a 360° media project presentation involves operating an editor to designate a starting media tile icon from the media tile icons in the project map (block 602).
  • the starting media tile icon may be designated through selection of a media tile icon listed in a start media selector drop down displayed in the story editor interface.
  • the method 600 initiates the 360° media project presentation with the uploaded media content associated with the starting media tile icon, displayed through the environment content layer of the meshed content player.
  • a method 700 for operating a loader involves generating a 360° media content stream from the uploaded media content by geometrically positioning the 360° media slices orthogonal to a virtual rotatable viewport and assigning a coordinate system for vectors of the virtual rotatable viewport through operation of a content player (block 702).
  • the method 700 displays the 360° media content stream in an environment content layer of a meshed content player.
  • the method 700 overlays an interactive interface layer to the environment content layer through the meshed content player.
  • the method 700 detects an input control for a transitional spatiotemporal hotspot location in the interactive interface layer.
  • the method 700 generates and displays a new content stream from another uploaded media content associated with the transitional spatiotemporal hotspot location in the meshed content player.
  • a method 800 for operating an editor involves detecting a map control from a meshed content editor interface displayed in the meshed content player (block 802).
  • the editor maps a coordinate position on the interactive interface layer to a spatiotemporal hotspot location through operation of a correlator.
  • the editor configures display and spatiotemporal activation of the spatiotemporal hotspot locations through the interactive interface layer.
  • the editor associates other uploaded media content with the transitional spatiotemporal hotspot location in the interactive interface layer.
  • The editor generates and displays connectors between at least one exit node of a media tile icon and the start node of another media tile icon associated with the transitional spatiotemporal hotspot location.
  • a method 900 of operating an editor involves associating graphical icons uploaded or created through the meshed content editor interface for display through the interactive interface layer with the spatiotemporal hotspot locations (block 902).
  • The method 900 associates non-360° visual media content uploaded through the meshed content editor interface for display through the interactive interface layer with the spatiotemporal hotspot locations.
  • The method 900 configures activation triggers for displaying the non-360° visual media content.
  • the method 900 associates a Uniform Resource Locator (URL) with the spatiotemporal hotspot locations.
  • the method 900 configures the activation triggers for opening the URL.
  • the method 900 associates audio content with the spatiotemporal hotspot locations.
  • the method 900 configures the activation triggers for the audio content for initiating playback.
  • The method 900 stores the graphical icons, the non-360° visual media content, and the audio content in an interface layer content allocation of memory of the media project allocation of memory.
  • a method 1000 of operating an editor involves mapping usage interaction data from an analytics engine to the coordinate system of the environment content layer (block 1002).
  • the method 1000 generates and displays a usage interaction distribution overlay from mapped usage interaction data through the meshed content player.
  • A project editor interface 1100 is displayed comprising a project title name field 1102, a project description field 1104, and a project navigation bar 1106.
  • The project title name field 1102 of the project editor interface 1100 provides an entry field to add or modify the title of a project.
  • The project description field 1104 provides an entry field to add or modify the description of the project.
  • The project navigation bar 1106 shows the first icon, "Information", highlighted, indicating that a user is currently on the first step of the story creation process.
  • The user may set a title for the 360° media project presentation through the project title name field 1102 and a description of the 360° media project presentation through the project description field 1104. After the title and description for the project are set, the user may select the "Next" button on the bottom right to proceed to the second step of the story creation process involving the media uploader interface 1200.
  • A media uploader interface 1200 receives a media content 1206 comprising image content 1202 and/or video content 1204 through the drag and drop region 1212. After a user sets the title and description for the 360° media project presentation in the project editor interface 1100, they proceed to the media uploader interface 1200 for the second step in the story creation process.
  • the media uploader interface 1200 includes a project navigation bar 1208 with the "Files" icon highlighted indicating the user is currently in the media uploader interface 1200 and on the second step of the story creation process.
  • The user may upload their 360° media (images or videos) by clicking on the "Add file" button, which opens a menu to choose files from their computer, or may utilize the drag-and-drop option to drag files from their computer to the drag and drop region 1212 into an "Upload Queue".
  • The transcoder slices the uploaded media into chunks that are reassembled into the 360° media project presentation.
  • the transcoder may also generate low-level images (lower resolution images) from the uploaded media to improve the user experience during the playback of the 360° media project presentation.
  • The transcoder may generate 128 image chunks per media content item, as sketched below.
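  • A browser-side sketch of such slicing (illustrative TypeScript; the 16 x 8 grid is an assumption consistent with the 128-chunk figure above):

```typescript
// Slice an equirectangular image into a grid of chunks for upload.
async function sliceImage(image: ImageBitmap, cols = 16, rows = 8): Promise<Blob[]> {
  const w = image.width / cols;
  const h = image.height / rows;
  const slices: Blob[] = [];
  for (let r = 0; r < rows; r++) {
    for (let c = 0; c < cols; c++) {
      const canvas = new OffscreenCanvas(w, h);
      const ctx = canvas.getContext("2d")!;
      // Copy one grid cell of the source image into its own chunk.
      ctx.drawImage(image, c * w, r * h, w, h, 0, 0, w, h);
      slices.push(await canvas.convertToBlob({ type: "image/jpeg" }));
    }
  }
  return slices; // 16 x 8 = 128 chunks by default
}
```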
  • the transcoder begins manipulation of the media content in the upload queue.
  • the drag and drop region 1212 and the content manipulation start control 1210 are replaced with the media content manipulation status panel 1304 indicating the progress of the transcoder manipulating the media content in the upload queue.
  • the media content manipulation status panel 1304 provides a delay media manipulation control button 1306 in the bottom right corner that allows a user to pause and delay the manipulation process and restart it at a later time.
  • the delay media manipulation control button 1306 is replaced with a "Next" button. The user may click the "Next" button to proceed to the story editor interface 1400.
  • the story creation process moves to a story editor interface 1400.
  • the project navigation bar 1422 is shown with the " Story" icon highlighted indicating that the user is in the story editor interface 1400 and on the third step of the story creation process.
  • the story editor interface 1400 provides users with a project map 1402 displaying media tile icons for each of the pieces of uploaded media content and a project map tool bar positioned between the project navigation bar 1422 and the project map 1402.
  • The project map tool bar comprises a start media selector drop down 1414, a meshed content previewer 1416, project map zoom control buttons 1418, and a refresh project map button 1420.
  • the start media selector drop down 1414 is shown above the project map 1402 allowing for the selection of a starting media tile icon 1424 for the 360° media project presentation.
  • The start media selector drop down 1414 allows the user to select a starting media for the 360° media project presentation from the uploaded media.
  • The project map zoom control buttons 1418 allow a user to have a better overview of the uploaded media and the connections between them.
  • The refresh project map button 1420 allows the user to refresh the view of the project map to show any recent changes a user has made that haven't been updated.
  • the meshed content previewer 1416 allows the user to enter the Preview mode for the 360° media project presentation.
  • the story editor interface 1400 includes a project map 1402 displaying all the uploaded media from the media uploader interface 1200.
  • the story editor interface 1400 is shown in a zoomed in view of the project map 1402 displaying a media tile icon 1404 comprising a preview image 1406, a view button 1408, an edit button 1410, and a delete media tile icon button 1412.
  • Each media tile icon may be colored and each color may represent a different type of media.
  • the project map 1402 comprises all the uploaded media represented in colored media tile icons.
  • Each media tile icon 1404 comprises a preview image 1406 of the media content, a colored background, a title of the media, a view button 1408, an edit button 1410, and a delete media tile icon button 1412.
  • the background color of the media tile icon may indicate different types of media content (e.g., image, video, etc.,).
  • the title of media tile may be changed by clicking on it and editing the title in a popup window that appears.
  • Selection of the view button redirects a user into the viewing mode to preview the selected 360° media content.
  • Selection of the edit button redirects the user to an editor mode for creating a variety of hotspots.
  • a storyboard interface 1500 is shown with the project map 1502 zoomed out to show a starting media tile icon 1504 and a plurality of other media tile icons 1510. Although starting media tile icon 1504 and the plurality of other media tile icons 1510 are shown in the project map 1502, progression between the media tile icons has not been established. Both the starting media tile icon 1504 and the plurality of other media tile icons 1510 each comprise a start node 1506. The start node 1506 corresponds to the starting point of view hotspot location of the 360° media content displayed through a meshed content player.
  • the editor may generate and display connectors between the at least one exit node of the media tile icon and the start node of the other media tile icon in response to receiving a link creation control through the project map.
  • a storyboard interface 1600 is shown with the project map 1612 zoomed out showing a starting media tile icon 1602 connected to a plurality of other media tile icons 1606 directly and indirectly through connectors 1610.
  • the connectors 1610 link the media tile icons through the at least one exit node 1604 on one media tile icon to the start node 1608 of another media tile icon.
  • the progression between the media tile icons in the project map 1612 illustrates the story progression for the 360° media project presentation.
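  • The linking model above can be pictured as a small graph. A minimal sketch in TypeScript; the names (MediaNode, Connector, StoryGraph) are illustrative assumptions, not terms from the description.

    type MediaKind = 'image' | 'video';

    interface MediaNode { id: string; kind: MediaKind; title: string; }

    interface Connector {
      fromMediaId: string;  // media tile owning the exit node
      exitNodeId: string;   // e.g. a transitional hotspot on that media
      toMediaId: string;    // media tile whose start node is linked
    }

    class StoryGraph {
      nodes = new Map<string, MediaNode>();
      connectors: Connector[] = [];

      constructor(public startMediaId: string) {}

      link(fromMediaId: string, exitNodeId: string, toMediaId: string): void {
        this.connectors.push({ fromMediaId, exitNodeId, toMediaId });
      }

      // Media reachable in one step from a tile -- the story progression.
      nextMedia(fromMediaId: string): MediaNode[] {
        return this.connectors
          .filter(c => c.fromMediaId === fromMediaId)
          .map(c => this.nodes.get(c.toMediaId))
          .filter((n): n is MediaNode => n !== undefined);
      }
    }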
  • the meshed content player 1700 comprises a meshed content editor interface selector 1728, an environment content layer 1730 and a meshed content editor interface 1732 comprising a create transitional hotspot button 1702, a create information hotspot button 1704, a set initial point of view button 1706, a create guide hotspot button 1708, an enable video loop hotspot button 1710, a disable video loop hotspot button 1712, a create automatic transitional hotspot button 1714, a create spatial audio hotspot button 1716, a create background audio hotspot button 1718, a create URL transitional hotspot button 1720, an embed video content hotspot button 1722, and an embed image or image gallery hotspot button 1724.
  • the meshed content editor interface 1732 is displayed with the hotspot creation buttons.
  • the process of creating hotspots involves clicking on the "Edit button" to enter the "Editing mode", double clicking anywhere on the media to display the meshed content editor interface, and choosing the type of hotspot the user wants to place.
  • the user may additionally choose an Icon or polygon to be displayed with the hotspot.
  • the Icon may be viewed as an icon on the media.
  • the polygon hotspot may set the space where the hotspot may be located.
  • the user may then set the details regarding the hotspot type they selected.
  • the user may switch the meshed content editor interface selector 1728 from the "Edit Mode" to the "Preview Mode” to view the appearance of the hotspot in the media creation project.
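  • The hotspot types offered by the meshed content editor interface (each described below) could be modeled as a discriminated union. A minimal TypeScript sketch; all field names are illustrative assumptions.

    interface IconShape { kind: 'icon'; iconFile?: string; scale: number; }
    interface PolygonShape {
      kind: 'polygon';
      points: [number, number][];  // at least three points on the interface layer
      color: string;
      transparency: number;
    }

    interface HotspotBase { id: string; name?: string; shape: IconShape | PolygonShape; }

    type Hotspot =
      | (HotspotBase & { type: 'transition'; targetMediaId: string })
      | (HotspotBase & { type: 'info'; text?: string; mediaUrl?: string; startOpen: boolean })
      | (HotspotBase & { type: 'setPov'; yawDeg: number; pitchDeg: number })
      | (HotspotBase & { type: 'guide' })
      | (HotspotBase & { type: 'videoLoop'; enabled: boolean })
      | (HotspotBase & { type: 'autoTransition'; targetMediaId: string })
      | (HotspotBase & { type: 'spatialAudio'; audioUrl: string })
      | (HotspotBase & { type: 'backgroundAudio'; audioUrl: string; loop: boolean; delaySec: number })
      | (HotspotBase & { type: 'urlTransition'; url: string })
      | (HotspotBase & { type: 'embedVideo'; url: string; width: number; height: number })
      | (HotspotBase & { type: 'embedImage'; urls: string[]; width: number; height: number });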
  • A transitional hotspot works as a transition from one media item to another.
  • a user may make a transition from image to video, video to image, image to image, or video to video, to be viewed through the environment interface layer.
  • Creating a transitional hotspot involves clicking on the "Edit button" on a media tile icon in the project map, double clicking on the spot (selection input 1726) where the user wants to place the transitional hotspot in the 360° media content, choosing the create transitional hotspot button 1702 from the meshed content editor interface, choosing an icon or polygon hotspot, choosing which media to make the transition to, and then selecting apply to set the transitional hotspot location.
  • a user may upload their own icon, after which the user may set the name, description, icon scale, icon file, and transparency. The user may set the 360° media content where they want the transition to be made. If a user chooses a polygon hotspot, the user would set the space where the polygon hotspot may be located by selecting points on the interactive interface layer that generate a polygon.
  • An information hotspot is represented by an information icon and may provide information to a user through a pop-up.
  • Creating an information hotspot involves clicking on the "Edit button" on a media tile icon in the project map, double clicking on the spot (selection input 1726) in the interactive interface layer where the user wants to place the information hotspot in the 360° media content, choosing the create information hotspot button 1704 from the meshed content editor interface, opening a configuration panel for the information hotspot, configuring the information hotspot through the configuration panel, and then selecting apply to set the information hotspot location.
  • the user may choose an Icon or polygon hotspot, set the displayable name for the information hotspot, the title font, size, color, transparency, and title background color.
  • the user may set the popup information to be displayed by entering text into a field and/or uploading an image or a video to be displayed; the user may select to pause other media playback while the user is viewing the information hotspot through a checkbox displayed in the configuration panel.
  • a start open check box may be provided if the user wants the Info hotspot content to be visible without user action. If the start open check box is selected, the Info hotspot content (e.g., media, text, etc.) may be displayed as open when a user opens the meshed content player in the preview mode and during the story mode when shared to other users.
  • the content associated with the information hotspot may become visible upon detection of an activation trigger, which may be the viewing user clicking on the Info icon or polygon and/or glancing at the information hotspot (a sketch of this trigger follows below).
  • the user may set a "Close button” location and color that sets the "X" (close) button location in the right, left, or the middle of the information popup, and allows choosing a desired color for the "X".
  • the user may set a maximum width in pixels for the pop-up, constraining the window dimensions.
  • the user may set a background color for the pop-up window.
  • the user may set the transparency level of the popup window.
  • the user may also set the scale of the icon, setting the dimensions at which the icon will be displayed on the interactive interface.
  • the user may set a specific icon file to be displayed for the information hotspot from previously uploaded icons. Alternatively, if the user selects "None" option, the default icon for the information hotspot will be displayed. After the user has completed configuration of the information hotspot, the user may select/click on the apply button closing the configuration panel.
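  • A minimal sketch of the click-or-glance activation trigger mentioned above, in TypeScript. The dwell threshold and the per-frame gaze callback are illustrative assumptions.

    class HotspotActivator {
      private gazeTarget: string | null = null;
      private gazeStart = 0;

      constructor(private dwellMs: number,
                  private onActivate: (hotspotId: string) => void) {}

      // A click activates the hotspot immediately.
      handleClick(hotspotId: string): void {
        this.onActivate(hotspotId);
      }

      // Called once per rendered frame with the hotspot under the gaze, if any.
      handleGaze(hotspotId: string | null, nowMs: number): void {
        if (hotspotId !== this.gazeTarget) {
          this.gazeTarget = hotspotId;  // gaze moved: restart the dwell timer
          this.gazeStart = nowMs;
          return;
        }
        if (hotspotId !== null && nowMs - this.gazeStart >= this.dwellMs) {
          this.gazeStart = nowMs;       // avoid re-firing on every frame
          this.onActivate(hotspotId);
        }
      }
    }

    // Example: open the popup on a click or after a 1.5 second glance.
    const activator = new HotspotActivator(1500, id => console.log('open popup', id));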
  • a "Set POV” hotspot allows the user to change the default position of the virtual rotatable camera through the meshed content player.
  • the "Set POV” hotspot is associated with the start node location and is the starting position for viewing the environment content layer through the virtual rotatable camera.
  • Creating a "Set POV” hotspot involves clicking on the "Edit button” on a media tile icon in the problem map, double clicking on a spot (selection input 1726) in the interactive interface layer where the user wants to place the "Set POV” hotspot in the 360° media content, choosing a set initial point of view button 1706 from the meshed content editor interface, opening a configuration panel for the information hotspot, configuring the information hotspot through a configuration panel, and applying the settings.
  • the user may set the name for the "Set POV” hotspot location.
  • the entered name for the "Set POV" hotspot may be associated with the name displayed for the start node in the project map.
  • a guide hotspot allows a user to set graphical elements to help guide users to certain areas of the media; guide hotspots may not be visible by themselves when the media project presentation is being viewed as part of a story.
  • the guide hotspots may serve as guide arrows that appear on screen pointing to the location where the user set the guide hotspot.
  • Creating a guide hotspot involves clicking on the "Edit button" on a media tile icon in the project map, double clicking on any spot (selection input 1726) in the interactive interface layer, choosing the create guide hotspot button 1708 from the meshed content editor interface, and dragging and dropping the guide hotspot to the place the user wants to point out for a viewer.
  • An "Enable Video Loop" button may be provided that allows a user to configure playback of a video displayed in the meshed content player. The user may set the video to autoplay on a loop. The effects of this hotspot may only be visible in the previewing mode/story mode of the media project presentation. Creating an "Enable Video Loop" hotspot involves clicking on the "Edit button" on a media tile icon in the project map, double clicking on a spot (selection input 1726) in the interactive interface layer, and choosing the enable video loop hotspot button 1710 from the meshed content editor interface. The position of the Enable Video Loop hotspot on the interactive interface layer may not matter, as activation of the "Enable Video Loop" hotspot may enable any video to play again when it ends.
  • the disable video autoplay hotspot is an option that may be set by default when video content starts playing automatically in the interactive interface layer.
  • the disable video auto play option allows a user to change the default settings for video playback.
  • Activating the disable autoplay hotspot involves clicking on the "Edit button" on a media tile icon in the project map, double clicking on a spot in the interactive interface layer, and choosing the disable video loop hotspot button 1712 from the meshed content editor interface.
  • the disable video autoplay hotspot may only be visible in Edit Mode, and not in viewing mode.
  • An auto transition hotspot may be an option set for video media content played in the environment content layer.
  • a user may place the auto transition hotspot anywhere in the video media, and it may automatically trigger the transition to another media when the video ends. If a user has previously set the video loop hotspot, the auto transition has priority, so the video may not play again from the start but may transition the user to another media when it ends (see the sketch below).
  • Creating an auto transitional hotspot involves clicking on the "Edit button" on a media tile icon in the project map, double clicking on a spot (selection input 1726) on the interactive interface, choosing the create automatic transitional hotspot button 1714 from the meshed content editor interface, choosing which media to make the transition to through the configuration panel, and then selecting apply to set the auto transitional hotspot location.
  • the user may set the name and description, and choose from a dropdown menu which media a viewer will be transitioned to after video playback ends.
  • the auto transition hotspot may only be visible in an Edit Mode, and not in viewing mode.
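  • A minimal sketch of the end-of-playback behavior above, where an auto transition hotspot takes priority over an enabled video loop. The loadMedia callback is an illustrative assumption.

    interface VideoEndBehavior {
      loopEnabled: boolean;
      autoTransitionTarget?: string; // media id to jump to when playback ends
    }

    function onVideoEnded(video: HTMLVideoElement,
                          behavior: VideoEndBehavior,
                          loadMedia: (mediaId: string) => void): void {
      if (behavior.autoTransitionTarget !== undefined) {
        loadMedia(behavior.autoTransitionTarget); // auto transition wins over the loop
      } else if (behavior.loopEnabled) {
        video.currentTime = 0;                    // restart from the beginning
        void video.play();
      }
    }

    // Wire it to the player's video element:
    // video.addEventListener('ended', () => onVideoEnded(video, behavior, loadMedia));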
  • a spatial audio point hotspot may allow users to present sounds from any direction to draw a listener's attention and give them cues on where to look next.
  • Creating a spatial audio point hotspot involves clicking on the "Edit button" on a media tile icon in the project map, double clicking on the spot (selection input 1726) where the user wants to place the spatial audio point hotspot in the 360° media content, choosing the create spatial audio hotspot button 1716 from the meshed content editor interface, configuring the spatial audio point hotspot through a configuration panel, and applying the settings. In the configuration panel, the user may choose between an icon or polygon type of hotspot for the spatial audio point hotspot.
  • a user may upload their own icon, after which the user may set the name, description, icon scale, icon file, and transparency.
  • For a polygon hotspot, the user would set the space where the polygon hotspot may be located by selecting points on the interactive interface layer that generate a polygon.
  • the user may drag and drop an audio file to a drag and drop region on the configuration panel to upload the media content, select files on their computer through a file explorer, and/or select from previously uploaded audio files through a drop down list.
  • the user may set the displayable name for the spatial audio hotspot as well as the font, size, color, transparency, and title background color. Additionally, the user may have options to edit the spatial audio selection through a drop down list of previously uploaded spatial audio files. After the user finishes configuring the hotspot, the user may select apply through the configuration panel.
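  • A minimal sketch of spatial audio playback using the standard Web Audio API; the description does not name an audio implementation, and the hotspot position values are illustrative.

    async function playSpatialAudio(url: string,
                                    position: [number, number, number]): Promise<void> {
      const ctx = new AudioContext();
      const response = await fetch(url);
      const buffer = await ctx.decodeAudioData(await response.arrayBuffer());

      const source = ctx.createBufferSource();
      source.buffer = buffer;

      const panner = ctx.createPanner();
      panner.panningModel = 'HRTF';         // head-related transfer function for 3D cues
      panner.positionX.value = position[0]; // place the sound at the hotspot location
      panner.positionY.value = position[1];
      panner.positionZ.value = position[2];

      source.connect(panner);
      panner.connect(ctx.destination);
      source.start();
    }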
  • An audio background hotspot may not be visible in viewing mode; the viewer may only hear music playing in the media where a user placed the audio background hotspot.
  • the creation of an audio background involves clicking on the "Edit button" on a media tile icon in the project map, launching the meshed content player in the editor mode, double clicking anywhere on the interactive interface layer (selection input 1726) to launch the meshed content editor interface, choosing the create background audio hotspot button 1718 from the meshed content editor interface, configuring the hotspot details through a configuration panel, and applying the changes.
  • the configuring the audio background through the configuration panel includes selecting or uploading an audio file for playback.
  • the audio file may be uploaded by dragging and dropping an audio file from the user's computer to a drag and drop region in the configuration panel, or by clicking in the field to launch a file explorer and choose the MP3 file from their computer.
  • the configuration panel may include a dropdown menu, where the user may select from previously uploaded audio files.
  • the user may set the background audio to loop through options in the configuration panel; if the audio loop option is set to OFF, the audio playback stops when it comes to an end, whereas if the user sets audio loop to ON, the audio playback starts over when it comes to an end.
  • the user may set the background audio to start on a delay.
  • a user may set a delay for audio playback on a scale. The scale is in seconds, so if a user sets the delay to "2", the audio starts playing again two seconds after it ends (a sketch of this behavior follows below).
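  • A minimal sketch of the loop-with-delay behavior above. The restart is scheduled with a timer rather than the element's native loop flag, which would restart playback immediately.

    function playBackgroundAudio(url: string, loop: boolean,
                                 delaySec: number): HTMLAudioElement {
      const audio = new Audio(url);
      if (loop) {
        audio.addEventListener('ended', () => {
          // Restart delaySec seconds after the track ends.
          setTimeout(() => { void audio.play(); }, delaySec * 1000);
        });
      }
      void audio.play();
      return audio;
    }

    // Example: loop the background track, restarting 2 seconds after it ends.
    // playBackgroundAudio('ambience.mp3', true, 2);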
  • URL transitional hotspots allow a user to embed a link on the interactive interface layer that when selected would redirect the viewer to the desired URL address.
  • Creating a URL transitional hotspot involves clicking on the "Edit button" on a media tile icon in the project map, double clicking on the spot (selection input 1726) where the user wants to place the URL transitional hotspot in the 360° media content, choosing the create URL transitional hotspot button 1720 from the meshed content editor interface, configuring the URL transitional hotspot through a configuration panel, and applying the settings.
  • the configuration includes a text field where a user may enter the destination URL they would like to associate with the URL transitional hotspot.
  • the user may choose between an icon or polygon type of hotspot for the URL transitional hotspot. If a user chooses an icon hotspot, a user may upload their own icon, after which the user may set the name, description, icon scale, icon file, and transparency. If a user chooses a polygon hotspot, the user would set the space where the polygon hotspot may be located by selecting points on the interactive interface layer that generate a polygon. The user may set the displayable name for the URL transitional hotspot as well as the font, size, color, transparency, and title background color.
  • An embedded video hotspot allows a user to embed a video into their media simply by copying and pasting the URL of a video file into the hotspot.
  • Creating an embedded video hotspot involves clicking on the "Edit button" on a media tile icon in the project map, double clicking on the spot (selection input 1726) where the user wants to place the embedded video hotspot in the 360° media content, choosing the embed video content hotspot button 1722 from the meshed content editor interface, configuring the embedded video hotspot through a configuration panel, and applying the settings.
  • the configuration includes a text field where a user may enter the destination URL they would like to associate with the embedded video hotspot.
  • the user may choose between an icon or polygon type of hotspot for the embedded video hotspot. If a user chooses an icon hotspot, a user may upload their own icon, after which the user may set the name, description, icon scale, icon file, and transparency. If a user chooses a polygon hotspot, the user would set the space where the polygon hotspot may be located by selecting points on the interactive interface layer that generate a polygon. The user may set the displayable name for the embedded video hotspot as well as the font, size, color, transparency, and title background color.
  • After configuring the graphical element (i.e., icon or polygon) for the embedded video hotspot, the user would enter the URL address associated with the video they would like to embed. The user would then set the width of the video frame, set the height of the video frame, and set the playback options for other media playback (i.e., pausing other media playback when the activation trigger for the embedded media hotspot is detected). The user may set a "Start Open" option if the user wants the video to start playing immediately when the activation trigger is detected. The user may set a "Close button" location and color that sets the "X" (close) button location in the right, left, or the middle of the embedded video window, and allows choosing a desired color for the "X".
  • the user may configure a video border if they want a border around the video frame.
  • the user may also set the scale of the icon, setting the dimensions of the icon to be displayed on the interactive interface; the icon scale may be set with an incrementor by clicking the + or - button for larger or smaller icons.
  • the user may set a specific icon file to be displayed for the embedded video hotspot from previously uploaded icons. Alternatively, if the user selects the "None" option, the default icon for the embedded video hotspot will be displayed. After the user has completed configuration of the embedded video hotspot, the user may select/click on the apply button, closing the configuration panel.
  • An "embed an image or gallery hotspot” allows a user to create embedded images or galleries that may open when an activation trigger is detected. Creating an embedded image or gallery hotspot involves clicking on the "Edit button” on a media tile icon in problem map, double clicking on a spot (selection input 1726) where the user wants to place the embedded video hotspot in the 360° media content, choosing embed image or image gallery hotspot button 1724 from the meshed content editor interface, configuring the embedded image or gallery hotspot through a configurations panel, and applying the setting.
  • the configuration includes a text field where a user may enter the destination URL they would like to associate with the embedded image or gallery hotspot.
  • the user may choose between an icon or polygon type of hotspot for the embedded image or gallery hotspot. If a user chooses an icon hotspot, a user may upload their own icon, after which the user may set the name, description, icon scale, icon file, and transparency. If a user chooses a polygon hotspot, the user would set the space where the polygon hotspot may be located by selecting points on the interactive interface layer that generate a polygon. The user may set the displayable name for the embedded image or gallery hotspot as well as the font, size, color, transparency, and title background color.
  • After configuring the graphical element (i.e., icon or polygon) for the embedded image or gallery hotspot, the user would enter the URL address associated with the image or gallery they would like to embed. The user would then set the width and height for an image frame. The user may set the playback options for other media playback, pausing other media playback when the activation trigger for the embedded media hotspot is detected. The user may set a "Start Open" option if the user wants the image or gallery to be open immediately when the activation trigger is detected. The user may set a "Close button" location and color that sets the "X" (close) button location in the right, left, or the middle of the embedded image or gallery window, and allows choosing a desired color for the "X".
  • the user may also set the scale of the icon, setting the dimensions of the icon to be displayed on the interactive interface; the icon scale may be set with an incrementor by clicking the + or - button for larger or smaller icons. Additionally, the user may set a specific icon file to be displayed for the embedded image or gallery hotspot from previously uploaded icons. Alternatively, if the user selects the "None" option, the default icon for the embedded image or gallery hotspot will be displayed. After the user has completed configuration of the embedded image or gallery hotspot, the user may select/click on the apply button, closing the configuration panel.
  • the user may also set the window border around the image or gallery frame.
  • the user may select an image file from a dropdown menu or upload it by dragging and dropping the image into a drag and drop region.
  • the user may select an "Add Gallery” option allowing more than one image to be uploaded for the hotspot.
  • the added images may be displayed under the image gallery, allowing the user to organize and delete images from being displayed in the gallery hotspot.
  • Setting polygons for hotspots allows a user to set a larger trigger area for activating a hotspot.
  • Creation of a polygon hotspot involves selecting a hotspot that a user would like to create, selecting the "create polygon hotspot” option in the configuration panel, setting the polygon color and polygon transparency, and then selecting "begin polygon creation” button.
  • the selection of the "begin polygon creation” button refocuses the user back the interactive interface layer for the user to select at least three points on the interactive interface layer defining a region where they would like to create a hotspot. After defining the region, the user may save the settings and associating the perimeter formed by the at least three points as the defined region of the polygon hotspot.
  • the meshed content player 1800 comprises a transitional hotspot editor panel 1802, which comprises a hotspot name input field 1806, a transitional hotspot descriptor field 1808, a transitional hotspot scaler 1810, a transitional hotspot icon selector 1812, an uploaded media content associator 1814, and an apply settings button 1816.
  • a transitional hotspot editor panel 1802 is displayed allowing for the addition of a hotspot name in the hotspot name input field 1806, a description of the hotspot in the transitional hotspot descriptor field 1808, the scale of the icon in the transitional hotspot scaler 1810, the graphical icon for the hotspot in the transitional hotspot icon selector 1812, and selecting the 360° media content that will load upon selection of the transitional hotspot in the uploaded media content associator 1814.
  • Once the hotspot has been configured, the user may select the apply settings button 1816 to save the transitional hotspot to the selected spatiotemporal hotspot location 1804.
  • the meshed content player 1900 comprises a background audio selector panel 1902 comprising an audio file drag and drop region 1908, an audio playback loop selector 1910, an audio file selector drop down 1912, an audio playback delay control 1914, and an apply setting button 1916.
  • the background audio selector panel 1902 is displayed. The user may add a background audio file through the audio file drag and drop region 1908, or alternatively select a previously uploaded file through the audio file selector drop down 1912, then select whether the audio loops through the audio playback loop selector 1910, and configure a playback delay through the audio playback delay control 1914.
  • the meshed content player 2000 comprises a starting point of view hotspot location 2002, an information spatiotemporal hotspot 2004, a transitional spatiotemporal hotspot location 2006, and an audio content hotspot location 2008.
  • Each of the visible hotspot locations comprises a hotspot descriptor 2010, and a graphical icon 2012 visible to an end user going through the 360° media project presentation.
  • the project sharing interface 2100 shows a project navigation bar 2106 indicating a user is on the project sharing interface 2100.
  • the project sharing interface 2100 displays different sharing options for the 360° media project presentation that includes project sharing links 2102 for sharing the 360° media project presentation through various sharing platforms.
  • the project sharing links 2102 may utilize a sharing platform's application programming interface (API) to directly share the 360° media project presentation through various social networks or sharing platforms.
  • a project sharing interface 2100 includes a URL link generator 2104 that generates a URL for directly accessing the 360° media project presentation through a browser.
  • the project sharing interface 2100 may include an iframe code generator 2108 for generating code for embedding the project in a webpage's HTML, as sketched below.
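  • A minimal sketch of an iframe code generator such as 2108. The player URL pattern and the default frame dimensions are illustrative assumptions, not the actual service endpoints.

    function generateEmbedCode(projectId: string,
                               width = 640, height = 360): string {
      const src = `https://player.example.com/projects/${encodeURIComponent(projectId)}`;
      return `<iframe src="${src}" width="${width}" height="${height}" ` +
             `frameborder="0" allowfullscreen></iframe>`;
    }

    // Example output for a hypothetical project id:
    console.log(generateEmbedCode('demo-360-tour'));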
  • Figure 22 is an example block diagram of a computing device 2200 that may incorporate embodiments of the present invention.
  • Figure 22 is merely illustrative of a machine system to carry out aspects of the technical processes described herein, and does not limit the scope of the claims.
  • the computing device 2200 typically includes a monitor or graphical user interface 2202, a data processing system 2220, a communication network interface 2212, input device(s) 2208, output device(s) 2206, and the like.
  • the data processing system 2220 may include one or more processor(s) 2204 that communicate with a number of peripheral devices via a bus subsystem 2218.
  • peripheral devices may include input device(s) 2208, output device(s) 2206, communication network interface 2212, and a storage subsystem, such as a volatile memory 2210 and a nonvolatile memory 2214.
  • the volatile memory 2210 and/or the nonvolatile memory 2214 may store computer-executable instructions, thus forming logic 2222 that, when applied to and executed by the processor(s) 2204, implements embodiments of the processes disclosed herein.
  • the input device(s) 2208 include devices and mechanisms for inputting information to the data processing system 2220. These may include a keyboard, a keypad, a touch screen incorporated into the monitor or graphical user interface 2202, audio input devices such as voice recognition systems, microphones, and other types of input devices.
  • the input device(s) 2208 may be embodied as a computer mouse, a trackball, a track pad, a joystick, wireless remote, drawing tablet, voice command system, eye tracking system, and the like.
  • the input device(s) 2208 typically allow a user to select objects, icons, control areas, text and the like that appear on the monitor or graphical user interface 2202 via a command such as a click of a button or the like.
  • the output device(s) 2206 include devices and mechanisms for outputting information from the data processing system 2220. These may include speakers, printers, infrared LEDs, and so on as well understood in the art.
  • the communication network interface 2212 provides an interface to communication networks (e.g., communication network 2216) and devices external to the data processing system 2220.
  • the communication network interface 2212 may serve as an interface for receiving data from and transmitting data to other systems.
  • Embodiments of the communication network interface 2212 may include an Ethernet interface, a modem (telephone, satellite, cable, ISDN), (asynchronous) digital subscriber line (DSL), FireWire, USB, a wireless communication interface such as Bluetooth or Wi-Fi, a near field communication wireless interface, a cellular interface, and the like.
  • the communication network interface 2212 may be coupled to the communication network 2216 via an antenna, a cable, or the like.
  • the communication network interface 2212 may be physically integrated on a circuit board of the data processing system 2220, or in some cases may be implemented in software or firmware, such as "soft modems", or the like.
  • the computing device 2200 may include logic that enables communications over a network using protocols such as HTTP, TCP/IP, RTP/RTSP, IPX, UDP and the like.
  • the volatile memory 2210 and the nonvolatile memory 2214 are examples of tangible media configured to store computer readable data and instructions to implement various embodiments of the processes described herein.
  • Other types of tangible media include removable memory (e.g., pluggable USB memory devices, mobile device SIM cards), optical storage media such as CD-ROMS, DVDs, semiconductor memories such as flash memories, non-transitory read-only-memories (ROMS), battery-backed volatile memories, networked storage devices, and the like.
  • the volatile memory 2210 and the nonvolatile memory 2214 may be configured to store the basic programming and data constructs that provide the functionality of the disclosed processes and other embodiments thereof that fall within the scope of the present invention.
  • Logic 2222 that implements embodiments of the present invention may be stored in the volatile memory 2210 and/or the nonvolatile memory 2214. Said logic 2222 may be read from the volatile memory 2210 and/or nonvolatile memory 2214 and executed by the processor(s) 2204. The volatile memory 2210 and the nonvolatile memory 2214 may also provide a repository for storing data used by the logic 2222.
  • the volatile memory 2210 and/or the nonvolatile memory 2214 may provide storage location for the analytics engine 258, the correlator 264, the loader 246, the content player 230, the meshed content player 210, the editor 272, the transcoder 214, the method 400, the method 500, the method 600, the method 700, the method 800, the method 900, and the method 1000.
  • the volatile memory 2210 and the nonvolatile memory 2214 may include a number of memories including a main random access memory (RAM) for storage of instructions and data during program execution and a read only memory (ROM) in which read-only non-transitory instructions are stored.
  • the volatile memory 2210 and the nonvolatile memory 2214 may include a file storage subsystem providing persistent (non-volatile) storage for program and data files.
  • the volatile memory 2210 and the nonvolatile memory 2214 may include removable storage systems, such as removable flash memory.
  • the bus subsystem 2218 provides a mechanism for enabling the various components and subsystems of data processing system 2220 to communicate with each other as intended. Although the bus subsystem 2218 is depicted schematically as a single bus, some embodiments may utilize multiple distinct busses.
  • the computing device 2200 may be a device such as a smartphone, a desktop computer, a laptop computer, a rackmounted computer system, a computer server, or a tablet computer device. As commonly known in the art, the computing device 2200 may be implemented as a collection of multiple networked computing devices. Further, the computing device 2200 will typically include operating system logic (not illustrated), the types and nature of which are well known in the art.
  • Terms used herein should be accorded their ordinary meaning in the relevant arts, or the meaning indicated by their use in context, but if an express definition is provided, that meaning controls.
  • Circuitry in this context refers to electrical circuitry having at least one discrete electrical circuit, electrical circuitry having at least one integrated circuit, electrical circuitry having at least one application specific integrated circuit, circuitry forming a general purpose computing device configured by a computer program (e.g., a general purpose computer configured by a computer program which at least partially carries out processes or devices described herein, or a microprocessor configured by a computer program which at least partially carries out processes or devices described herein), circuitry forming a memory device (e.g., forms of random access memory), or circuitry forming a communications device (e.g., a modem, communications switch, or optical-electrical equipment).
  • Firmware in this context refers to software logic embodied as processor-executable instructions stored in read-only memories or media.
  • Hardware in this context refers to logic embodied as analog or digital circuitry.
  • Logic in this context refers to machine memory circuits, non-transitory machine readable media, and/or circuitry which by way of its material and/or material-energy configuration comprises control and/or procedural signals, and/or settings and values (such as resistance, impedance, capacitance, inductance, current/voltage ratings, etc.), that may be applied to influence the operation of a device.
  • Magnetic media, electronic circuits, electrical and optical memory (both volatile and nonvolatile), and firmware are examples of logic.
  • Logic specifically excludes pure signals or software per se (however does not exclude machine memories comprising software and thereby forming configurations of matter).
  • Programmable device in this context refers to an integrated circuit designed to be configured and/or reconfigured after manufacturing.
  • the term "programmable processor” is another name for a programmable device herein.
  • Programmable devices may include programmable processors, such as field programmable gate arrays (FPGAs), configurable hardware logic (CHL), and/or any other type of programmable device.
  • Configuration of the programmable device is generally specified using computer code or data such as a hardware description language (HDL), for example Verilog, VHDL, or the like.
  • a programmable device may include an array of programmable logic blocks and a hierarchy of reconfigurable interconnects that allow the programmable logic blocks to be coupled to each other according to the descriptions in the HDL code.
  • Each of the programmable logic blocks may be configured to perform complex combinational functions, or merely simple logic gates, such as AND and XOR logic blocks.
  • logic blocks also include memory elements, which may be simple latches, flip-flops (hereinafter also referred to as "flops"), or more complex blocks of memory.
  • signals may arrive at input terminals of the logic blocks at separate times.
  • Software in this context refers to logic implemented as processor-executable instructions in a machine memory (e.g. read/write volatile or nonvolatile memory or media).
  • references to “one embodiment” or “an embodiment” do not necessarily refer to the same embodiment, although they may.
  • the words "comprise," "comprising," and the like are to be construed in an inclusive sense as opposed to an exclusive or exhaustive sense; that is to say, in the sense of "including, but not limited to."
  • Words using the singular or plural number also include the plural or singular number respectively, unless expressly limited to a single one or multiple ones.
  • the words “herein,” “above,” “below” and words of similar import when used in this application, refer to this application as a whole and not to any particular portions of this application.
  • association operation may be carried out by an "associator” or “correlator”.
  • switching may be carried out by a “switch”, selection by a “selector”, and so on.
  • There are various implementations by which processes and/or systems described herein can be effected (e.g., hardware, software, or firmware), and the preferred vehicle will vary with the context in which the processes are deployed. If an implementer determines that speed and accuracy are paramount, the implementer may opt for a hardware or firmware implementation; alternatively, if flexibility is paramount, the implementer may opt for a solely software implementation; or, yet again alternatively, the implementer may opt for some combination of hardware, software, or firmware.
  • logic may be distributed throughout one or more devices, and/or may be comprised of combinations of memory, media, processing circuits and controllers, other circuits, and so on. Therefore, in the interest of clarity and correctness, logic may not always be distinctly illustrated in drawings of devices and systems, although it is inherently present therein. The techniques and procedures described herein may be implemented via one or more signal bearing media; examples include, but are not limited to, the following: recordable type media such as floppy disks, hard disk drives, CD ROMs, digital tape, flash drives, SD cards, solid state fixed or removable storage, and computer memory.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Software Systems (AREA)
  • Computer Hardware Design (AREA)
  • Computer Graphics (AREA)
  • Human Computer Interaction (AREA)
  • Computer Security & Cryptography (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)

Abstract

The invention concerns a virtual/augmented reality content management method that comprises uploading media content and operating a transcoder to generate a low-level version of the media content and slices from the low-level version and the original version of the media content; displaying a project map comprising media samples of the uploaded media; linking a pair of uploaded media by connecting a start node on one media sample to another media sample, thereby creating an exit node; configuring a content player to select the slices of the low-level version; and configuring the content player to generate a 360° media content stream by geometrically positioning the slices orthogonal to a virtual rotatable viewport and assigning a coordinate system for vectors of the virtual rotatable viewport.
PCT/US2017/056173 2016-10-12 2017-10-11 Virtual/augmented reality content management system WO2018071562A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201662407422P 2016-10-12 2016-10-12
US62/407,422 2016-10-12

Publications (1)

Publication Number Publication Date
WO2018071562A1 true WO2018071562A1 (fr) 2018-04-19

Family

ID=61905985

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2017/056173 WO2018071562A1 (fr) Virtual/augmented reality content management system

Country Status (1)

Country Link
WO (1) WO2018071562A1 (fr)



Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110102424A1 (en) * 2008-04-02 2011-05-05 Hibbert Ralph Animation Limited Storyboard generation method and system
US20120210255A1 (en) * 2011-02-15 2012-08-16 Kenichirou Ooi Information processing device, authoring method, and program
US20150040074A1 (en) * 2011-08-18 2015-02-05 Layar B.V. Methods and systems for enabling creation of augmented reality content
US20140152698A1 (en) * 2012-12-03 2014-06-05 Samsung Electronics Co., Ltd. Method for operating augmented reality contents and device and system for supporting the same
US20150082168A1 (en) * 2013-09-18 2015-03-19 Nxp B.V. Media content creation

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
VIAR: "Can I use YouTube 360 for Virtual Reality Marketing?", VIRTUALREALITYPOP.COM, 28 February 2015 (2015-02-28), XP055475882, Retrieved from the Internet <URL:https://virtualrealitypop.com/can-i-use-youtube-360-for-virtual-reality-marketing-ed46a8723563> [retrieved on 20150312] *

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11094130B2 (en) 2019-02-06 2021-08-17 Nokia Technologies Oy Method, an apparatus and a computer program product for video encoding and video decoding
CN113691883A (zh) * 2019-03-20 2021-11-23 北京小米移动软件有限公司 Method and apparatus for transmitting viewpoint switching capability in a VR360 application
US11868701B1 (en) 2019-07-01 2024-01-09 Instasize, Inc. Template for creating content item
US20220180452A1 (en) * 2020-06-11 2022-06-09 Instasize, Inc. Automated Web Content Publishing
EP4027221A1 (fr) * 2021-01-11 2022-07-13 Eyeora Limited Procédé, dispositif et système de traitement de média
WO2022148882A1 (fr) * 2021-01-11 2022-07-14 Eyeora Limited Procédé, dispositif et système de traitement multimédia

Similar Documents

Publication Publication Date Title
WO2018071562A1 (fr) Virtual/augmented reality content management system
CN111294663B (zh) Bullet-screen comment processing method and apparatus, electronic device, and computer-readable storage medium
CA2918687C (fr) System and method for multiple angle videos
US8316084B2 (en) System and method for facilitating presentations over a network
US8701008B2 (en) Systems and methods for sharing multimedia editing projects
US11715275B2 (en) User interface and functions for virtual reality and augmented reality
AU2012338567A1 (en) Framework for creating interactive digital content
US20150248722A1 (en) Web based interactive multimedia system
US20150346969A1 (en) Interactive media object development system and method
US20140193138A1 (en) System and a method for constructing and for exchanging multimedia content
Miller et al. MiniDiver: A Novel Mobile Media Playback Interface for Rich Video Content on an iPhone TM
CN114363687B (zh) Method and apparatus for creating an interactive video of a three-dimensional scene
WO2015105804A1 (fr) System and method for generating and using spatial and temporal metadata
JP2023547794A (ja) Video distribution system, method, computing device, and user equipment
CN111857521B (zh) Multi-device management method and apparatus, and integrated display control system
US20160202882A1 (en) Method and apparatus for animating digital pictures
CN113590555B (zh) Data processing method, apparatus, and system
KR101825598B1 (ko) Computer program stored on a recording medium for executing a content providing method, and method and apparatus therefor
KR102516831B1 (ko) Method, computer device, and computer program for providing high-definition video of a region of interest using a single stream
CN113038225B (zh) Video playback method, apparatus, computing device, and storage medium
US20200045374A1 (en) Organizing alternative content for publication
CN116939121A (zh) Multi-resource editing system and multi-resource editing method
CN117911582A (zh) Adaptive editing experience for mixed media content
CN115937378A (zh) Special-effect rendering method and apparatus, computer-readable medium, and electronic device
CN116416347A (zh) Media data generation method and apparatus, computer device, and storage medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17860109

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 24.07.2019)

122 Ep: pct application non-entry in european phase

Ref document number: 17860109

Country of ref document: EP

Kind code of ref document: A1