KR101721231B1 - 4D media manufacture methods of MPEG-V standard base that use media platform - Google Patents

4D media manufacture methods of MPEG-V standard base that use media platform Download PDF

Info

Publication number
KR101721231B1
Authority
KR
South Korea
Prior art keywords
image
mpeg
metadata
effect
standard
Prior art date
Application number
KR1020160018849A
Other languages
Korean (ko)
Inventor
장용석
Original Assignee
(주)다울디엔에스
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by (주)다울디엔에스
Priority to KR1020160018849A
Application granted
Publication of KR101721231B1

Classifications

    • H04N5/23238
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00 Details of television systems
    • H04N5/222 Studio circuitry; Studio devices; Studio equipment
    • H04N5/262 Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects; Cameras specially adapted for the electronic generation of special effects
    • H04N5/2628 Alteration of picture size, shape, position or orientation, e.g. zooming, rotation, rolling, perspective, translation
    • H04N5/265 Mixing

Abstract

The present invention relates to a method for producing 4D media based on the MPEG-V standard using a media platform, in which an image with inserted sensory effects, or a special image, is produced so as to conform to an internationally standardized file. The platform is composed of an image manipulation unit (S10), an XML generation unit (S23), and a sensory-effect production unit (S20). A general image or a special image captured with an action camera, together with the sensory effects linked to that image, is therefore created on the basis of MPEG-V, the international standard specification.

Description

{Method for producing 4D media based on the MPEG-V standard using a media platform}

The present invention provides basic operations such as playback, editing, and storage of video, or of special images captured with an action camera, through a multipurpose video content production platform, and allows sensory-effect metadata to be authored for the corresponding image on the basis of the MPEG-V standard. It relates to a method for producing 4D media based on the MPEG-V standard using a media platform capable of reproducing the image together with the inserted sensory-effect metadata according to the MPEG-V standard.

In recent years, a new type of video has become available with the emergence of 'action cameras' capable of capturing the surrounding environment or nearby objects in 360°, and such video is attracting attention. In addition, as 4D video incorporating wind, scent, and vibration has appeared, viewers' senses are stimulated in various ways, which has opened a new chapter in video production.

For example, Korean Patent Registration No. 10-1580237, entitled "Method and system for providing 4D content production service and content production apparatus therefor", discloses a terminal device that transmits video information including at least one of photographs, text, and moving pictures, together with synopsis information and special-effect selection information corresponding to each playback section of the content, and a content production apparatus that generates 4D content using the special-effect selection information received from the terminal device and transmits the generated content and a special-effect code to the terminal device.

In particular, this conventional technology is characterized in that the user can participate directly in producing 4D content, from selecting the content to creating it.

As another example, Korean Patent No. 10-1075122, entitled "Special effect synchronization device and 4D screening system", has been published. This conventional technology comprises a special-effect synchronization device provided with an information extraction unit that extracts the special-effect information assigned to the time code of an effect table matched with the current time code of the video data being displayed, and a special-effect device that implements the special effect contained in the extracted effect information.

On the other hand, apart from the above conventional technologies, and as in the 4D content production of the aforementioned Korean Patent No. 10-1580237, movie theaters, 4D ride operators, and other companies dealing with 4D video each produce 4D content with their own proprietary technology, build the corresponding sensory-effect presentation devices, and provide them to consumers.

For example, a 4D movie produced for movie theater A cannot be screened at movie theater B: even if the sensory effects are produced at theater A and the movie is then supplied to theater B, theater B has no compatible playback or reading means, so it must repeat tasks such as editing and inserting the sensory effects with its own non-standard technology.

However, no means of solving this fundamental problem has been developed, and because such arbitrary production of 4D video content results in different effects being produced for the same video, a solution is urgently needed.

Korean Patent No. 10-1580237, "Method and system for providing 4D content production service and content production apparatus therefor"
Korean Patent No. 10-1075122, "Special effect synchronization device and 4D screening system"

The present invention has been devised to solve all of the above problems. Its object is to provide a platform on which an action image shot with a 360° camera can be played back, modified, stored, and edited and sensory effects can be inserted, thereby resolving the non-standardized state of current 4D video production so that 4D content can be used easily, without format restrictions, in a variety of places.

Another object is to allow the sensory effects of an image to be modified more precisely and easily, and to make the components of the platform, including the operating UI, adjustable so that non-experts can use it as well.

A method for producing 4D media based on the MPEG-V standard using the media platform proposed in the present invention to accomplish the above objects is as follows.


The method comprises the steps of: capturing an image with a general camera or an action camera; loading the captured image into a sensory-effect authoring and reproduction platform based on the MPEG-V standard; extracting, while the loaded image is played back, sensory-effect metadata on motion, voice, and the surrounding environment based on the MPEG-V standard, and adding interactions for the voice, gestures, and touch of the corresponding section; editing the General Info, Effect Property, and Effect Variable through a UI (user interface) whose functions and positions are adjusted according to the user's preference; converting the metadata of the video into XML based on the MPEG-V standard, publishing it, and managing the XML-converted metadata as a list; and loading the metadata of the video through an MPEG-V standard metadata parser and displaying it on the timeline. By performing these steps in sequence, a standardized 4D image is produced in which sensory effects authored according to the MPEG-V standard are inserted into the image.

To this end, the present invention includes an image manipulation unit (S10) comprising an image storage unit (S11) for creating, storing, and reading projects, managing project lists, and managing the images of each project, an image analysis unit (S12) for analyzing and loading a stored image, and a display unit (S13) for outputting the image and playing, pausing, and stopping it; and a sensory-effect production unit (S20) comprising an attribute editing unit (S21) for editing the sensory-effect metadata of the image based on the MPEG-V standard, a UI control unit (S22) that provides an intuitive UI guiding the editing of sensory-effect attributes and adjusts the function and position of the UI according to the user's preference, an XML generation unit (S23) for publishing the metadata of the video as XML based on the MPEG-V standard, and a metadata loading unit (S24) for loading the metadata of the image through an MPEG-V standard metadata parser and displaying it on a timeline. The attribute editing unit (S21) is characterized in that a general image, or a special image captured with an action camera, and the sensory effects linked to that image are produced on the basis of MPEG-V, the international standard, and the attributes of the image are edited according to the MPEG-V standard.


In addition, the sensory-effect production unit (S20) is characterized in that it is provided with a module for transmitting sensory effects, and the sensory effect produced by this module in cooperation with the image manipulation unit (S10) is transmitted to a sensory-effect reproduction server.
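
For illustration only, a minimal Python sketch of such a transmission is given below; the HTTP endpoint, URL, and payload format are assumptions, since the description does not fix a transmission protocol. In practice the server address would come from the server management UIs described further below.

    # A minimal sketch of handing published sensory-effect XML to a reproduction
    # server over HTTP. The URL, endpoint, and payload format are hypothetical.
    import urllib.request

    def send_to_reproduction_server(effect_xml: str,
                                    url: str = "http://localhost:8080/effects") -> int:
        """POST the effect XML to the reproduction server; return the HTTP status."""
        request = urllib.request.Request(
            url,
            data=effect_xml.encode("utf-8"),
            headers={"Content-Type": "application/xml"},
            method="POST",
        )
        with urllib.request.urlopen(request) as response:
            return response.status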

According to the present invention having the above configuration, when a general image or an image captured with a 360° camera is produced on a platform that follows MPEG-V, the international standard, the result can be used in various places other than where it was produced, regardless of the format of the image.

In addition, since the sensory effects inserted into the image are prepared in accordance with the international standard, there is no need to produce separate sensory effects elsewhere, and production costs are therefore reduced.

In addition, even a novice can easily author sensory effects that stimulate the senses, such as wind, water, and vibration, in an image, and simple motion recognition can also be inserted, making it possible to realize image production for emotion-oriented technology.

FIG. 1 is a configuration diagram of a video content production platform according to a preferred embodiment of the present invention;
FIG. 2 is a configuration diagram for editing the attributes of metadata;
FIG. 3 is a diagram showing a UI for editing the subdivided attributes of metadata;
FIG. 4 is a diagram showing a UI for publishing an image-based sensory effect; and
FIG. 5 is a diagram showing the per-image management function of sensory-effect metadata.

The above and other objects, features, and advantages of the present invention will become more apparent from the following detailed description taken in conjunction with the accompanying drawings.

Advantages and features of the present invention, and methods of achieving them, will become apparent with reference to the embodiments described below in detail together with the accompanying drawings. However, the present invention is not limited to the embodiments disclosed below and may be embodied in many different forms; the scope of the invention is defined only by the appended claims. Throughout the specification, like reference numerals refer to like elements.

The present invention provides basic operations such as playback, editing, and storage of video, or of special images captured with an action camera, through a multipurpose video content production platform, and is based on a method for producing 4D media using a media platform in which sensory-effect objects and various sensory effects are authored for the corresponding image on the basis of the MPEG-V standard and inserted into the image, so that images, or special images with embedded sensory effects, can be operated according to the international standard.

Before describing the present invention, MPEG-V is briefly explained. MPEG-V is an interface standard for communication between the virtual world and the real world. The technology actively connects devices such as fans, scent emitters, and vibrating chairs to the corresponding video, so that effects expressed by the video, such as wind, smell, and vibration, are rendered and viewers become immersed in the video.
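
As a rough illustration of this bridging between the virtual and real world, the Python sketch below maps effect types carried with a video to rendering devices; the device names, mapping, and command format are assumptions for illustration and are not part of the MPEG-V specification.

    # Illustrative mapping from sensory-effect types to rendering devices.
    EFFECT_TO_DEVICE = {
        "Wind": "fan",
        "Scent": "scent emitter",
        "Vibration": "vibrating chair",
    }

    def dispatch_effect(effect_type: str, intensity: float) -> str:
        """Return the command that would be sent to the matching device."""
        device = EFFECT_TO_DEVICE.get(effect_type)
        if device is None:
            return f"no device registered for '{effect_type}'"
        return f"{device}: set intensity to {intensity:.0%}"

    print(dispatch_effect("Wind", 0.6))        # fan: set intensity to 60%
    print(dispatch_effect("Vibration", 0.9))   # vibrating chair: set intensity to 90%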

However, because images containing sensory effects or augmentation objects, and the control devices that drive them, have been built in non-standardized ways, each company that screens such images has had to produce and control its 4D video independently. As a result, an image or sensory effect produced by company A cannot be operated at company B or C, and to get around this, each company must re-produce the image and the sensory effects with its own proprietary technology.

To overcome this problem, the present invention provides a platform based on the MPEG-V standard so that images with inserted sensory effects are produced according to that standard, and offers an extremely simple operating method so that non-experts can also create images and sensory effects.

FIG. 1 is a configuration diagram of a video content production platform according to a preferred embodiment of the present invention, FIG. 2 is a diagram for editing the attributes of metadata, FIG. 3 is a diagram showing a UI for editing the subdivided attributes of metadata, FIG. 4 is a diagram showing a UI for publishing an image-based sensory effect, and FIG. 5 is a diagram showing the per-image management function of sensory-effect metadata.

As shown in FIG. 1, the MPEG-V standard-based 360° video content production and reproduction platform according to the present invention includes an image manipulation unit (S10), which provides the basic functions of a video platform for outputting and playing video, and a sensory-effect production unit (S20), which edits the metadata of the corresponding image according to the MPEG-V international standard, creates sensory-effect objects, and inserts them into the image.

The image manipulation unit (S10) includes an image storage unit (S11) for creating, storing, and reading projects, managing project lists, and managing the images of each project; an image analysis unit (S12) for analyzing and loading a stored image; and a display unit (S13) for outputting the loaded image and playing, pausing, and stopping it.

The image storage unit (S11) largely performs two roles through functions such as project creation, project storage and retrieval, project list management, per-project image management, and timeline object management. In other words, by abstracting a project, it collectively manages information such as the images, sensory effects, attributes, and resources used to create sensory-effect objects, and it executes commands for reading project files.

In addition, the per-project image management function manages the multiple image files and basic information included in a project, and the timeline object management function manages project information and image information through timeline objects.

The image analysis unit (S12) analyzes the attribute information of a completed image, or of an image under production, and displays the analyzed attribute information on the platform, providing the user with information about the image without any extra operation.

It provides a video information display function that generates a timeline from the information of the video and shows information such as the total playback time and the current position on the timeline, video thumbnail and audio waveform support functions, a timeline zoom function, and a timeline time-bar support function for moving the playback position of the video.

The video information display function shows the total playback time of the video, codec information, the editing time of the video, and the editing time of the video effects, and supports a time bar indicating the current position.

The video thumbnail function displays thumbnails at intervals of 1 second, 10 seconds, 30 seconds, or 1 minute, chosen flexibly according to the specified time scale. In addition, in conjunction with the playback section of the display unit (S13) described below, a thumbnail of the corresponding section is shown when the user selects a section of the playback bar. The audio waveform function automatically extracts the audio of the video and displays it as a waveform, which the user can selectively edit.

The timeline zoom function provides an interface for selecting a time scale, and the time scale is offered at intervals of 1 second, 10 seconds, 30 seconds, and 1 minute, like the video thumbnails.
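
A minimal Python sketch of this time-scale selection is given below, assuming only the four scales named above; the function name and return format are illustrative.

    # Thumbnail / timeline-zoom scales named above: 1 s, 10 s, 30 s, 1 min.
    TIME_SCALES_S = (1, 10, 30, 60)

    def thumbnail_times(duration_s: int, scale_s: int) -> list[int]:
        """Timestamps (in seconds) at which thumbnails are sampled for a clip."""
        if scale_s not in TIME_SCALES_S:
            raise ValueError(f"unsupported time scale: {scale_s} s")
        return list(range(0, duration_s + 1, scale_s))

    # A 2-minute clip viewed at the 30-second scale:
    print(thumbnail_times(120, 30))   # [0, 30, 60, 90, 120]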

The timeline time-bar support function moves the time bar of the timeline according to the playback position of the video, in conjunction with the playback bar or the playback information of the display unit (S13) described below; when the time bar is moved, the playback position of the video changes accordingly.

The display unit (S13) loads the analyzed video and shows it on a display. It provides a playback bar indicating the time or point at which the video is played, stopped, or paused, and also allows the video to be played, stopped, and paused through an extended menu provided at the top of the display, so that the user can operate the video in various ways.

That is, the image manipulation unit (S10) comprises the image storage unit (S11), configured to create, store, and read projects, manage project lists, manage per-project images, and manage timeline objects; the image analysis unit (S12), configured to analyze the image and generate its timeline; and the display unit (S13), configured to output the image, control playback operations such as play, stop, and pause, and modify and indicate the playback position in conjunction with the timeline of the image analysis unit. It thereby plays back video, or special images captured with an action camera, and manages the images of each project.

The sensory-effect production unit (S20) includes an attribute editing unit (S21) for editing the sensory-effect metadata based on the MPEG-V standard, a UI control unit (S22) that provides an intuitive user interface (UI) guiding the editing of sensory-effect attributes and adjusts the function and position of the UI according to the user's preference, an XML generation unit (S23) for converting the metadata of the video into XML based on the MPEG-V standard and publishing it, and a metadata loading unit (S24) for loading the metadata of the image through an MPEG-V standard metadata parser and displaying it on the timeline.

As shown in FIG. 2, the attribute editing unit (S21) derives the metadata of the corresponding image from the output image on the basis of the MPEG-V standard. The attribute editing unit provides editing UIs for General Info, Effect Property, and Effect Variable, so that the metadata of the corresponding image can be edited precisely. In particular, the Effect Property editing function is configured to allow property editing through a property grid.

The process of editing the metadata of an image through the attribute editing unit (S21) according to a preferred embodiment of the present invention is briefly described. As shown in FIG. 3, an effect of the metadata (Wind, Fragrance, Vibration, etc.) is selected, together with the direction and intensity with which the effect is to be performed. To edit the General Info, Effect Property, and Effect Variable of the selected effect, the Version, Creator, and Creation Time are set for General Info, and the Effect ID and Location are entered for the Effect Property. The Effect Variable can be operated in an extremely simple manner by entering the Type and SEFragment ID and setting the Start Time, Duration, Active, Intensity Value, and Color. However, the invention is not limited to this embodiment, and various control functions and setting values may naturally be entered depending on the image and the effect.
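
For illustration, the three editable groups named above can be represented as plain data records, as in the Python sketch below; only the field names come from the description, while the grouping into classes and the field types are assumptions.

    # General Info, Effect Property, and Effect Variable as plain data records.
    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class GeneralInfo:
        version: str
        creator: str
        creation_time: str            # e.g. an ISO-8601 timestamp

    @dataclass
    class EffectProperty:
        effect_id: str                # selected effect: Wind, Fragrance, Vibration, ...
        location: str                 # where the effect is rendered

    @dataclass
    class EffectVariable:
        type: str
        se_fragment_id: str
        start_time_ms: int
        duration_ms: int
        active: bool = True
        intensity_value: Optional[float] = None
        color: Optional[str] = None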

The UI control unit (S22) provides the place and functions for editing the metadata of the image so that operations such as creating, modifying, and inserting sensory effects can be performed. As described for the attribute editing unit (S21), editing the metadata of an image requires the editing UIs for General Info, Effect Property, and Effect Variable, and these editing UIs can be moved to positions that suit the user.

The XML generation unit (S23) converts the metadata of the image into XML based on the MPEG-V standard, publishes it, and manages the XML-converted metadata as a list. For reference, XML (eXtensible Markup Language) is a markup metalanguage established as an international standard. It simplifies a specific subset of SGML (Standard Generalized Markup Language) so that it can easily be applied to web documents and is easy to learn; in particular, users can define tags related to the content of a document and share those tags with others. In addition, because XML is designed to transmit structured documents over the web, it guarantees the independence of the elements constituting a document, providing features such as document compatibility, content independence, and ease of changing elements.
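
A minimal Python sketch of such XML publishing and list-based management is given below; the element and attribute names are simplified assumptions rather than the actual MPEG-V schema.

    # Serialize an authored effect list as XML for publishing and list management.
    import xml.etree.ElementTree as ET

    effects = [
        {"type": "Wind",      "start_ms": 0,    "duration_ms": 5000, "intensity": 0.6},
        {"type": "Vibration", "start_ms": 8000, "duration_ms": 1500, "intensity": 0.9},
    ]

    def publish_effects_xml(effects: list[dict]) -> str:
        """Convert the in-memory effect list into a single XML document."""
        root = ET.Element("SensoryEffects")
        for effect in effects:
            ET.SubElement(root, "Effect", {k: str(v) for k, v in effect.items()})
        return ET.tostring(root, encoding="unicode")

    print(publish_effects_xml(effects))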

As described above, the present invention provides a module for generating and storing the metadata of an image on the basis of the MPEG-V standard, so that the list of sensory effects under production is managed collectively and the sensory-effect data is converted into XML, published, or stored. For example, in the case of image-based sensory-effect publishing, the effect data is included in the video when the image format is MP4 and is published as a separate XML file when the format is other than MP4.

Meanwhile, the publishing process for sensory effects differs depending on whether the image format is MP4. When the format is MP4, a module for analyzing the metadata of the MP4 file is provided, the sensory effect is included in the video for publishing, and the edited result of the sensory effect is converted into XML and provided; that is, the sensory-effect XML is published inside the MP4 container, whereas for formats other than MP4 it is published as a separate XML file.
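
A minimal Python sketch of this format-dependent publishing decision follows; the ".sem.xml" sidecar naming is an assumed convention, not taken from the description.

    # Decide where the published sensory-effect XML goes: inside the MP4
    # container for MP4 video, otherwise into a separate XML file.
    import pathlib

    def publish_target(video_path: str) -> str:
        video = pathlib.Path(video_path)
        if video.suffix.lower() == ".mp4":
            return f"embed XML inside the MP4 container of {video.name}"
        return f"publish separate XML file {video.with_suffix('.sem.xml').name}"

    print(publish_target("ride.mp4"))   # embed XML inside the MP4 container of ride.mp4
    print(publish_target("ride.avi"))   # publish separate XML file ride.sem.xml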

As shown in FIG. 4, the publishing function according to a preferred embodiment of the present invention allows New NeoDate and Open NeoDate to be selected or set in NeoDate, the sensory-effect metadata, a Reference MP4 to be added or removed, file support for General XML metadata, SEM-MP4 file support for Timed XML metadata, and other video options.

The metadata loading unit (S24) loads the metadata of the image through an MPEG-V standard metadata parser and displays it on the timeline. In other words, the MPEG-V standard metadata is parsed so that the metadata file is loaded as the sensory-effect object data of the authoring tool, and the metadata loaded through the MPEG-V standard metadata parser is recorded on the timeline in conjunction with it.
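
A minimal Python sketch of loading published metadata back onto a timeline is given below; it parses the simplified XML form used in the earlier sketches, whereas the real platform uses an MPEG-V standard metadata parser.

    # Parse published (simplified) effect XML and lay the effects out on a
    # timeline as (start, end, label) entries.
    import xml.etree.ElementTree as ET

    sem_xml = """<SensoryEffects>
      <Effect type="Wind" start_ms="0" duration_ms="5000"/>
      <Effect type="Vibration" start_ms="8000" duration_ms="1500"/>
    </SensoryEffects>"""

    def load_to_timeline(xml_text: str) -> list[tuple[int, int, str]]:
        entries = []
        for effect in ET.fromstring(xml_text).iter("Effect"):
            start = int(effect.get("start_ms"))
            end = start + int(effect.get("duration_ms"))
            entries.append((start, end, effect.get("type")))
        return sorted(entries)

    for start, end, label in load_to_timeline(sem_xml):
        print(f"{start:>6} .. {end:>6} ms  {label}")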

When a project contains multiple images, the sensory effects of the images are expressed individually: thumbnails of the images are reproduced so that the sensory-effect metadata of each image can be displayed. A module for transmitting sensory effects through a protocol, a function for transmitting sensory effects to the sensory-effect reproduction server, and UIs for adding, deleting, searching, and selecting servers are also provided. The generated sensory-effect lists are therefore managed per server and transmitted to, or linked with, each corresponding image.

The 4D media production and reproduction platform of the present invention having the above functions produces 4D stereoscopic media as follows.
The method includes: capturing an image with a general camera or an action camera; loading the captured image into a sensory-effect authoring and reproduction platform based on the MPEG-V standard; extracting, while the loaded image is played back, sensory-effect metadata on motion, voice, and the surrounding environment based on the MPEG-V standard, and adding interactions for the voice, gestures, and touch of the corresponding section; editing the General Info, Effect Property, and Effect Variable through a UI (user interface) whose functions and positions are adjusted according to the user's preference; converting the metadata of the video into XML based on the MPEG-V standard, publishing it, and managing the XML-converted metadata as a list; and loading the metadata of the video through an MPEG-V standard metadata parser and displaying it on the timeline. By performing these steps in sequence, standardized sensory media objects can be produced.

Accordingly, with the MPEG-V standard-based 360° video content production and reproduction platform of the present invention having the above functions, sensory effects that stimulate the five senses, such as wind, water, and vibration, can easily be inserted into an image and edited, and simple motion recognition can also be inserted, making it possible to realize image production for emotion technology based on an advanced five-senses experience. In addition, since the image is produced on the basis of MPEG-V, the international standard, it can be used not only by its producer but also in various other places.

While the present invention has been shown and described with reference to exemplary embodiments, it should be understood that the invention is not limited to the disclosed embodiments and that various modifications may be made without departing from its spirit and scope. The present invention is therefore intended to cover such modifications and variations provided they fall within the scope of the appended claims and their equivalents.

S10. Image manipulation unit S11. Image storage unit
S12. Image analysis unit S13. Display unit
S20. Sensory-effect production unit S21. Attribute editing unit
S22. UI control unit S23. XML generation unit
S24. Metadata loading unit

Claims (3)

Capturing an image with a general camera or an action camera;
Loading the captured image into a sensory-effect authoring and reproduction platform based on the MPEG-V standard;
Extracting sensory-effect metadata on motion, voice, and the surrounding environment based on the MPEG-V standard while the loaded image is played back, and adding interactions for the voice, gestures, and touch of the corresponding section;
Editing General Info, Effect Property, and Effect Variable through a UI (User Interface) that is configured on the basis of the MPEG-V standard and whose functions and positions are adjusted according to the user's preference;
Converting the metadata of the image into XML based on the MPEG-V standard, publishing it, and managing the XML-converted metadata as a list; and
Loading the metadata of the image through an MPEG-V standard metadata parser and displaying the metadata on a timeline; a method for producing 4D media based on the MPEG-V standard using a media platform, characterized in that 4D media are produced by performing the above steps in sequence.
(Claims 2 and 3 deleted)
KR1020160018849A 2016-02-18 2016-02-18 4D media manufacture methods of MPEG-V standard base that use media platform KR101721231B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020160018849A KR101721231B1 (en) 2016-02-18 2016-02-18 4D media manufacture methods of MPEG-V standard base that use media platform

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020160018849A KR101721231B1 (en) 2016-02-18 2016-02-18 4D media manufacture methods of MPEG-V standard base that use media platform

Publications (1)

Publication Number Publication Date
KR101721231B1 true KR101721231B1 (en) 2017-03-30

Family

ID=58503080

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020160018849A KR101721231B1 (en) 2016-02-18 2016-02-18 4D media manufacture methods of MPEG-V standard base that use media platform

Country Status (1)

Country Link
KR (1) KR101721231B1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107995442A (en) * 2017-12-21 2018-05-04 北京奇虎科技有限公司 Processing method, device and the computing device of video data
KR102377081B1 (en) 2021-07-30 2022-03-22 아이디아이디 주식회사 Apparatus and method for generating digital contents based on recording operation of multi-track

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20100034859A (en) * 2008-09-25 2010-04-02 한국전자통신연구원 Transmitting/receiving multi-device movie based on mpeg-4 single media and method thereof
JP2010263341A (en) * 2009-05-01 2010-11-18 Sony Corp Image processor, image processing method, and program
KR101075122B1 (en) 2011-03-25 2011-10-19 주식회사 포리얼 Apparatus for synchronizing of special effect and 4d projecting ststem
KR20110139868A (en) * 2010-06-24 2011-12-30 전자부품연구원 Supporting system and method for virtual object identification architecture based on a virtual world
KR20130098457A (en) * 2012-02-28 2013-09-05 권혜진 System for providing special effect synchronized movie
KR20130134130A (en) * 2012-05-30 2013-12-10 한국전자통신연구원 Apparatus and method for processing media in convergence media service platform
KR20140124096A (en) * 2013-04-16 2014-10-24 한국전자통신연구원 Apparatus and method for processing media additional information
KR101580237B1 (en) 2013-05-15 2015-12-28 씨제이포디플렉스 주식회사 Method and System for Providing 4D Content Production Service, Content Production Apparatus Therefor

Legal Events

Date Code Title Description
E701 Decision to grant or registration of patent right
GRNT Written decision to grant