WO2020093865A1 - Media file, and associated generation method and playback method - Google Patents


Info

Publication number
WO2020093865A1
WO2020093865A1 (PCT/CN2019/112527; CN2019112527W)
Authority
WO
WIPO (PCT)
Prior art keywords
character
interacted
file unit
animation
file
Prior art date
Application number
PCT/CN2019/112527
Other languages
English (en)
Chinese (zh)
Inventor
时陶
马志明
袁刚
何杰
曹翔
Original Assignee
北京小小牛创意科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 北京小小牛创意科技有限公司 filed Critical 北京小小牛创意科技有限公司
Publication of WO2020093865A1 publication Critical patent/WO2020093865A1/fr

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 13/00 Animation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality

Definitions

  • The present application relates to the field of electronic information technology, and in particular to a media file and its generation method and playback method.
  • A story can be shown to readers or viewers in the form of text or video, through books or animations.
  • Books and animations let readers or viewers take in relatively simple text or pictures through sight and hearing.
  • As a result, reading a book or watching an animation presents a story to only a limited extent.
  • The relatively simple text and pictures in books or animations make the presentation of a story monotonous. Readers or viewers can only read or watch; they cannot participate in the stories shown in books or animations, so there is little interaction between the audience and those stories.
  • The embodiments of the present application provide a media file and methods for generating and playing it, which can improve the interaction between the user and the story or game displayed by the animation.
  • An embodiment of the present application provides a media file, including: at least one character file unit, where the character file unit includes a character to be interacted with and at least one action template of that character, the action template being used to set an action of the character to be interacted with; and at least one interactive file unit.
  • The interactive file unit includes interactive prompt information.
  • The interactive prompt information is used to prompt the user to input animated image material while the media file is being played, so that the animated image of the character to be interacted with, as prompted by the interactive prompt information, is obtained from the material input by the user and displayed.
  • An embodiment of the present application further provides a method for generating a media file, including: adding a character to be interacted with and an action template of that character to obtain a character file unit, the action template being used to set an action of the character; adding interactive prompt information based on a first file unit to obtain an interactive file unit, where the first file unit includes the character file unit, and the interactive prompt information is used to prompt the user to input animated image material during playback so that the animated image of the character prompted by the information is obtained from the user's material and displayed; and generating a media file based on a second file unit, where the second file unit includes the character file unit and the interactive file unit.
  • An embodiment of the present application further provides a method for playing a media file, including: calling the character to be interacted with and the action template of that character from the character file unit in the media file, the action template being used to set an action of the character to be interacted with;
  • synthesizing and playing an interactive image frame according to a first image frame factor, to prompt the user to input animated image material,
  • where the first image frame factor includes the interactive prompt information in the interactive file unit of the media file; obtaining the animated image of the character to be interacted with according to the animated image material input by the user; and synthesizing and playing an animated image frame according to a second image frame factor,
  • where the second image frame factor includes the animated image of the character to be interacted with and the action template of that character.
  • An embodiment of the present application provides a media file and methods for generating and playing the media file.
  • The media file includes a character file unit and an interactive file unit.
  • The interactive prompt information in the interactive file unit may prompt the user to input animated image material. The animated image of the character to be interacted with is then obtained from the material input by the user, and that image takes part in the animation displayed by playing the media file; that is, the animated image appears in the displayed animation.
  • In this way, the user can interact with the animation displayed by playing the media file, which improves the interaction between the user and the story or game displayed by the animation.
  • FIG. 1 is a schematic diagram of the content of a media file in an embodiment of this application.
  • FIG. 2 is a schematic diagram of the content of a media file in another embodiment of this application.
  • FIG. 3 is a flowchart of a method for generating a media file according to an embodiment of this application.
  • FIG. 4 is a flowchart of a method for playing a media file in an embodiment of the present application.
  • the embodiments of the present application provide a media file, a generating method and a playing method thereof, which can be applied to a scenario where a user interacts with a media file.
  • the user can interact with the media file, so that the user's creation during the interaction is reflected in the animation or video displayed by playing the media file.
  • For example, the media file in the embodiment of the present application may be played on a learning machine or game device, so that the user can interact with the animation or video the device displays.
  • This improves the user's interaction with the learning content or game content in the learning machine or game device, and can enhance the user's understanding of that content.
  • It also makes the learning content or game content displayed by playing media files more engaging.
  • the media file in the embodiment of the present application is played to the public, and the public may interact with the animation or video displayed by the played media file.
  • the media file in the embodiment of the present application is played to an individual, and the individual can interact with the animation or video displayed by the played media file.
  • the media files in the embodiments of the present application may also display advertisements, so that users can interact with the advertisements to provide more information.
  • the media file and the method for generating and playing the media files provided in the embodiments of the present application can also be applied to other scenes that require content to be displayed in the form of animation or video, which is not limited herein.
  • the media file in this embodiment of the present application may include at least one character file unit and at least one interactive file unit.
  • FIG. 1 is a schematic diagram of content of a media file 10 in an embodiment of the present application.
  • the character file unit 11 includes a character to be interacted with, and at least one action template of the character to be interacted with.
  • the animated image of the character to be interacted can be generated according to the animated image material input by the user, that is, the animated image of the character to be interacted is generated through the interaction between the user and the device.
  • the animation image of the character to be interacted with may also be a default animation image pre-stored in the media file 10.
  • the default animated image of the character to be interacted with is a picture stored in the character file unit 11, and the picture depicts the character to be interacted with.
  • The action template is used to set an action of the character to be interacted with.
  • Each action template can set one action of the character to be interacted with.
  • An action can include multiple sub-actions.
  • For example, a running template sets the running motion of the character to be interacted with, and the running motion can be decomposed into multiple running sub-actions.
  • A humanoid character to be interacted with may thus have a walking action template, a running action template, a jumping action template, and so on.
  • The action template includes keyframes for the action control handles, and effect keyframes.
  • An action control handle is an identification point on the body of the character to be interacted with that can drive a part of the body to produce an action.
  • For example, a joint of the character's body, the center of the body, or an endpoint of the body can serve as an action control handle.
  • Multiple keyframes can be included in the action template.
  • The position of each action control handle is marked in a keyframe.
  • The action control handles can be placed at different positions across the keyframes, so that playing the keyframes in sequence shows a continuous motion animation.
  • The effect keyframes are used to realize at least part of the effects in the animation displayed by the media file 10.
  • The effect keyframes may include pan frames, zoom frames, rotation frames, and opacity frames.
  • Pan frames are used to translate the character to be interacted with, the scene, and other elements in the animation.
  • Zoom frames are used to scale the character to be interacted with, the scene, and other elements in the animation.
  • Rotation frames are used to rotate the character to be interacted with, the scene, and other elements in the animation.
  • Opacity frames are used to indicate the opacity of the character to be interacted with, the scene, and other elements in the image frames during playback.
  • The effect keyframes may also include other effect function frames, which is not limited herein.
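The action-template structure described above (handle keyframes plus effect keyframes) can be sketched as plain data. This is an illustrative model only; the class and field names are assumptions, not the patent's actual file format.

```python
from dataclasses import dataclass, field
from typing import Dict, List, Tuple

@dataclass
class Keyframe:
    """One keyframe: the position of each action control handle at one moment."""
    time: float                                   # offset within the action, in seconds
    handle_positions: Dict[str, Tuple[float, float]]

@dataclass
class EffectKeyframe:
    """One effect keyframe: pan / zoom / rotation / opacity applied to an element."""
    time: float
    effect: str                                   # "pan" | "zoom" | "rotation" | "opacity"
    value: Tuple[float, ...]

@dataclass
class ActionTemplate:
    """Sets one action (e.g. running) of a character to be interacted with."""
    name: str
    keyframes: List[Keyframe] = field(default_factory=list)
    effect_keyframes: List[EffectKeyframe] = field(default_factory=list)

# Two keyframes that move the handles at the knees: displaying them in
# sequence shows a continuous running motion.
run = ActionTemplate(
    name="run",
    keyframes=[
        Keyframe(0.0, {"left_knee": (10.0, 0.0), "right_knee": (12.0, 0.0)}),
        Keyframe(0.5, {"left_knee": (12.0, 1.0), "right_knee": (10.0, 0.0)}),
    ],
    effect_keyframes=[EffectKeyframe(0.0, "opacity", (1.0,))],
)
```

A character could hold several such templates (walk, run, jump), one per action it can perform.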
  • the interactive file unit 12 includes interactive prompt information.
  • The interactive prompt information is used to prompt the user to input animated image material while the media file 10 is being played, so that the animated image of the character to be interacted with can be obtained from the material input by the user.
  • The interactive prompt information may indicate the feature or name of the character to be interacted with that corresponds to the animated image material the user needs to input. For example, the information may indicate that the user needs to input animated image material for a "rabbit".
  • The interactive prompt information may also indicate the specific time at which, and the specific form in which, the user is prompted to input the animated image material.
  • The specific form may be an image or a sound effect; that is, the information may indicate a specific prompt message or prompt sound effect, which is not limited herein.
  • For example, the interactive prompt information may indicate that the prompt message "please draw an apple" is displayed while the media file 10 is being played.
  • Alternatively, it may indicate that the prompt sound effect "please draw an apple" is played while the media file 10 is being played.
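The interactive prompt information above carries a target character, a time, a form, and the prompt itself. A minimal sketch, with field names that are illustrative assumptions:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class InteractivePrompt:
    """Interactive prompt information: when and how to ask the user for material."""
    character: str            # character to be interacted with, e.g. "apple"
    time: float               # playback time (seconds) at which to show the prompt
    form: str                 # "image" or "sound_effect"
    message: Optional[str] = None   # prompt text, or a path to a prompt sound effect

# The example from the text: while the file plays, show "please draw an apple".
prompt = InteractivePrompt(character="apple", time=12.0, form="image",
                           message="please draw an apple")
```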
  • the media file 10 includes a character file unit 11 and an interactive file unit 12.
  • the interactive prompt information in the interactive file unit 12 may prompt the user to input animated image material. Therefore, the animation image of the character to be interacted is obtained according to the animation image material input by the user, and the animation image participates in playing the animation displayed by the media file 10, that is, the animation image is displayed in the animation displayed by playing the media file 10.
  • the user can interact with the animation displayed by playing the media file 10, and the interaction between the user and the story or game displayed by the animation is improved.
  • the user's creation can participate in playing the animation displayed by the media file 10, which improves the interest of the story or game displayed by the animation.
  • Moreover, because image frames are synthesized at playback time rather than stored frame by frame, the media file 10 in the embodiment of the present application occupies a smaller storage volume than a conventional video file.
  • FIG. 2 is a schematic diagram of the content of a media file 10 in another embodiment of this application.
  • the media file 10 in the above embodiment may further include at least one script file unit 13.
  • Each script file unit 13 corresponds to a story or game.
  • FIG. 2 takes a media file 10 including one script file unit 13 as an example, but the media file 10 is not limited to a single script file unit 13. If the media file 10 includes multiple script file units 13, the script file unit 13 to be used may be called according to the user's selection instruction during playback of the media file 10.
  • the script file unit 13 is used to instruct the development of the storyline, so that the character to be interacted with can complete the action specified in the storyline, so that when the media file 10 is played, an animation with the storyline can be displayed.
  • The script file unit 13 in the above embodiment includes at least one curtain file unit 131.
  • One curtain file unit 131 corresponds to one scene of animation.
  • A scene of animation may include multiple image frames; when those image frames are displayed in sequence, a continuous animation is shown.
  • The curtain file unit 131 may carry a label.
  • The label can identify the order of the curtain file unit 131, that is, the order of the scene of animation corresponding to that curtain file unit 131.
  • The order of the curtain file units 131 characterizes the order in which the scenes of animation are displayed while the media file 10 is being played.
  • One curtain file unit 131 may include a character list, an interaction list, and configuration information.
  • the character list can be used to indicate the characters to be interacted in an animation.
  • the characters to be interacted in different scene animations may be different.
  • the character list can record the characters to be interacted in the animation of the scene.
  • the role list can also be regarded as a role index.
  • The interaction list can be used to indicate the interactive prompt information in a scene of animation.
  • The interactive prompt information required may differ between scenes of animation.
  • The interaction list in a curtain file unit 131 may include the interactive prompt information required in the scene of animation corresponding to that curtain file unit 131.
  • If every character to be interacted with in the scene of animation corresponding to a curtain file unit 131 has a default animated image, there is no need to prompt the user to input animated image material.
  • In that case, the interaction list included in that curtain file unit 131 is empty.
  • The configuration information can be used to configure the positions of the characters to be interacted with within the scene of animation, the action templates those characters use in the scene, and the order of the interactive prompt information in the scene, so that the storyline indicated by the curtain file unit 131 is displayed while the media file 10 is being played.
  • Specifically, the configuration information can configure the screen position, appearance time, and appearance order of each character to be interacted with in the scene of animation, the action template that character adopts in the scene, and the start time, during playback of the media file 10, of the action that the template represents.
  • The configuration information can also configure the appearance time and order of the interactive prompt information in the scene of animation. Put differently, the configuration information characterizes the rendering order between the characters to be interacted with and the interactive prompt information associated with a scene of animation.
  • The rendering order here refers to the order of the image frames in which the characters to be interacted with and the interactive prompt information appear in the animation displayed by playing the media file 10.
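Putting the three parts together, one curtain file unit can be modelled as a label plus the character list, interaction list, and configuration information. The field names and the dict-based configuration below are assumptions for illustration, not the patent's format:

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class CurtainFileUnit:
    """One curtain file unit: corresponds to one scene of animation."""
    label: int                                                 # order within the script
    character_list: List[str] = field(default_factory=list)    # characters to be interacted with
    interaction_list: List[str] = field(default_factory=list)  # prompts; empty if every
                                                               # character has a default image
    # Configuration maps an element to its screen position, appearance time,
    # chosen action template, etc. -- i.e. it encodes the rendering order.
    configuration: Dict[str, dict] = field(default_factory=dict)

curtain = CurtainFileUnit(
    label=1,
    character_list=["rabbit"],
    interaction_list=["please draw a rabbit"],
    configuration={"rabbit": {"position": (100, 200), "appears_at": 2.0,
                              "action_template": "run"}},
)
```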
  • the above media file 10 further includes one or more of a sound effect file unit and a scene file unit.
  • the sound effect file unit can be used to provide sound effects.
  • the audio effect file unit may include at least one audio file, and the audio effect is generated when the audio file is played.
  • the format of the audio file is not limited here.
  • the scene file unit can be used to provide a scene.
  • the scene file unit may include at least one scene.
  • A scene is a graphic element; it can be static, such as a still picture.
  • A scene can also be a dynamic graphic element, such as an animated picture.
  • The format of the scene is not limited here.
  • The script file unit 13 may further include one or more of a scene list and a sound effect list.
  • The scene list may be used to indicate the scenes in the storyline of the script file unit 13. The scenes that appear may differ between scenes of animation.
  • The scene list can record the scenes in each scene of animation.
  • The scene list can also be regarded as a scene index: when the media file 10 is being played, the corresponding scene can be called from the scene file unit according to the scene list.
  • The sound effect list is used to indicate the sound effects in the storyline of the script file unit 13.
  • The sound effects that appear may differ between scenes of animation.
  • The sound effect list can record the sound effects in each scene of animation.
  • The sound effect list can also be regarded as a sound effect index: when the media file 10 is being played, the corresponding audio file can be called from the sound effect file unit according to the sound effect list.
  • If the script file unit 13 includes a scene list, the configuration information in the foregoing embodiment may also be used to configure the screen position, appearance time, and the like of each scene within a scene of animation.
  • If the script file unit 13 includes a sound effect list, the configuration information may also be used to configure the playing start and end times and playing volume of each sound effect within a scene of animation.
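The scene list and sound effect list behave as indices: at playback time each listed name is resolved against the corresponding file unit to call the asset. The names and the dict-based lookup below are illustrative assumptions:

```python
# Scene file unit: static or dynamic graphic elements; sound effect file
# unit: audio files. Contents here are made-up examples.
scene_file_unit = {"forest": "forest.png", "house": "house.gif"}
sound_effect_file_unit = {"birdsong": "birdsong.ogg"}

def resolve(names, file_unit):
    """Call the assets named by a scene list or sound effect list."""
    missing = [n for n in names if n not in file_unit]
    if missing:
        raise KeyError(f"assets not found in file unit: {missing}")
    return [file_unit[n] for n in names]

# Lists recorded in the script file unit for one scene of animation.
scene_list = ["forest"]
sound_effect_list = ["birdsong"]
assets = (resolve(scene_list, scene_file_unit)
          + resolve(sound_effect_list, sound_effect_file_unit))
```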
  • FIG. 3 is a flowchart of a method for generating a media file in an embodiment of this application. As shown in FIG. 3, the method for generating the media file may include steps S201 to S203.
  • step S201 add the character to be interacted and the action template of the character to be interacted to obtain a character file unit.
  • The character to be interacted with may or may not have a default animated image.
  • The action template is used to set an action of the character.
  • The default animated image of the character to be interacted with can be imported from an externally drawn animated image.
  • Alternatively, the animated image of the character can be generated from the animated image material input by the user.
  • Adding the character to be interacted with and its action template can be refined as: adding the character to be interacted with; adding action control handles to the character; setting keyframes of the character, including setting the position of each action control handle in each keyframe; and obtaining the action template of the character from all of its keyframes.
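The refined procedure of step S201 (add handles, then position them in each keyframe, then collect the keyframes into a template) can be sketched as a small builder. Function and field names are assumptions for illustration:

```python
def build_action_template(handles, keyframe_positions):
    """Build an action template: every keyframe records the position of the
    action control handles; the full keyframe list is the template (step S201)."""
    template = []
    for t, positions in keyframe_positions:
        unknown = set(positions) - set(handles)
        if unknown:
            raise ValueError(f"positions given for handles never added: {unknown}")
        template.append({"time": t, "positions": dict(positions)})
    return template

# Handles at the joints, centre, and endpoints of the character's body.
handles = ["head", "torso_center", "left_knee", "right_knee"]
walk = build_action_template(handles, [
    (0.0, {"left_knee": (0, 0), "right_knee": (2, 0)}),
    (0.5, {"left_knee": (2, 1), "right_knee": (0, 0)}),
])
```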
  • step S202 based on the first file unit, interactive prompt information is added to obtain an interactive file unit.
  • the first file unit includes a character file unit.
  • the interactive prompt information is used to prompt the user to input an animated image material during the playing of the media file, so that the animated image of the character to be interacted indicated by the interactive prompt information is obtained and displayed according to the animated image material input by the user.
  • In the above example, the character to be interacted with and its action template are associated with a scene of animation, which can be understood as binding the character and the action template to that scene of animation.
  • The interactive prompt information in the above example is likewise associated with a scene of animation; that is, the interactive prompt information can be bound to that scene of animation.
  • The configuration information is used to configure the positions of the characters to be interacted with within the scene of animation, the action templates those characters adopt in the scene, and the order of the interactive prompt information in the scene.
  • configuration information can also be used to configure the screen position, appearance time, etc. of a scene in a scene of an animation.
  • the configuration information can also be used to configure the start and end time and volume of the sound effects in a scene of animation.
  • step S203 a media file is generated based on the second file unit.
  • the second file unit includes a character file unit and an interactive file unit.
  • The above method for generating a media file may further include step S204.
  • In step S204, a storyline is determined, and a script file unit is obtained based on the storyline, the character file unit, and the interactive file unit.
  • the above-mentioned second file unit may further include a script file unit.
  • the script file unit is used to instruct the development of the storyline, so that the characters to be interacted can complete the actions specified in the storyline.
  • Specifically, a new scene of animation may be created, and the characters to be interacted with in that scene are selected from the character file unit.
  • An action template is selected for each character to be interacted with in the scene, and the character and the action template are associated with the scene of animation.
  • The configuration information of the scene of animation is set; from that configuration information, the characters to be interacted with in the scene, and the interactive prompt information of the scene, the curtain file unit corresponding to the scene of animation is obtained. Based on all the curtain file units, the script file unit is obtained.
  • Optionally, the first file unit and the second file unit may each further include a sound effect file unit and/or a scene file unit.
  • The sound effect file unit is used to provide sound effects.
  • The scene file unit is used to provide scenes. If the first file unit includes a sound effect file unit, then after a new scene of animation is created, the sound effects in the sound effect file unit that correspond to the scene may also be associated with it; for example, an audio file is associated with the scene of animation. If the first file unit includes a scene file unit, then after a new scene of animation is created, the scenes in the scene file unit that correspond to it may also be associated with the scene of animation. As in the above embodiment, the association here may specifically be binding.
  • the interactive prompt information in the media file generated by the method for generating a media file in the embodiment of the present application may prompt the user to input an animated image material. Therefore, the animation image of the character to be interacted is obtained according to the animation image material input by the user, and the animation image participates in playing the animation displayed by the media file.
  • the user can interact with the animation displayed by playing the media file, thereby improving the interaction between the user and the story or game displayed by the animation. Moreover, it also improves the interest of the story or game displayed in the animation.
  • the method for playing the media file may include steps S301 to S304.
  • step S301 the character to be interacted and the action template of the character to be interacted in the character file unit in the media file are called.
  • the action template is used to set the action of the character to be interacted.
  • step S302 according to the first image frame factor, interactive image frames are synthesized and played to prompt the user to input an animated image material.
  • the first image frame factor includes interactive prompt information in the interactive file unit in the media file.
  • step S303 according to the animation image material input by the user, the animation image of the character to be interacted is obtained.
  • step S304 using the second image frame factor, an animation image frame is synthesized and played.
  • the second image frame factor includes the animation image of the character to be interacted and the action template of the character to be interacted.
  • By contrast, when an ordinary video file is played, fixed image frames are played: an ordinary video file includes multiple pre-stored image frames,
  • and those image frames are played frame by frame in the order of the time axis.
  • The image frames of the animation displayed by playing the media file in this embodiment are instead rendered in real time during playback; that is, the image frames are synthesized and played in real time from the image frame factors.
  • the first image frame factor and the second image frame factor may further include a sound effect in the sound effect file unit and / or a scene in the scene file unit.
  • the above method for playing a media file may further include the step of calling a script file unit in the media file.
  • the script file unit is used to instruct the development of the storyline so that the character can complete the actions specified in the storyline.
  • The script file unit includes at least one curtain file unit.
  • Each curtain file unit corresponds to one scene of animation.
  • A curtain file unit may have a label, and the label may identify the order of the curtain file unit, that is, the order of the scene of animation corresponding to it. Therefore, during playback of the media file, the curtain file units can be called in order according to their labels, so that the final animation displayed by playing the media file shows the scenes of animation in order.
  • In this way, the animated image material provided by the user can be obtained, and the animated image derived from that material can be substituted into the animation displayed by the media file being played.
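The playback flow of steps S301 to S304 can be sketched as a loop that calls the curtain file units in label order and synthesizes frames in real time. The structure of `media_file` and the `get_user_material` callback are assumptions for illustration:

```python
def play_media_file(media_file, get_user_material):
    """Sketch of steps S301-S304: frames are synthesized in real time from
    image frame factors rather than read from a fixed frame sequence."""
    frames = []
    # S301: call the characters and their action templates from the character file unit.
    characters = media_file["character_file_unit"]
    # Curtain file units are called in the order of their labels.
    for curtain in sorted(media_file["script_file_unit"], key=lambda c: c["label"]):
        for prompt in curtain.get("interaction_list", []):
            # S302: synthesize and play an interactive image frame from the first
            # image frame factor (the interactive prompt information).
            frames.append(("interactive_frame", prompt))
            # S303: obtain the character's animated image from the user's material.
            image = get_user_material(prompt)
            # S304: synthesize and play animated image frames from the second image
            # frame factor (the animated image plus the action template).
            for name in curtain["character_list"]:
                for template in characters[name]:
                    frames.append(("animation_frame", image, template))
    return frames

media = {
    "character_file_unit": {"rabbit": ["run"]},
    "script_file_unit": [{"label": 1, "character_list": ["rabbit"],
                          "interaction_list": ["please draw a rabbit"]}],
}
frames = play_media_file(media, get_user_material=lambda p: "user_drawn_rabbit.png")
```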

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Processing Or Creating Images (AREA)
  • Toys (AREA)

Abstract

The invention relates to a media file, and an associated generation method and playback method. The media file comprises: at least one character file unit (11), the character file unit (11) comprising a character to be interacted with and at least one action template of said character, the action template being used to configure an action of said character; and at least one interactive file unit (12), the interactive file unit (12) comprising interactive prompt information, the interactive prompt information being used to prompt a user to input animated image material while the media file is being played, such that an animated image of said character indicated by the interactive prompt information is obtained and displayed according to the animated image material input by the user. The media file can improve the user's interaction with the story, game, etc. displayed in the animation.
PCT/CN2019/112527 2018-11-06 2019-10-22 Media file, and associated generation method and playback method WO2020093865A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201811313313.7A CN109684487A (zh) 2018-11-06 2018-11-06 媒体文件及其生成方法和播放方法
CN201811313313.7 2018-11-06

Publications (1)

Publication Number Publication Date
WO2020093865A1 true WO2020093865A1 (fr) 2020-05-14

Family

ID=66185693

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/112527 WO2020093865A1 (fr) 2018-11-06 2019-10-22 Media file, and associated generation method and playback method

Country Status (2)

Country Link
CN (1) CN109684487A (fr)
WO (1) WO2020093865A1 (fr)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109684487A (zh) * 2018-11-06 2019-04-26 北京小小牛创意科技有限公司 媒体文件及其生成方法和播放方法

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060217979A1 (en) * 2005-03-22 2006-09-28 Microsoft Corporation NLP tool to dynamically create movies/animated scenes
CN104318602A (zh) * 2014-10-31 2015-01-28 南京偶酷软件有限公司 人物全身动作的动画制作方法
CN106340049A (zh) * 2015-07-15 2017-01-18 中国传媒大学 一种基于动画语义的验证码产生方法
CN107845123A (zh) * 2017-09-20 2018-03-27 珠海金山网络游戏科技有限公司 基于html5将网页输入文字生成口型动画的方法、装置和系统
CN108038160A (zh) * 2017-12-06 2018-05-15 央视动画有限公司 动态动画保存方法、动态动画调用方法及装置
CN109684487A (zh) * 2018-11-06 2019-04-26 北京小小牛创意科技有限公司 媒体文件及其生成方法和播放方法

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20100007702A (ko) * 2008-07-14 2010-01-22 삼성전자주식회사 애니메이션 제작 방법 및 장치
CN103428536A (zh) * 2012-05-22 2013-12-04 中兴通讯股份有限公司 交互式网络电视节目播放方法及装置
CN105589816B (zh) * 2015-12-16 2018-05-08 厦门优芽网络科技有限公司 编译式情景交互动画制作与播放方法
CN105854296A (zh) * 2016-04-21 2016-08-17 苏州探寻文化科技有限公司 一种基于增强现实的互动体验系统
CN106251389B (zh) * 2016-08-01 2019-12-24 北京小小牛创意科技有限公司 制作动画的方法和装置
CN107145346A (zh) * 2017-04-25 2017-09-08 合肥泽诺信息科技有限公司 一种游戏行为控制模块的虚拟框架系统
CN108335346A (zh) * 2018-03-01 2018-07-27 黄淮学院 一种互动动画生成系统


Also Published As

Publication number Publication date
CN109684487A (zh) 2019-04-26

Similar Documents

Publication Publication Date Title
Tsika Nollywood Stars: Media and migration in West Africa and the diaspora
JP2015039216A (ja) メタデータを用いて複数の映像ストリームを処理する方法及び装置
Serafini et al. Picturebooks 2.0: Transmedial features across narrative platforms
US20130187927A1 (en) Method and System for Automated Production of Audiovisual Animations
WO2020093865A1 (fr) Fichier multimédia, et procédé de production et procédé de lecture associés
Jeffries Comics at 300 frames per second: Zack Snyder's 300 and the figural translation of comics to film
Ortabasi Indexing the past: Visual language and translatability in Kon Satoshi's Millennium Actress
Song et al. On a non-web-based multimodal interactive documentary production
US20130182183A1 (en) Hardware-Based, Client-Side, Video Compositing System
CN113391866A (zh) 一种界面展示方法
KR20060030179A (ko) 전자 만화 및 그 제작방법
Ross Technological affordances versus narrative delivery?: the practice of recent virtual reality storytelling
KR101116538B1 (ko) 공연안무제작시스템 및 공연안무제작방법
Mack Finding borderland: intermediality in the films of Marc Forster
Hashim Narrative Techno-enhancement: The Impact of the Digital Visual Effects DVFx) in Creative Narrative Performance
Hughes et al. Accessibility in 360-degree video players
Kuwahara et al. Proposal for a theatre optique simulated experience application
Hutchison Mister Rogers’ Holy Ground: Exploring the Media Phenomenology of the Neighborhood and Its Rituals
Afdile The prospect of Art-Science interplay in filmmaking as research: From Abstract to Implicit film
Las-Casas Cinedesign: typography in motion pictures.
Hilbrand How animation in user interfaces can affect HCI
KR100374329B1 (ko) 영어 연극 동화상 표시방법 및 그 기록매체
Pallant et al. “Everybody Chips in Ten Cents, and Somehow It Seems to Add up to a Dollar”: Exploring the Visual Toolbox of Animation Story Design
Dmytrenko et al. Technological features of video content creation and editing for students specialty «Construction and civil engineering»
Fodel Live Cinema: Context and Liveness

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19882366

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19882366

Country of ref document: EP

Kind code of ref document: A1