WO2013076359A1 - Method, apparatus and computer program product for generation of an animated image associated with multimedia content - Google Patents
Method, apparatus and computer program product for generation of an animated image associated with multimedia content
- Publication number
- WO2013076359A1 WO2013076359A1 PCT/FI2012/051025 FI2012051025W WO2013076359A1 WO 2013076359 A1 WO2013076359 A1 WO 2013076359A1 FI 2012051025 W FI2012051025 W FI 2012051025W WO 2013076359 A1 WO2013076359 A1 WO 2013076359A1
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- objects
- multimedia content
- content
- image
- motion
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T13/00—Animation
- G06T13/20—3D [Three Dimensional] animation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T13/00—Animation
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/10—Processing, recording or transmission of stereoscopic or multi-view image signals
- H04N13/106—Processing image signals
- H04N13/122—Improving the 3D impression of stereoscopic images by modifying image signal contents, e.g. by filtering or adding monoscopic depth cues
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/10—Processing, recording or transmission of stereoscopic or multi-view image signals
- H04N13/106—Processing image signals
- H04N13/128—Adjusting depth or disparity
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/204—Image signal generators using stereoscopic image cameras
- H04N13/207—Image signal generators using stereoscopic image cameras using a single 2D image sensor
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/431—Generation of visual interfaces for content selection or interaction; Content or additional data rendering
- H04N21/4312—Generation of visual interfaces for content selection or interaction; Content or additional data rendering involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/44—Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs
- H04N21/44012—Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs involving rendering scenes according to scene graphs, e.g. MPEG-4 scene graphs
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/442—Monitoring of processes or resources, e.g. detecting the failure of a recording device, monitoring the downstream bandwidth, the number of times a movie has been viewed, the storage space available from the internal hard disk
- H04N21/44213—Monitoring of end-user related data
- H04N21/44218—Detecting physical presence or behaviour of the user, e.g. using sensors to detect if the user is leaving the room or changes his face expression during a TV program
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/47—End-user applications
- H04N21/472—End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content
- H04N21/47205—End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content for manipulating displayed content, e.g. interacting with MPEG-4 objects, editing locally
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/80—Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
- H04N21/81—Monomedia components thereof
- H04N21/8166—Monomedia components thereof involving executable data, e.g. software
- H04N21/8193—Monomedia components thereof involving executable data, e.g. software dedicated tools, e.g. video decoder software or IPMP tool
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N2013/0074—Stereoscopic image analysis
- H04N2013/0081—Depth or disparity estimation from stereoscopic image signals
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/41—Structure of client; Structure of client peripherals
- H04N21/422—Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
- H04N21/42202—Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS] environmental sensors, e.g. for detecting temperature, luminosity, pressure, earthquakes
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/44—Receiver circuitry for the reception of television signals according to analogue transmission standards
- H04N5/445—Receiver circuitry for the reception of television signals according to analogue transmission standards for displaying additional information
- H04N5/45—Picture in picture, e.g. displaying simultaneously another television channel in a region of the screen
Definitions
- a computer program product comprising at least one computer-readable storage medium, the computer-readable storage medium comprising a set of instructions, which, when executed by one or more processors, cause an apparatus to perform at least: facilitating selection of at least one object from a plurality of objects in a multimedia content; accessing an object mobility content associated with the at least one object, the object mobility content being indicative of motion of the plurality of objects in the multimedia content; and generating an animated image associated with the multimedia content based on the selection of the at least one object and the object mobility content associated with the at least one object.
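As an aside not found in the original disclosure, the claimed "object mobility content" can be pictured with a small Python data model: per object, a pixel mask plus per-frame displacement within the multimedia content. The class name, fields, and threshold below are illustrative assumptions only, not the patent's structures.

```python
# Illustrative data model (an assumption, not the patent's own structures):
# "object mobility content" is represented here as, per object, a pixel mask
# plus a per-frame displacement of that object within the multimedia content.
from dataclasses import dataclass
import numpy as np


@dataclass
class ObjectMobilityContent:
    object_id: int
    mask: np.ndarray           # H x W boolean mask locating the object in the first frame
    displacements: np.ndarray  # N x 2 array of per-frame (dy, dx) motion


def is_mobile(mobility: ObjectMobilityContent, threshold: float = 0.5) -> bool:
    """Treat an object as mobile if its mean per-frame displacement exceeds a threshold."""
    return float(np.linalg.norm(mobility.displacements, axis=1).mean()) > threshold
```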
- computer networks such as the Internet, local area networks, wide area networks, and the like; short range wireless communication networks such as Bluetooth® networks, Zigbee® networks, Institute of Electrical and Electronics Engineers (IEEE) 802.11x networks, and the like; wireline telecommunication networks such as the public switched telephone network (PSTN).
- PSTN public switched telephone network
- the controller 108 may include circuitry implementing, among others, audio and logic functions of the device 100.
- the controller 108 may include, but is not limited to, one or more digital signal processor devices, one or more microprocessor devices, one or more processor(s) with accompanying digital signal processor(s), one or more processor(s) without accompanying digital signal processor(s), one or more special-purpose computer chips, one or more field-programmable gate arrays (FPGAs), one or more controllers, one or more application-specific integrated circuits (ASICs), one or more computer(s), various analog-to-digital converters, digital-to-analog converters, and/or other support circuits. Control and signal processing functions of the device 100 are allocated between these devices according to their respective capabilities.
- the instructions may specifically configure the processor 202 to perform the algorithms and/or operations described herein when the instructions are executed.
- the processor 202 may be a processor of a specific device, for example, a mobile terminal or network device adapted for employing embodiments by further configuration of the processor 202 by instructions for performing the algorithms and/or operations described herein.
- the processor 202 may include, among other things, a clock, an arithmetic logic unit (ALU) and logic gates configured to support operation of the processor 202.
- ALU arithmetic logic unit
- a user interface 206 may be in communication with the processor 202.
- Examples of the user interface 206 include, but are not limited to, input interface and/or output user interface.
- the input interface is configured to receive an indication of a user input.
- the output user interface provides an audible, visual, mechanical or other output and/or feedback to the user.
- Examples of the input interface may include, but are not limited to, a keyboard, a mouse, a joystick, a keypad, a touch screen, soft keys, and the like.
- the captured multimedia content may include a mobile background portion and a stationary foreground portion. In some other embodiments, the captured multimedia content may include a mobile background portion and a mobile foreground portion.
- a processing means may be configured to perform the segmentation of the plurality of objects based on the depth map for determining the motion of the plurality of objects.
- An example of the processing means may include the processor 202, which may be an example of the controller 108.
- segmenting may be performed by methods other than depth-map based determination. For example, a user may choose a face portion as an object and segment that object. In an embodiment, the segmenting may be performed in a manner similar to two-dimensional segmentation methods.
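As an illustration of the depth-map based segmentation mentioned above, and as a minimal sketch under the assumption of a per-pixel depth map aligned with the image (a detail the excerpt does not specify), foreground objects could be obtained by thresholding depth and labelling connected regions:

```python
# Minimal sketch of depth-map based object segmentation (illustrative only).
# Assumes `depth` is an H x W array of per-pixel depth values aligned with the image.
import numpy as np
from scipy import ndimage


def segment_by_depth(depth: np.ndarray, foreground_quantile: float = 0.3):
    """Split the scene into labelled foreground objects and a background mask.

    Pixels closer than the given depth quantile are treated as foreground;
    connected foreground regions become candidate objects.
    """
    threshold = np.quantile(depth, foreground_quantile)
    foreground = depth <= threshold                  # closer pixels -> foreground
    labels, num_objects = ndimage.label(foreground)  # connected components
    masks = [labels == i for i in range(1, num_objects + 1)]
    return masks, ~foreground                        # per-object masks, background mask
```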
- the processor 202 is configured to, with the content of the memory 204, and optionally with other components described herein, cause the apparatus 200 to facilitate a selection of at least one object from the plurality of objects for generating the animated image.
- the selected at least one object may be mobile in the animated image while the unselected objects may be stationary.
- the selection of the objects may be swapped in various alternative embodiments.
- the selected objects may be stationary while the unselected objects may be mobile in the animated image.
- the selection of mobile and stationary objects is discussed in more detail in conjunction with FIGURES 3A and 3B.
- the selection of the at least one object is performed by a user action.
- the user action may include a mouse click, a touch on a display of the user interface, a gaze of the user, and the like.
- the selected at least one object may appear highlighted on the user interface.
- the UI 300 is shown, which may be an example of the user interface 206 of the apparatus 200.
- the user interface 300 is caused to display a scene area 310 and an option display area 320.
- the scene area 310 displays a viewfinder of the image capturing and animated image generation application of the apparatus 200. For instance, as the apparatus 200 moves in a direction, the preview of the current scene focused by the camera of the apparatus 200 also changes and is simultaneously displayed in the scene area 310, and the preview displayed on the scene area 310 can be instantaneously captured by the apparatus 200.
- the scene area 310 may display a pre-recorded multimedia content of the apparatus 200.
- FIGURE 4C illustrates selection of the at least one object and/or options by means of a gaze (represented as 410) of a user 412.
- a user may gaze at at least one object displayed on a display screen of a user interface, for example the UI 300.
- the at least one object may be selected for being in motion in the animated image.
- various other objects and/or options may be selected based on the gaze 410 of the user 412.
- the apparatus, for example the apparatus 200, may include sensors and other gaze-detecting means for detecting the gaze or retina of the user for performing gaze-based selection.
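Whether it originates from a mouse click, a touch, or a detected gaze, the selection ultimately reduces to a display coordinate. The sketch below, an illustrative assumption rather than the patent's mechanism, resolves that coordinate against the per-object masks and toggles the object's mobile/stationary role:

```python
# Illustrative only: resolve a user-selection coordinate (from touch, click or
# gaze tracking) to the segmented object whose mask contains that point.
from typing import List, Optional, Set
import numpy as np


def object_at_point(masks: List[np.ndarray], x: int, y: int) -> Optional[int]:
    """Return the index of the object under (x, y), or None for background."""
    for index, mask in enumerate(masks):
        if mask[y, x]:          # masks are indexed [row, column]
            return index
    return None


def toggle_selection(selected: Set[int], object_index: Optional[int]) -> Set[int]:
    """Selecting an object toggles whether it will be mobile in the animated image."""
    if object_index is None:
        return selected
    return selected ^ {object_index}  # symmetric difference toggles membership
```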
- These computer program instructions may also be stored in a computer-readable storage memory (as opposed to a transmission medium such as a carrier wave or electromagnetic signal) that may direct a computer or other programmable apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture the execution of which implements the operations specified in the flowchart.
- the computer program instructions may also be loaded onto a computer or other programmable apparatus to cause a series of operations to be performed on the computer or other programmable apparatus to produce a computer-implemented process such that the instructions, which execute on the computer or other programmable apparatus provide operations for implementing the operations in the flowchart.
- the operations of the method 600 are described with the help of the apparatus 200. However, the operations of the method can be described and/or practiced by using any other apparatus.
- the speed of the motion may be adjusted based on the mode.
- the mode may be indicative of a repetitive and/or non-repetitive motion of the objects.
- the sequence of images may include movement of the at least one object in one direction, and the movement of the object in the other direction may be recreated by playing the sequence of images in the reverse direction.
- an animated image of a person may include a scene of a person walking on a street.
- the motion of the feet in the forward direction may be captured in a sequence of images, say in frames 1 to 10, and the backward motion of the feet may be reconstructed by playing the sequence of images in the reverse direction.
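A minimal sketch of the reverse-playback idea, assuming frames 0..N-1 hold only the forward motion: build a forward-then-backward index sequence, repeat it for the repetitive mode, and subsample it to adjust speed. The function name and parameters are illustrative, not taken from the disclosure.

```python
# Illustrative sketch: recreate backward motion by replaying captured frames in
# reverse, e.g. frames 0..9 forward, then 8..1 backward, looping if the mode is
# "repetitive". The speed factor subsamples frames to speed the motion up.
from typing import List


def boomerang_indices(num_frames: int, repetitions: int = 1, speed: int = 1) -> List[int]:
    forward = list(range(0, num_frames, speed))
    backward = forward[-2:0:-1]          # reverse pass, excluding both endpoints
    cycle = forward + backward
    return cycle * repetitions


# Example: 10 captured frames, played once at normal speed
# -> [0, 1, 2, ..., 9, 8, 7, ..., 1]
print(boomerang_indices(10))
```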
- an animated image associated with the multimedia content is generated based on the selection of the at least one object, the object mobility content and the mode associated with the at least one object. For example, in a multimedia content having two objects in the foreground portion, the user may select only one object to be in motion in the animated image. In that case, the object mobility information associated with the selected object may be accessed, and the other object may be kept still. Also, the first image associated with the background portion of the animated image may be accessed, and the animated image may be generated.
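One way to picture the generation step just described, assuming for illustration that the first frame supplies the still background and that a boolean mask locates the selected object, is to copy only the masked pixels from each successive frame onto that still frame:

```python
# Illustrative generation step: the unselected content is frozen to the first
# frame (the "first image associated with the background portion"), while pixels
# under the selected object's mask are taken from each successive frame.
from typing import List
import numpy as np


def compose_animated_frames(frames: List[np.ndarray],
                            selected_mask: np.ndarray) -> List[np.ndarray]:
    still = frames[0]
    animated = []
    for frame in frames:
        composite = still.copy()
        composite[selected_mask] = frame[selected_mask]  # only the selected object moves
        animated.append(composite)
    return animated
```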
- the animated image generated at block 622 may be stored at block 624. In an embodiment, the animated image may be stored in a memory, for example, the memory 204.
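Storing the generated animated image could, for example, mean encoding the composited frames as a looping GIF. The Pillow-based sketch below is one common approach; the file name and frame duration are arbitrary examples, not values from the disclosure.

```python
# Illustrative storage step: write the composited frames out as a looping
# animated GIF using Pillow.
from typing import List
from PIL import Image
import numpy as np


def save_animated_gif(frames: List[np.ndarray],
                      path: str = "animated_image.gif",
                      duration_ms: int = 80) -> None:
    images = [Image.fromarray(np.asarray(f, dtype=np.uint8)) for f in frames]
    images[0].save(path,
                   save_all=True,
                   append_images=images[1:],
                   duration=duration_ms,  # per-frame display time in milliseconds
                   loop=0)                # 0 = loop forever
```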
- a "computer- readable medium” may be any media or means that can contain, store, communicate, propagate or transport the instructions for use by or in connection with an instruction execution system, apparatus, or device, such as a computer, with one example of an apparatus described and depicted in FIGURES 1 and/or 2.
- a computer-readable medium may comprise a computer-readable storage medium that may be any media or means that can contain or store the instructions for use by or in connection with an instruction execution system, apparatus, or device, such as a computer.
Abstract
In an example embodiment, a method, an apparatus and a computer program product are described. The method comprises facilitating selection of at least one object from a plurality of objects in a multimedia content. The method also comprises accessing an object mobility content associated with the at least one object, the object mobility content being indicative of motion of the plurality of objects in the multimedia content. An animated image associated with the multimedia content is generated based on the selection of the at least one object and the object mobility content associated with the at least one object.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP12851764.6A EP2783349A4 (fr) | 2011-11-24 | 2012-10-25 | Procédé, appareil et produit programme d'ordinateur pour produire une image animée associée à un contenu multimédia |
CN201280054345.4A CN103918010B (zh) | 2011-11-24 | 2012-10-25 | 用于生成与多媒体内容相关联的动画图像的方法、装置和计算机程序产品 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
IN4042CH2011 | 2011-11-24 | ||
IN4042/CHE/2011 | 2011-11-24 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2013076359A1 true WO2013076359A1 (fr) | 2013-05-30 |
Family
ID=48469195
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/FI2012/051025 WO2013076359A1 (fr) | 2011-11-24 | 2012-10-25 | Procédé, appareil et produit programme d'ordinateur pour produire une image animée associée à un contenu multimédia |
Country Status (4)
Country | Link |
---|---|
US (1) | US20140218370A1 (fr) |
EP (1) | EP2783349A4 (fr) |
CN (1) | CN103918010B (fr) |
WO (1) | WO2013076359A1 (fr) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2015196920A1 (fr) * | 2014-06-27 | 2015-12-30 | 努比亚技术有限公司 | Procédé et dispositif de prise de vues pour image dynamique |
CN108810597A (zh) * | 2018-06-25 | 2018-11-13 | 百度在线网络技术(北京)有限公司 | 视频特效处理方法及装置 |
Families Citing this family (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20140351723A1 (en) * | 2013-05-23 | 2014-11-27 | Kobo Incorporated | System and method for a multimedia container |
WO2015009750A1 (fr) * | 2013-07-15 | 2015-01-22 | Fox Broadcasting Company | Fourniture de fichiers de format d'image en mode point à partir de programmes audiovisuels |
US10089786B2 (en) * | 2013-08-19 | 2018-10-02 | Qualcomm Incorporated | Automatic customization of graphical user interface for optical see-through head mounted display with user interaction tracking |
US10021366B2 (en) * | 2014-05-02 | 2018-07-10 | Eys3D Microelectronics, Co. | Image process apparatus |
US10386996B2 (en) * | 2015-06-11 | 2019-08-20 | Microsoft Technology Licensing, Llc | Communicating emotional information via avatar animation |
US10163245B2 (en) * | 2016-03-25 | 2018-12-25 | Microsoft Technology Licensing, Llc | Multi-mode animation system |
US10547776B2 (en) | 2016-09-23 | 2020-01-28 | Apple Inc. | Devices, methods, and graphical user interfaces for capturing and recording media in multiple modes |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20030035412A1 (en) * | 2001-07-31 | 2003-02-20 | Xuejun Wang | Animated audio messaging |
US20050070257A1 (en) * | 2003-09-30 | 2005-03-31 | Nokia Corporation | Active ticket with dynamic characteristic such as appearance with various validation options |
US20070121146A1 (en) * | 2005-11-28 | 2007-05-31 | Steve Nesbit | Image processing system |
US20090096796A1 (en) * | 2007-10-11 | 2009-04-16 | International Business Machines Corporation | Animating Speech Of An Avatar Representing A Participant In A Mobile Communication |
US20090278851A1 (en) * | 2006-09-15 | 2009-11-12 | La Cantoche Production, S.A. | Method and system for animating an avatar in real time using the voice of a speaker |
US20110227932A1 (en) * | 2008-12-03 | 2011-09-22 | Tencent Technology (Shenzhen) Company Limited | Method and Apparatus for Generating Video Animation |
Family Cites Families (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6081278A (en) * | 1998-06-11 | 2000-06-27 | Chen; Shenchang Eric | Animation object having multiple resolution format |
CA2388095A1 (fr) * | 1999-10-22 | 2001-05-03 | Activesky, Inc. | Systeme video oriente-objet |
JP3452893B2 (ja) * | 2000-11-01 | 2003-10-06 | コナミ株式会社 | 表示制御プログラムを記録したコンピュータ読み取り可能な記録媒体、ならびに、表示制御装置および方法 |
US7609271B2 (en) * | 2006-06-30 | 2009-10-27 | Microsoft Corporation | Producing animated scenes from still images |
JP5551867B2 (ja) * | 2008-12-05 | 2014-07-16 | ソニー株式会社 | 情報処理装置、及び情報処理方法 |
JP4752921B2 (ja) * | 2009-01-28 | 2011-08-17 | ソニー株式会社 | 情報処理装置、アニメーション付加方法、及びプログラム |
- 2012
- 2012-10-25 WO PCT/FI2012/051025 patent/WO2013076359A1/fr active Application Filing
- 2012-10-25 EP EP12851764.6A patent/EP2783349A4/fr not_active Withdrawn
- 2012-10-25 CN CN201280054345.4A patent/CN103918010B/zh not_active Expired - Fee Related
- 2012-11-19 US US13/680,883 patent/US20140218370A1/en not_active Abandoned
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20030035412A1 (en) * | 2001-07-31 | 2003-02-20 | Xuejun Wang | Animated audio messaging |
US20050070257A1 (en) * | 2003-09-30 | 2005-03-31 | Nokia Corporation | Active ticket with dynamic characteristic such as appearance with various validation options |
US20070121146A1 (en) * | 2005-11-28 | 2007-05-31 | Steve Nesbit | Image processing system |
US20090278851A1 (en) * | 2006-09-15 | 2009-11-12 | La Cantoche Production, S.A. | Method and system for animating an avatar in real time using the voice of a speaker |
US20090096796A1 (en) * | 2007-10-11 | 2009-04-16 | International Business Machines Corporation | Animating Speech Of An Avatar Representing A Participant In A Mobile Communication |
US20110227932A1 (en) * | 2008-12-03 | 2011-09-22 | Tencent Technology (Shenzhen) Company Limited | Method and Apparatus for Generating Video Animation |
Non-Patent Citations (2)
Title |
---|
JAMES TOMPKIN ET AL.: "Towards Moment Imagery: Automatic Cinemagraphs", VISUAL MEDIA PRODUCTION (CVMP), 2011, pages 87 - 93, XP032074521, DOI: 10.1109/CVMP.2011.16
See also references of EP2783349A4 |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2015196920A1 (fr) * | 2014-06-27 | 2015-12-30 | 努比亚技术有限公司 | Procédé et dispositif de prise de vues pour image dynamique |
US10237490B2 (en) | 2014-06-27 | 2019-03-19 | Nubia Technology Co., Ltd. | Shooting method and shooting device for dynamic image |
CN108810597A (zh) * | 2018-06-25 | 2018-11-13 | 百度在线网络技术(北京)有限公司 | 视频特效处理方法及装置 |
CN108810597B (zh) * | 2018-06-25 | 2021-08-17 | 百度在线网络技术(北京)有限公司 | 视频特效处理方法及装置 |
Also Published As
Publication number | Publication date |
---|---|
CN103918010B (zh) | 2017-06-30 |
US20140218370A1 (en) | 2014-08-07 |
EP2783349A4 (fr) | 2015-05-27 |
EP2783349A1 (fr) | 2014-10-01 |
CN103918010A (zh) | 2014-07-09 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20140218370A1 (en) | Method, apparatus and computer program product for generation of animated image associated with multimedia content | |
US9563977B2 (en) | Method, apparatus and computer program product for generating animated images | |
WO2019141100A1 (fr) | Procédé et dispositif pour afficher un objet supplémentaire, dispositif informatique et support de stockage | |
US9342866B2 (en) | Method, apparatus and computer program product for generating panorama images | |
US9928628B2 (en) | Method, apparatus and computer program product to represent motion in composite images | |
US9443130B2 (en) | Method, apparatus and computer program product for object detection and segmentation | |
US20130300750A1 (en) | Method, apparatus and computer program product for generating animated images | |
US20140359447A1 (en) | Method, Apparatus and Computer Program Product for Generation of Motion Images | |
US10003743B2 (en) | Method, apparatus and computer program product for image refocusing for light-field images | |
EP2680222A1 (fr) | Procédé, appareil et produit de programme informatique permettant de traiter du contenu média | |
US9183618B2 (en) | Method, apparatus and computer program product for alignment of frames | |
US9147226B2 (en) | Method, apparatus and computer program product for processing of images | |
US9269158B2 (en) | Method, apparatus and computer program product for periodic motion detection in multimedia content | |
US20150325040A1 (en) | Method, apparatus and computer program product for image rendering | |
US9158374B2 (en) | Method, apparatus and computer program product for displaying media content | |
US20130107008A1 (en) | Method, apparatus and computer program product for capturing images | |
US10097807B2 (en) | Method, apparatus and computer program product for blending multimedia content | |
US20130215127A1 (en) | Method, apparatus and computer program product for managing rendering of content | |
WO2012131149A1 (fr) | Procédé, appareil et produit programme informatique pour détecter des expressions faciales | |
WO2018002800A1 (fr) | Procédé et appareil pour créer un sous-contenu dans un contenu de réalité virtuelle et son partage | |
CN115278041B (zh) | 图像处理方法、装置、电子设备以及可读存储介质 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 12851764 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2012851764 Country of ref document: EP |
|
NENP | Non-entry into the national phase |
Ref country code: DE |