EP1941509A1 - Method and apparatus for encoding/decoding - Google Patents

Method and apparatus for encoding/decoding

Info

Publication number
EP1941509A1
Authority
EP
European Patent Office
Prior art keywords
information
area
data
image
animation
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP06799203A
Other languages
German (de)
English (en)
Other versions
EP1941509A4 (fr)
Inventor
Tae Hyeon Kim
Hyouk Jean Cha
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
LG Electronics Inc
Original Assignee
LG Electronics Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from KR1020060066557A external-priority patent/KR101212692B1/ko
Application filed by LG Electronics Inc filed Critical LG Electronics Inc
Publication of EP1941509A1
Publication of EP1941509A4

Classifications

    • G PHYSICS
    • G11 INFORMATION STORAGE
    • G11B INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B27/00 Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B27/10 Indexing; Addressing; Timing or synchronising; Measuring tape travel
    • G11B27/19 Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier
    • G11B27/28 Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier by using information signals recorded by the same method as the main recording
    • G11B27/32 Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier by using information signals recorded by the same method as the main recording on separate auxiliary tracks of the same or an auxiliary record carrier
    • G11B27/322 Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier by using information signals recorded by the same method as the main recording on separate auxiliary tracks of the same or an auxiliary record carrier used signal is digitally coded
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/40 Information retrieval; Database structures therefor; File system structures therefor of multimedia data, e.g. slideshows comprising image and additional audio data
    • G06F16/48 Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/90 Details of database functions independent of the retrieved data types
    • G06F16/95 Retrieval from the web
    • G06F16/953 Querying, e.g. by the use of web search engines
    • G06F16/9537 Spatial or temporal dependent retrieval, e.g. spatiotemporal queries
    • G PHYSICS
    • G11 INFORMATION STORAGE
    • G11B INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B27/00 Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B27/02 Editing, e.g. varying the order of information signals recorded on, or reproduced from, record carriers
    • G11B27/031 Electronic editing of digitised analogue information signals, e.g. audio or video signals
    • G11B27/034 Electronic editing of digitised analogue information signals, e.g. audio or video signals on discs
    • G PHYSICS
    • G11 INFORMATION STORAGE
    • G11B INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B27/00 Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B27/10 Indexing; Addressing; Timing or synchronising; Measuring tape travel
    • G11B27/102 Programmed access in sequence to addressed parts of tracks of operating record carriers
    • G11B27/105 Programmed access in sequence to addressed parts of tracks of operating record carriers of operating discs
    • G PHYSICS
    • G11 INFORMATION STORAGE
    • G11B INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B2220/00 Record carriers by type
    • G11B2220/20 Disc-shaped record carriers

Definitions

  • the present invention relates to a method and apparatus for encoding/decoding multimedia data including a video, an audio, and a text, and more particularly, to an encoding/decoding method and apparatus for sequentially reproducing a plurality of media data, thereby constructing a slide show.
  • Background Art
  • the present invention is to solve at least the problems and disadvantages of the background art.
  • the present invention is to provide a multimedia data structure for efficiently encoding/decoding multimedia data, and a multimedia data encoding/decoding method and apparatus using the same.
  • the encoding method includes generating a data area including a plurality of media data areas; generating a plurality of track areas corresponding to the plurality of media data areas, respectively; and generating an animation area including at least one of grouping information on an animation effect, opacity effect information, size information on an image to which the animation effect is to be applied, and geometrical transform effect information.
  • the track area includes timing information for sequentially reproducing a plurality of media data included in the media data area, and the plurality of media data included in the data area are dependent on one timeline.
  • the decoding method includes receiving multimedia data including: a data area including a plurality of media data areas; a plurality of track areas corresponding to the plurality of media data areas, respectively; and an animation area including at least one of grouping information on an animation effect, opacity effect information, size information on an image to which the animation effect is to be applied, and geometrical transform effect information; and sequentially reproducing a plurality of media data included in the media data area, using the information included in the animation area.
  • the plurality of media data included in the data area are dependent on one timeline.
  • an encoding apparatus includes a data area generator for generating a data area including a plurality of media data areas; a track area generator for generating a plurality of track areas corresponding to the plurality of media data areas, respectively; and an animation area generator for generating an animation area including at least one of grouping information on an animation effect, opacity effect information, size information on an image to which the animation effect is to be applied, and geometrical transform effect information.
  • the track area includes timing information for sequentially reproducing a plurality of media data included in the media data area, and the plurality of media data included in the data area are dependent on one timeline.
  • the decoding apparatus includes a data input unit for receiving multimedia data including: a data area including a plurality of media data areas; a media information area including a plurality of track areas corresponding to the plurality of media data areas, respectively; and an animation area including at least one of grouping information on an animation effect, opacity effect information, size information on an image to which the animation effect is to be applied, and geometrical transform effect information; and a reproducing unit for sequentially reproducing a plurality of media data included in the media data area, using the animation effect information included in the animation area.
  • the plurality of media data included in the data area are dependent on one timeline.
  • a multimedia data structure includes a file type area having information on a file format; a data area including a plurality of media data areas; a plurality of track areas corresponding to the plurality of media data areas, respectively; and a media information area including an animation area including at least one of grouping information on an animation effect, opacity effect information, size information on an image to which the animation effect is to be applied, and geometrical transform effect information.
  • a multimedia data encoding/decoding method and apparatus has the effect of being capable of constructing a slide show with only a small amount of multimedia data.
  • accordingly, the time taken to process and transmit the multimedia data can be reduced.
  • FIG. 1 is a schematic diagram illustrating an entire structure of multimedia data according to the present invention
  • FIG. 2 illustrates a multimedia data structure according to a first exemplary embodiment of the present invention
  • FIG. 3 illustrates a multimedia data structure according to a second exemplary embodiment of the present invention
  • FIG. 4 illustrates a multimedia data structure according to a third exemplary embodiment of the present invention
  • FIG. 5 illustrates timing information on a plurality of media data according to an exemplary embodiment of the present invention
  • FIG. 6 is a block diagram illustrating a construction of an encoding apparatus according to an exemplary embodiment of the present invention
  • FIG. 7 is a block diagram illustrating a construction of a decoding apparatus according to an exemplary embodiment of the present invention
  • FIG. 8 is a flowchart illustrating an encoding method according to the present invention
  • FIG. 9 is a flowchart illustrating a decoding method according to the present invention.
  • FIG. 1 is a schematic diagram illustrating an entire structure of multimedia data according to the present invention.
  • a multimedia data file is comprised of a file type area, a media information area, and a data area.
  • the file type area represents a format of a multimedia data file, and can be expressed in a form or version of the multimedia data file. For example, it can represent that the format of the multimedia data file is an MPEG-4 version 2 format when a file type is "mp42".
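  • As an illustration of the file type area described above, the following sketch reads the leading "ftyp" box of an ISO base media file; the box layout follows the publicly documented ISO/IEC 14496-12 convention cited among the references, and the function and file names are assumptions for illustration, not part of the patent.

```python
import struct

def read_ftyp(path):
    """Read the leading 'ftyp' box of an ISO base media file and return
    (major_brand, minor_version, compatible_brands), or None if the file
    does not start with such a box."""
    with open(path, "rb") as f:
        header = f.read(8)
        if len(header) < 8:
            return None
        size, box_type = struct.unpack(">I4s", header)  # 32-bit size + 4-char type
        if box_type != b"ftyp" or size < 16:
            return None
        payload = f.read(size - 8)
    major_brand = payload[0:4].decode("ascii")
    minor_version = struct.unpack(">I", payload[4:8])[0]
    brands = [payload[i:i + 4].decode("ascii") for i in range(8, len(payload), 4)]
    return major_brand, minor_version, brands

# A file whose file type area carries the brand "mp42" identifies itself
# as an MPEG-4 version 2 file, as in the example above.
# print(read_ftyp("slideshow.mp4"))
```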
  • the data area includes a plurality of media data areas. Each of the media data areas includes media data.
  • the data area includes first, second, and third media data areas. However, the data area can also include four or more, or two or fewer, media data areas.
  • the media data areas can include several types of media data such as image data, audio data, or text data.
  • the image data can be still picture data or moving picture data.
  • the media information area has information on the media data included in the data area. Referring to FIG. 1, it is desirable that the media information area includes a plurality of track areas that correspond to the plurality of media data areas included in the data area, respectively.
  • the media information area can include a first track area, a second track area, and a third track area.
  • the first track area has information on the media data included in the first media data area.
  • the second track area has information on the media data included in the second media data area.
  • the third track area has information on the media data included in the third media data area.
  • the track area included in the media information area can have timing information for sequentially reproducing the media data included in the corresponding media data area, thereby constructing a slide show.
  • the first track area can have information on a duration for reproducing the media data included in the first media data area.
  • the track area can include several pieces of information on the media data.
  • for example, when the media data is music data, its corresponding track area can include musician information or musical composer information.
  • FIG. 2 illustrates a multimedia data structure according to a first exemplary embodiment of the present invention.
  • The media information area can include track areas corresponding to the media data areas, respectively, and a meta area.
  • the track area can be comprised of a media area and a track meta area.
  • the meta area is included in the media information area at the same level as the track areas.
  • the meta area includes information on the media data included in the data area.
  • this information is attribute information for distinguishing the plurality of media data from each other.
  • the meta area includes identification (ID) information and position information on the media data. More desirably, the meta area can include name information, contents type information, ID, position information, and size information on the media data.
  • for example, the first media data area includes N pieces of JPEG image data
  • the second media data area includes MP3 audio data
  • the third media data area includes text data.
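  • A minimal sketch of how the meta area information listed above (ID, name, content type, position, size) might be modelled for these three media data areas; the field names and values are hypothetical, not the patent's normative syntax.

```python
from dataclasses import dataclass

@dataclass
class MetaEntry:
    """One media item as it might be described in the meta area."""
    item_id: int
    name: str
    content_type: str
    position: int   # physical position of the media data in the file
    size: int       # size of the media data in bytes

# Hypothetical entries matching the example above: JPEG images in the first
# media data area, MP3 audio in the second, text in the third.
meta_area = [
    MetaEntry(1, "photo_001", "image/jpeg", position=4096, size=81234),
    MetaEntry(2, "background", "audio/mpeg", position=85330, size=512000),
    MetaEntry(3, "captions", "text/plain", position=597330, size=2048),
]
```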
  • the meta area can include a first meta area and a second meta area, and the first and second meta areas can share and include the information on the media data.
  • the first meta area can include name and content type information on the media data
  • the second meta area can include physical position and size information on the media data.
  • a handler type of the meta area can be designated as "lsr1" and used.
  • a meta area can include an animation area having information on an animation effect to be applied to media data.
  • the animation area can include at least one of grouping information on the animation effect, size information on the media data to which the animation effect is applied, opacity effect information, and geometrical transform information.
  • the grouping information represents a combination of the animation effects to be applied to the media data.
  • the size information describes a variation of an image size when the media data is image data.
  • the opacity effect information describes an image fade-in or fade-out effect.
  • the geometrical transform information describes effects of transition between images, image scale transform, rotation, and skew, and the like.
  • the animation area can include information on a motion path of an image or information on motion paths of objects included in the image.
  • the animation area can include image color change information or image form information.
  • the image form information can be a rectangle, a circle, an oval, a line, a polyline, a polygon, and the like.
  • the animation area can include attribute control information for controlling the attribute of the media data, to realize several animation effects applicable to the media data.
  • a meta area can be positioned on a file level, not included in a media information area.
  • a multimedia data file can be comprised of four areas: a file type area, the meta area, a media information area, and a data area.
  • the animation area can use a language such as Light-weight Application Scene Representation (LASeR), Scalable Vector Graphics (SVG), or Binary Format for Scene (BIFS).
  • the LASeR, SVG, or BIFS can be realized in an extensible Mark-up Language (XML) format or a Binary encoded format.
  • a symbol (<) signifies a start, and a symbol (>) signifies an end.
  • a symbol (/) signifies an end of a context to be defined.
  • a context of <svg> to </svg> is a bundle
  • the "opacity” and “transform” are names of animation attributes or animation effects.
  • the "opacity” and “transform” represent opacity effect and geometrical transform, respectively.
  • a symbol (sum) represents a sum of the animation attributes.
  • a symbol (dur) represents information on the duration for reproducing the image data.
  • a symbol (infinite) signifies indefiniteness.
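  • A minimal sketch, assuming SVG-style syntax, of how the symbols described above can combine into an animation description: the <svg> ... </svg> pair bounds the bundle, "opacity" and "transform" name the animated attributes, additive="sum" sums the animation attributes, and dur carries the reproduction duration (SMIL/SVG spell indefiniteness as "indefinite"). The fragment is illustrative, not the literal text of the patent.

```python
import xml.etree.ElementTree as ET

# An illustrative SVG-style animation description for one image: a fade-in
# (opacity effect) plus a scale change (geometrical transform effect).
animation_xml = """
<svg xmlns="http://www.w3.org/2000/svg">
  <image width="320" height="240" xlink:href="photo_001.jpg"
         xmlns:xlink="http://www.w3.org/1999/xlink">
    <animate attributeName="opacity" from="0" to="1" dur="3s"/>
    <animateTransform attributeName="transform" type="scale"
                      additive="sum" from="1" to="1.2" dur="indefinite"/>
  </image>
</svg>
"""

root = ET.fromstring(animation_xml)
for element in root.iter():
    name = element.get("attributeName")
    if name:
        print(name, "dur =", element.get("dur"))
```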
  • the image data is comprised of one or more samples, and the samples are distinguished from each other in chunk units.
  • the samples can be arranged in a temporal sequence at each chunk.
  • Each sample included in the chunk has its inherent identification number (ID).
  • the inherent identification number (ID) of each sample can be given starting from 1.
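  • A minimal sketch of the chunk/sample organisation described above: samples are grouped into chunks, kept in temporal order, and numbered with inherent identification numbers starting from 1 (the Python structures are assumptions for illustration).

```python
# Chunks of image samples in temporal order (hypothetical file names).
chunks = [
    ["img_001.jpg", "img_002.jpg"],                  # chunk 1
    ["img_003.jpg", "img_004.jpg", "img_005.jpg"],   # chunk 2
    ["img_006.jpg"],                                 # chunk 3
]

# Assign each sample an inherent ID starting from 1 and remember its chunk.
sample_table = {}
sample_id = 1
for chunk_index, chunk in enumerate(chunks, start=1):
    for sample in chunk:
        sample_table[sample_id] = (chunk_index, sample)
        sample_id += 1

print(sample_table[4])   # -> (2, 'img_004.jpg')
```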
  • the track area can include the media area, and the track meta area.
  • the media area includes the timing information for sequentially reproducing the media data, thereby constructing the slide show.
  • the track meta area includes the information on the media data.
  • the timing information on the media data refers to information on the duration or a sequence for reproducing the media data on a timeline. It is desirable that all the media data included in the data area are dependent on one timeline. In other words, it is desirable that the timing information on all the media data included in the data area are expressed on one timeline.
  • each of the media data has its timing information separately; thus, the durations for reproducing the respective media data may not be consistent with each other.
  • the media area can include a first area having the reproduction duration information on the media data; a second area having the position information on the media data; and a third area having the size information on the media data.
  • the media data to be reproduced can be searched using the position and size information included in the second and third areas.
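  • A minimal sketch of that search, assuming the per-sample values of the three areas named above have already been parsed; the helper name and the usage values are hypothetical.

```python
def fetch_sample(fileobj, position: int, size: int) -> bytes:
    """Search the data area for one piece of media data using its position
    and size information and return its bytes."""
    fileobj.seek(position)
    return fileobj.read(size)

# Usage with hypothetical values taken from the second and third areas:
# with open("slideshow.mp4", "rb") as f:
#     jpeg_bytes = fetch_sample(f, position=4096, size=81234)
```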
  • the timing information on the media area can be expressed using the language such as the LASeR, the SVG, or the BIFS.
  • the LASeR, the SVG, or the BIFS can be realized in the XML format or the Binary encoded format.
  • the timing information on all the media data included in the data area can be included in one media area, for example, the media area of the first track area.
  • the media areas can have the timing information on the corresponding media data, respectively.
  • the media area of the first track area can have the timing information on the first media data
  • the media area of the second track area can have the timing information on the second media data
  • the media area of the third track area can have the timing information on the third media data.
  • the track meta area can include information for distinguishing the media data from each other.
  • the attribute information on the media data can be included in the meta area of the media information area, or included in the track meta area of the track area.
  • when the former is used, the information on all the media data included in the data area is desirably included in one meta area.
  • when the latter is used, the information on each media data is desirably divided and positioned in the track meta area included in the corresponding track area.
  • the track meta area can have information on the animation effect.
  • FIG. 3 illustrates a multimedia data structure according to a second exemplary embodiment of the present invention.
  • A data area can include an image data area, an audio data area, and a text data area.
  • A media information area can include a slide show area having information on image data, an audio track area having information on audio data, and a text track area having information on text data.
  • the image data included in the image data area can be still picture data or moving picture data.
  • the image data can be data compressed in a format of Joint Picture Expert Group (JPEG), Moving Picture Expert Group (MPEG)-1, -2, or Advanced Video Coding (AVC).
  • the image data can be data such as various formats of video clips or photographs acquired by a device (not shown) such as a camcorder (not shown) or a portable terminal (not shown).
  • the audio data included in the audio data area can be music data, accompaniment data, or voice data.
  • the audio data can be data compressed in a format of MPEG Layer-3 (MP3) or Advanced Audio Coding (AAC). Alternatively, the audio data can be a result obtained by synthesizing the accompaniment data and the voice data.
  • the accompaniment data can be data expressed by only a musical instrument sound excluding a musician's voice in music.
  • the text data included in the text data area can be data having a character string distinguished in a line unit.
  • each line can be treated as a sample.
  • FIG. 5 illustrates timing information on a plurality of media data according to an exemplary embodiment of the present invention.
  • An image data area has six pieces of image data
  • an audio data area has three pieces of audio data
  • a text data area has four pieces of text data.
  • a media area of a slide show area can have all of the reproduction duration information, the position information, and the size information on the six pieces of image data, the three pieces of audio data, and the four pieces of text data.
  • alternatively, a media area of a slide show area can have reproduction duration information, position information, and size information on the six pieces of image data.
  • a media area of an audio track area has reproduction duration information, position information, and size information on three pieces of audio data.
  • a media area of a text track area can have reproduction duration information, position information, and size information on four pieces of text data.
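  • A sketch of the single shared timeline implied by FIG. 5, with made-up durations: six pieces of image data, three of audio data, and four of text data, each track listing its own reproduction durations but all measured from the same time origin.

```python
tracks = {
    "image": [5, 5, 5, 5, 5, 5],   # six pieces of image data (seconds)
    "audio": [10, 10, 10],         # three pieces of audio data
    "text":  [7, 8, 7, 8],         # four pieces of text data
}

def start_times(durations):
    """Cumulative start time of each sample on the common timeline."""
    t, starts = 0, []
    for d in durations:
        starts.append(t)
        t += d
    return starts

for name, durations in tracks.items():
    print(name, start_times(durations))
```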
  • FIG. 6 is a block diagram illustrating a construction of an encoding apparatus according to an exemplary embodiment of the present invention.
  • the encoding apparatus includes a file type area generator 100, a media information area generator 110, a data area generator 120, and an output unit 130.
  • the file type area generator 100 generates a file type area representing a format of a multimedia data file (Step 300).
  • the media information area generator 110 generates a media information area including information on media data, for example, timing information on the media data included in a data area (Step 310).
  • the data area generator 120 generates a data area including a plurality of media data areas (Step 320).
  • the sequence of generating the areas in the encoding apparatus shown in FIG. 6 is merely one example of an operation of the encoding apparatus according to the present invention, and is not intended to limit the scope of the present invention.
  • the area generation sequence can be modified, or two or more areas can be simultaneously generated in parallel.
  • the output unit 130 constructs the generated file type area, media information area, and data area as one file, and outputs the encoded multimedia data (Step 330).
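  • A minimal sketch of this flow, assuming the size-prefixed box layout of the ISO base media file format cited among the references; the box names ("ftyp", "moov", "mdat") and helper functions are assumptions, not the patent's normative structure.

```python
import struct

def box(box_type: bytes, payload: bytes) -> bytes:
    """Serialise one area as a box: 32-bit size (header + payload) and type."""
    return struct.pack(">I4s", 8 + len(payload), box_type) + payload

def encode(media_info: bytes, media_data: bytes, path: str) -> None:
    # File type area: brand "mp42", minor version 0, one compatible brand.
    file_type_area = box(b"ftyp", b"mp42" + struct.pack(">I", 0) + b"mp42")
    media_info_area = box(b"moov", media_info)   # track areas, meta area, ...
    data_area = box(b"mdat", media_data)         # the media data areas
    with open(path, "wb") as out:                # output unit: one file
        out.write(file_type_area + media_info_area + data_area)
```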
  • FIG. 7 is a block diagram illustrating a construction of a decoding apparatus according to an exemplary embodiment of the present invention.
  • the decoding apparatus includes a data input unit 200, a timing information extractor 210, an animation effect information extractor 220, a media data extractor 230, and a reproducing unit 240. An operation of the decoding apparatus shown in FIG. 7 will be described with reference to FIG. 9.
  • FIG. 9 is a flowchart illustrating a decoding method according to an exemplary embodiment of the present invention.
  • the data input unit 200 receives multimedia data (Step 400).
  • the timing information extractor 210 extracts timing information on media data from the received multimedia data (Step 410). It is desirable that the timing information extractor 210 parses a media information area from the received multimedia data and then, extracts the timing information on the media data from a media area included in the media information area.
  • the media data extractor 230 extracts the media data to be reproduced depending on the extracted timing information from a data area (Step 420). It is desirable that the media data extractor 230 searches the data area for the media data, using size information and position information on the media data included in the media area.
  • the reproducing unit 240 sequentially reproduces the extracted media data using the extracted timing information, thereby constructing a slide show (Step 430).
  • the animation effect information extractor 220 parses the animation area, and extracts the animation effect information.
  • the reproducing unit 240 can reproduce image data included in an image data area, using the animation effect information.
  • the reproducing unit 240 reproduces audio data and text data, using the timing information extracted by the timing information extractor 210.
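  • A minimal sketch of the reproduction loop, assuming the timing information has already been extracted as (duration, position, size) entries; the function, parameter names, and the sleep-based scheduling are illustrative stand-ins for the reproducing unit.

```python
import time

def play_slide_show(fileobj, samples, render):
    """samples: iterable of (duration_s, position, size) tuples as they might
    be extracted from the media area; render: callback that stands in for
    the reproducing unit."""
    for duration_s, position, size in samples:
        fileobj.seek(position)        # locate the media data in the data area
        render(fileobj.read(size))    # reproduce this piece of media data
        time.sleep(duration_s)        # hold it for its reproduction duration
```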
  • the encoding/decoding method according to the present invention can be programmed for execution in a computer and stored in a computer readable recording medium.
  • the multimedia data having the data structure according to the present invention can be also stored in the computer readable recording medium.
  • the computer readable recording medium includes all kinds of storage units storing data readable by a computer system.
  • the computer readable recording medium is exemplified as a Read Only Memory (ROM), a Random Access Memory (RAM), a Compact-Disk Read Only Memory (CD-ROM), a magnetic tape, a floppy disk, and an optical data storage unit, and includes a unit realized in the form of a carrier wave (e.g., Internet transmission).
  • the computer readable recording medium can also be distributed over computer systems connected through a network, so that the computer readable code is stored and executed in a distributed manner.
  • Functional programs, codes, and code segments for realizing the encoding/decoding method can be easily inferred by programmers in the technological field of the present invention.
  • an encoding/decoding method and apparatus can be widely used for a multimedia player or a multimedia coding device for reproducing a plurality of media data, thereby reducing a time taken to process and transmit the multimedia data.

Landscapes

  • Engineering & Computer Science (AREA)
  • Databases & Information Systems (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Data Mining & Analysis (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Library & Information Science (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The present invention relates to a method and apparatus for encoding/decoding multimedia data. The encoding method includes generating a data area including a plurality of media data areas; generating a plurality of track areas corresponding to the plurality of media data areas, respectively; and generating an animation area including at least one of grouping information on an animation effect, opacity effect information, size information on an image to which the animation effect is applied, and geometrical transform information. According to the invention, the multimedia data encoding/decoding method and apparatus is capable of constructing a slide show using only a small amount of multimedia data. In this way, the time taken to process and transmit the multimedia data can be reduced.
EP06799203A 2005-10-13 2006-10-13 Procede et appareil de codage / decodage Withdrawn EP1941509A4 (fr)

Applications Claiming Priority (9)

Application Number Priority Date Filing Date Title
US72565505P 2005-10-13 2005-10-13
US72565205P 2005-10-13 2005-10-13
US72623005P 2005-10-14 2005-10-14
US75746306P 2006-01-10 2006-01-10
US78717306P 2006-03-30 2006-03-30
US78873606P 2006-04-04 2006-04-04
US78987606P 2006-04-07 2006-04-07
KR1020060066557A KR101212692B1 (ko) 2006-03-30 2006-07-14 미디어 재생 방법 및 장치와 이를 위한 미디어 파일 포맷
PCT/KR2006/004125 WO2007043829A1 (fr) 2005-10-13 2006-10-13 Procede et appareil de codage / decodage

Publications (2)

Publication Number Publication Date
EP1941509A1 true EP1941509A1 (fr) 2008-07-09
EP1941509A4 EP1941509A4 (fr) 2011-11-16

Family

ID=37943018

Family Applications (1)

Application Number Title Priority Date Filing Date
EP06799203A Withdrawn EP1941509A4 (fr) 2005-10-13 2006-10-13 Procede et appareil de codage / decodage

Country Status (2)

Country Link
EP (1) EP1941509A4 (fr)
WO (1) WO2007043829A1 (fr)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8275814B2 (en) * 2006-07-12 2012-09-25 Lg Electronics Inc. Method and apparatus for encoding/decoding signal

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO1999037072A2 (fr) * 1998-01-15 1999-07-22 Apple Computer, Inc. Procede et equipement de transmission de donnees
WO1999064944A2 (fr) * 1998-06-08 1999-12-16 Microsoft Corporation Compression geometrique par chronodependance
US6037983A (en) * 1996-11-08 2000-03-14 Hughes Electronics Corporation High quality reduced latency transmission of video objects
EP1018840A2 (fr) * 1998-12-08 2000-07-12 Canon Kabushiki Kaisha Récepteur digital et méthode
WO2001095632A2 (fr) * 2000-06-06 2001-12-13 General Instrument Corporation Estimation globale des mouvements en vue de la creation de lutins
US6369835B1 (en) * 1999-05-18 2002-04-09 Microsoft Corporation Method and system for generating a movie file from a slide show presentation
EP1302900A1 (fr) * 2000-05-31 2003-04-16 Sharp Kabushiki Kaisha Dispositif de montage d'animation, procede de montage d'animation, programme de montage d'animation, et support enregistre contenant un programme informatique de montage d'animation
EP1376406A2 (fr) * 2002-06-26 2004-01-02 Microsoft Corporation Un système et procédé pour créer des présentations interactives avec des composants multimedia
US20040220926A1 (en) * 2000-01-03 2004-11-04 Interactual Technologies, Inc., A California Cpr[P Personalization services for entities from multiple sources

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0692114B1 (fr) * 1993-03-31 2000-02-02 Object Technology Licensing Corp. Sequences de scripts basees sur le temps
JP2003249057A (ja) * 2002-02-26 2003-09-05 Toshiba Corp デジタル情報媒体を用いるエンハンスド・ナビゲーション・システム
EP1577795A3 (fr) * 2003-03-15 2006-08-30 Oculus Info Inc. Système et méthode pour visualiser des informations temporelles et spatiales liées comme représentation visuelle intégrée dans une interface utilisateur
JP2005276344A (ja) * 2004-03-25 2005-10-06 Toshiba Corp 情報記録媒体及び情報再生装置
JP2005332521A (ja) * 2004-05-21 2005-12-02 Toshiba Corp 情報記録媒体及び情報再生装置

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6037983A (en) * 1996-11-08 2000-03-14 Hughes Electronics Corporation High quality reduced latency transmission of video objects
WO1999037072A2 (fr) * 1998-01-15 1999-07-22 Apple Computer, Inc. Procede et equipement de transmission de donnees
WO1999064944A2 (fr) * 1998-06-08 1999-12-16 Microsoft Corporation Compression geometrique par chronodependance
EP1018840A2 (fr) * 1998-12-08 2000-07-12 Canon Kabushiki Kaisha Récepteur digital et méthode
US6369835B1 (en) * 1999-05-18 2002-04-09 Microsoft Corporation Method and system for generating a movie file from a slide show presentation
US20040220926A1 (en) * 2000-01-03 2004-11-04 Interactual Technologies, Inc., A California Cpr[P Personalization services for entities from multiple sources
EP1302900A1 (fr) * 2000-05-31 2003-04-16 Sharp Kabushiki Kaisha Dispositif de montage d'animation, procede de montage d'animation, programme de montage d'animation, et support enregistre contenant un programme informatique de montage d'animation
WO2001095632A2 (fr) * 2000-06-06 2001-12-13 General Instrument Corporation Estimation globale des mouvements en vue de la creation de lutins
EP1376406A2 (fr) * 2002-06-26 2004-01-02 Microsoft Corporation Un système et procédé pour créer des présentations interactives avec des composants multimedia

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
"INTERNATIONAL STANDARD ISO/IEC 14496-12 Information technology Coding of audio-visual objects Part 12: ISO base media file format", INTERNET CITATION, 1 January 2005 (2005-01-01), pages 1-84, XP007914375, Retrieved from the Internet: URL:http://www.iso.org/iso/iso_catalogue/c atalogue_ics/catalogue_detail_i cs.htm?csnumber=41828 [retrieved on 2010-08-11] *
APPLE COMPUTER ET AL: "QuickTime File Format", QUICKTIME FILE FORMAT, APPLE COMPUTER, INC , 1 March 2001 (2001-03-01), pages 1-274, XP002588828, Retrieved from the Internet: URL:http://developer.apple.com/standards/qtff-2001.pdf [retrieved on 2010-06-21] *
See also references of WO2007043829A1 *
SINGER D ET AL: "ISO/IEC 14496-1/PDAM7 USE OF AVC IN MPEG-4 SYSTEMS AND THE MP4 FILE FORMAT", INTERNATIONAL STANDARD ISO/IEC, XX, XX, 26 July 2002 (2002-07-26), pages I-VI,01, XP001074666, *

Also Published As

Publication number Publication date
EP1941509A4 (fr) 2011-11-16
WO2007043829A1 (fr) 2007-04-19

Similar Documents

Publication Publication Date Title
US8199826B2 (en) Method and apparatus for encoding/decoding
US8275814B2 (en) Method and apparatus for encoding/decoding signal
US20090160862A1 (en) Method and Apparatus for Encoding/Decoding
WO2007043829A1 (fr) Procede et appareil de codage / decodage
KR101397146B1 (ko) 멀티 미디어 데이터의 부호화/복호화 방법 및 장치
CN101313577A (zh) 编码/解码的方法和装置
KR101275555B1 (ko) 멀티 미디어 재생 방법 및 장치와 이를 위한 멀티 미디어파일 포맷

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20080411

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LI LT LU LV MC NL PL PT RO SE SI SK TR

RAP1 Party data changed (applicant data changed or rights of an application transferred)

Owner name: LG ELECTRONICS INC.

A4 Supplementary search report drawn up and despatched

Effective date: 20111014

RIC1 Information provided on ipc code assigned before grant

Ipc: H04N 7/24 20110101ALI20111010BHEP

Ipc: G06F 17/30 20060101ALI20111010BHEP

Ipc: G11B 27/034 20060101ALI20111010BHEP

Ipc: G11B 27/10 20060101AFI20111010BHEP

DAX Request for extension of the european patent (deleted)
17Q First examination report despatched

Effective date: 20130822

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20160503