AU5440301A - Information processing apparatus and method, program, and recorded medium - Google Patents


Info

Publication number
AU5440301A
AU5440301A (application AU54403/01A)
Authority
AU
Australia
Prior art keywords
stream
recording
information
stream data
reproducing
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
AU54403/01A
Other versions
AU779673B2 (en)
Inventor
Toshiya Hamada
Motoki Kato
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sony Corp
Original Assignee
Sony Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Corp
Publication of AU5440301A
Application granted
Publication of AU779673B2
Anticipated expiration
Status: Expired


Classifications

    • G PHYSICS
    • G11 INFORMATION STORAGE
    • G11B INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B27/00 Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B27/10 Indexing; Addressing; Timing or synchronising; Measuring tape travel
    • G11B27/34 Indicating arrangements
    • G PHYSICS
    • G11 INFORMATION STORAGE
    • G11B INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B20/00 Signal processing not specific to the method of recording or reproducing; Circuits therefor
    • G11B20/10 Digital recording or reproducing
    • G PHYSICS
    • G11 INFORMATION STORAGE
    • G11B INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B27/00 Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B27/02 Editing, e.g. varying the order of information signals recorded on, or reproduced from, record carriers
    • G11B27/031 Electronic editing of digitised analogue information signals, e.g. audio or video signals
    • G11B27/034 Electronic editing of digitised analogue information signals, e.g. audio or video signals on discs
    • G PHYSICS
    • G11 INFORMATION STORAGE
    • G11B INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B27/00 Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B27/02 Editing, e.g. varying the order of information signals recorded on, or reproduced from, record carriers
    • G11B27/031 Electronic editing of digitised analogue information signals, e.g. audio or video signals
    • G11B27/036 Insert-editing
    • G PHYSICS
    • G11 INFORMATION STORAGE
    • G11B INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B27/00 Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B27/10 Indexing; Addressing; Timing or synchronising; Measuring tape travel
    • G11B27/102 Programmed access in sequence to addressed parts of tracks of operating record carriers
    • G11B27/105 Programmed access in sequence to addressed parts of tracks of operating record carriers of operating discs
    • G PHYSICS
    • G11 INFORMATION STORAGE
    • G11B INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B27/00 Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B27/10 Indexing; Addressing; Timing or synchronising; Measuring tape travel
    • G11B27/19 Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier
    • G11B27/28 Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier by using information signals recorded by the same method as the main recording
    • G11B27/30 Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier by using information signals recorded by the same method as the main recording on the same track as the main recording
    • G11B27/3027 Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier by using information signals recorded by the same method as the main recording on the same track as the main recording used signal is digitally coded
    • G PHYSICS
    • G11 INFORMATION STORAGE
    • G11B INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B27/00 Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B27/10 Indexing; Addressing; Timing or synchronising; Measuring tape travel
    • G11B27/19 Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier
    • G11B27/28 Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier by using information signals recorded by the same method as the main recording
    • G11B27/32 Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier by using information signals recorded by the same method as the main recording on separate auxiliary tracks of the same or an auxiliary record carrier
    • G11B27/327 Table of contents
    • G11B27/329 Table of contents on a disc [VTOC]
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00 Details of colour television systems
    • H04N9/79 Processing of colour television signals in connection with recording
    • H04N9/80 Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback
    • H04N9/804 Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback involving pulse code modulation of the colour picture signal components
    • H04N9/8042 Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback involving pulse code modulation of the colour picture signal components involving data reduction
    • G PHYSICS
    • G11 INFORMATION STORAGE
    • G11B INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B20/00 Signal processing not specific to the method of recording or reproducing; Circuits therefor
    • G11B20/00086 Circuits for prevention of unauthorised reproduction or copying, e.g. piracy
    • G PHYSICS
    • G11 INFORMATION STORAGE
    • G11B INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B2220/00 Record carriers by type
    • G11B2220/20 Disc-shaped record carriers
    • G PHYSICS
    • G11 INFORMATION STORAGE
    • G11B INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B2220/00 Record carriers by type
    • G11B2220/20 Disc-shaped record carriers
    • G11B2220/25 Disc-shaped record carriers characterised in that the disc is based on a specific recording technology
    • G11B2220/2537 Optical discs
    • G11B2220/2541 Blu-ray discs; Blue laser DVR discs
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00 Details of television systems
    • H04N5/76 Television signal recording
    • H04N5/84 Television signal recording using optical recording
    • H04N5/85 Television signal recording using optical recording on discs or drums
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00 Details of colour television systems
    • H04N9/79 Processing of colour television signals in connection with recording
    • H04N9/7921 Processing of colour television signals in connection with recording for more than one processing mode
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00 Details of colour television systems
    • H04N9/79 Processing of colour television signals in connection with recording
    • H04N9/80 Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback
    • H04N9/82 Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback the individual colour picture signal components being recorded simultaneously only
    • H04N9/8205 Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback the individual colour picture signal components being recorded simultaneously only involving the multiplexing of an additional signal and the colour video signal

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Television Signal Processing For Recording (AREA)
  • Signal Processing For Digital Recording And Reproducing (AREA)
  • Management Or Editing Of Information On Record Carriers (AREA)
  • Indexing, Searching, Synchronizing, And The Amount Of Synchronization Travel Of Record Carriers (AREA)

Description

DESCRIPTION

Information Processing Method and Apparatus, Program and Recording Medium

Technical Field

This invention relates to an information processing method and apparatus, a program and a recording medium. More particularly, it relates to an information processing method and apparatus, a program and a recording medium for recording a file including information presented for explanation on the GUI, information on the main reproducing path, information on the subsidiary reproducing path, connection information between the respective reproducing domains making up the main reproducing path, and information on bookmarks or resume points with which the user can mark a desired scene.

Background Art

Recently, a variety of optical discs have been proposed as recording media removable from a recording apparatus. These recordable optical discs offer a large capacity of several gigabytes and are thought to be promising as media for recording AV (audio-visual) signals. Among the sources of digital AV signals to be recorded on such recordable optical discs are, for example, CS digital satellite broadcasts and BS digital broadcasts; terrestrial digital television broadcasting has also been proposed for future use. It should be noted that the digital video signals furnished from these sources are routinely compressed in accordance with the MPEG (Moving Picture Experts Group) 2 system.

A recording rate proper to the apparatus is set for the recording apparatus. If digital video signals derived from a digital broadcast are recorded by a conventional video storage medium for domestic use in accordance with an analog recording system, the digital video signals are first decoded and subsequently band-limited for recording.
Alternatively, with a digital recording system, exemplified by MPEG1 video, MPEG2 video or the DV system, the digital video signals are once decoded and subsequently re-encoded in accordance with the recording rate and encoding system proper to the apparatus. However, with such a recording method, in which the furnished bitstream is decoded and then bandwidth-limited or re-encoded prior to recording, the picture quality necessarily deteriorates. If, in recording compressed digital signals, the transmission rate of the input digital signals does not exceed the recording rate of the recording and/or reproducing apparatus, the method in which the furnished bitstream is recorded directly, without decoding or re-encoding, suffers the least picture quality deterioration. However, if the transmission rate of the compressed digital signals exceeds the recording rate of the disc serving as the recording medium, it is indeed necessary to first decode the digital signals in the recording and/or reproducing apparatus and re-encode them for recording so that the transmission rate does not exceed the upper limit of the recording rate of the disc.

If the signals are transmitted in accordance with a variable-rate system, in which the bitrate of the input digital signals increases or decreases with time, the capacity of the recording medium is exploited less wastefully by a disc recording system, in which data can be stored once in a buffer and recorded in bursts, than by a recording system with a rotary head of fixed rpm and hence a fixed recording rate.
In the near future, when digital broadcasting becomes the mainstream, there will likely be strong demand for a recording and/or reproducing apparatus in which broadcast signals are recorded as digital signals, without decoding or re-encoding, as with a data streamer, and in which a disc is used as the recording medium.

Meanwhile, in recording AV stream data on a recording medium with the above-described recording apparatus, the AV stream may be analyzed so that the positions of I-pictures are detected and recorded, enabling fast replay by direct access to the I-pictures. Alternatively, the AV stream may be recorded directly, without analysis. In that case, the conventional practice has been to provide separate dedicated application programs, by means of which AV streams are recorded on the recording medium in different formats. The result is that the development of an application program tends to be costly and time-consuming. Moreover, because the AV streams recorded by the respective application programs differ in format from one another, they cannot be reproduced on the same apparatus, for lack of compatibility. In addition, the conventional recording apparatus has the drawback that audio data, for example, are difficult to post-record.

Disclosure of the Invention

It is therefore an object of the present invention to provide an arrangement in which an AV stream for which high-speed recording is possible and an AV stream for which it is not can be supervised in common. It is another object of the present invention to provide an arrangement in which post-recording is possible.
An information processing apparatus for recording AV stream data on a recording medium includes first generating means for generating a first table, describing the relation of correspondence between a presentation time stamp and an address in the AV stream data of the corresponding access unit, or a second table, describing the relation of correspondence between an arrival time stamp, derived from the arrival time point of a transport packet, and an address in the AV stream data of the corresponding transport packet; selection means for selecting one of the first table and the second table depending on the recording method; and first recording means for recording the selected table on the recording medium along with the AV stream data.

The first table may be an EP_map and the second table may be a TU_map. The selection means may select the second table in the case of non-cognizant recording, and may select the first table in the case of self-encoding recording or cognizant recording.

The information processing apparatus may further include second generating means for generating reproduction specifying information, which specifies the reproduction of the AV stream data, and second recording means for recording the reproduction specifying information generated by the second generating means on the recording medium. The reproduction specifying information may include identification information specifying whether the time information of the reproduction domain of the AV stream data is to be represented on a presentation time basis or on an arrival time basis. If the first table is recorded along with the AV stream data, the reproduction specifying information expresses the time information of the reproduction domain of the AV stream data on the presentation time basis.
If the second table is recorded along with the AV stream data, the reproduction specifying information expresses the time information of the reproduction domain of the AV stream data on the arrival time basis.
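The selection rule described above can be sketched in Python. This is a minimal illustration only: the enum values and function name are invented for the sketch and are not terminology from the patent itself; only the mapping of recording methods to tables follows the text.

```python
from enum import Enum

class RecordingMethod(Enum):
    """Recording methods named in the text (member names are illustrative)."""
    SELF_ENCODING = "self encoding"   # the apparatus encodes the stream itself
    COGNIZANT = "cognizant"           # the transport stream is analysed while recording
    NON_COGNIZANT = "non-cognizant"   # the transport stream is recorded as received

def select_table(method: RecordingMethod) -> str:
    """Choose which characteristic-point table is recorded with the AV stream."""
    if method is RecordingMethod.NON_COGNIZANT:
        # The stream is not analysed, so access-unit positions and PTS values
        # are unknown; only transport-packet arrival times are available.
        return "TU_map"
    # Self-encoding and cognizant recording both know where the access units
    # lie, so the PTS-to-address table can be generated.
    return "EP_map"
```

With this rule, the reproduction specifying information can then express reproduction-domain times on the presentation time basis (EP_map recorded) or the arrival time basis (TU_map recorded), as the text states.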
The present invention also provides an information processing method for recording AV stream data on a recording medium, including a generating step of generating a first table, describing the relation of correspondence between a presentation time stamp and an address in the AV stream data of the corresponding access unit, or a second table, describing the relation of correspondence between an arrival time stamp, derived from the arrival time point of a transport packet, and an address in the AV stream data of the corresponding transport packet; a selecting step of selecting one of the first table and the second table depending on the recording method; and a recording step of recording the selected table on the recording medium along with the AV stream data.

The present invention also provides a recording medium having recorded thereon a computer-readable program for an information processing apparatus recording AV stream data on a recording medium, in which the program includes a generating step of generating a first table, describing the relation of correspondence between a presentation time stamp and an address in the AV stream data of the corresponding access unit, or a second table, describing the relation of correspondence between an arrival time stamp, derived from the arrival time point of a transport packet, and an address in the AV stream data of the corresponding transport packet; a selecting step of selecting one of the first table and the second table depending on the recording method; and a recording step of recording the selected table on the recording medium along with the AV stream data.
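For illustration only, a player could use the first table roughly as follows: each entry pairs a presentation time stamp with the address of the corresponding access unit, so random access to a given time becomes a binary search for the last entry point at or before that time. The entry values and names below are invented for the sketch, not the patent's syntax.

```python
import bisect

# Hypothetical first-table entries: (presentation time stamp, stream address
# of the corresponding access unit), sorted by time stamp.
ep_map = [(0, 0), (90000, 1200), (180000, 2600), (270000, 4100)]

def address_for_pts(pts: int) -> int:
    """Return the stream address of the entry point at or before `pts`."""
    stamps = [stamp for stamp, _ in ep_map]
    i = bisect.bisect_right(stamps, pts) - 1
    if i < 0:
        raise ValueError("PTS precedes the first entry point")
    return ep_map[i][1]
```

A seek to presentation time 100000 in this toy table would land on the 90000 entry and begin reading at address 1200.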
The present invention also provides a program for having a computer, controlling an information processing apparatus recording AV stream data on a recording medium, execute a generating step of generating a first table, describing the relation of correspondence between a presentation time stamp and an address in the AV stream data of the corresponding access unit, or a second table, describing the relation of correspondence between an arrival time stamp, derived from the arrival time point of a transport packet, and an address in the AV stream data of the corresponding transport packet; a selecting step of selecting one of the first table and the second table depending on the recording method; and a recording step of recording the selected table on the recording medium along with the AV stream data.

The present invention also provides an information processing apparatus for reproducing AV stream data from a recording medium, including reproducing means for reproducing, from the recording medium, one of a first table, describing the relation of correspondence between a presentation time stamp and an address in the AV stream data of the corresponding access unit, and a second table, describing the relation of correspondence between an arrival time stamp, derived from the arrival time point of a transport packet, and an address in the AV stream data of the corresponding transport packet, the recording medium having the first table or the second table recorded thereon depending on the recording method, and control means for controlling the outputting of the AV stream data based on the reproduced table.
The present invention also provides an information processing method for reproducing AV stream data from a recording medium, including a reproducing step of reproducing, from the recording medium, one of a first table, describing the relation of correspondence between a presentation time stamp and an address in the AV stream data of the corresponding access unit, and a second table, describing the relation of correspondence between an arrival time stamp, derived from the arrival time point of a transport packet, and an address in the AV stream data of the corresponding transport packet, the recording medium having the first table or the second table recorded thereon depending on the recording method, and a controlling step of controlling the outputting of the AV stream data based on the reproduced table.

The present invention also provides a recording medium having recorded thereon a computer-readable program for an information processing apparatus reproducing AV stream data from a recording medium, in which the program includes a reproducing step of reproducing, from the recording medium, one of a first table, describing the relation of correspondence between a presentation time stamp and an address in the AV stream data of the corresponding access unit, and a second table, describing the relation of correspondence between an arrival time stamp, derived from the arrival time point of a transport packet, and an address in the AV stream data of the corresponding transport packet, the recording medium having the first table or the second table recorded thereon depending on the recording method, and a controlling step of controlling the outputting of the AV stream data based on the reproduced table.
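The second table can be sketched analogously, keyed on arrival time stamps derived from when each transport packet reached the recorder rather than on presentation times. Again, the data, field layout, and names here are assumptions for illustration.

```python
import bisect

# Hypothetical second-table entries: (arrival time stamp of the first packet
# of a time unit, source packet number of that packet), sorted by time stamp.
tu_map = [(0, 0), (1000, 300), (2000, 650), (3000, 980)]

def address_for_arrival_time(ats: int) -> int:
    """Return the packet address of the time unit containing `ats`."""
    starts = [start for start, _ in tu_map]
    i = bisect.bisect_right(starts, ats) - 1
    if i < 0:
        raise ValueError("arrival time precedes the start of the recording")
    return tu_map[i][1]
```

This is what makes time-based access possible even for a non-cognizant recording, where nothing about the stream's internal presentation times is known.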
The present invention also provides a program for having a computer, controlling an information processing apparatus reproducing AV stream data from a recording medium, execute a reproducing step of reproducing, from the recording medium, one of a first table, describing the relation of correspondence between a presentation time stamp and an address in the AV stream data of the corresponding access unit, and a second table, describing the relation of correspondence between an arrival time stamp, derived from the arrival time point of a transport packet, and an address in the AV stream data of the corresponding transport packet, the recording medium having the first table or the second table recorded thereon depending on the recording method, and a controlling step of controlling the outputting of the AV stream data based on the reproduced table.

The present invention also provides a recording medium having recorded thereon one of a first table, describing the relation of correspondence between a presentation time stamp and an address in the AV stream data of the corresponding access unit, and a second table, describing the relation of correspondence between an arrival time stamp, derived from the arrival time point of a transport packet, and an address in the AV stream data of the corresponding transport packet, depending on the recording method.

The present invention also provides an information processing apparatus for recording AV stream data on a recording medium, including generating means for generating reproduction specifying information made up of first information indicating a main reproducing path and second information indicating a subsidiary reproducing path, and recording means for recording the AV stream data and the reproduction specifying information on the recording medium. The subsidiary reproducing path may be a path for the post-recording of audio data. The first information may be the main path and the second information may be the sub path.
The second information may include type information indicating the type of the subsidiary reproducing path, a filename of the AV stream referenced by the subsidiary reproducing path, in- and out-points of the AV stream of the subsidiary reproducing path, and the time on the main path at which the in-point of the subsidiary reproducing path starts, in synchronization with the time axis of the main path.

The present invention also provides an information processing method for an information processing apparatus recording AV stream data on a recording medium, including a generating step of generating reproduction specifying information made up of first information indicating a main reproducing path and second information specifying a subsidiary reproducing path reproduced in synchronism with the main reproducing path, and a recording step of recording the AV stream data and the reproduction specifying information on the recording medium.

The present invention also provides a recording medium having recorded thereon a computer-readable program for an information processing apparatus recording AV stream data on a recording medium, in which the program includes a generating step of generating reproduction specifying information made up of first information indicating a main reproducing path and second information specifying a subsidiary reproducing path reproduced in synchronism with the main reproducing path, and a recording step of recording the AV stream data and the reproduction specifying information on the recording medium.
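The fields the text attributes to the second information can be gathered into one record. All field names, types, and values below are assumptions made for this sketch; they are not the patent's actual syntax.

```python
from dataclasses import dataclass

@dataclass
class SubPathInfo:
    """Illustrative record for the second information (subsidiary path)."""
    sub_path_type: int               # type of the subsidiary reproducing path
    clip_file_name: str              # AV stream file the sub path references
    sub_path_in_time: int            # in-point within the sub path's AV stream
    sub_path_out_time: int           # out-point within the sub path's AV stream
    sync_start_time_on_main: int     # time on the main path's time axis at
                                     # which the sub path's in-point starts

# Hypothetical example: audio post-recorded over part of the main path.
post_record_audio = SubPathInfo(
    sub_path_type=1,
    clip_file_name="02000.clip",
    sub_path_in_time=0,
    sub_path_out_time=90000,
    sync_start_time_on_main=45000,
)
```

The synchronisation field is what ties the two paths together: a player reaching time 45000 on the main path would begin reproducing the sub path's stream from its in-point, in synchronism with the main path.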
The present invention also provides a program for having a computer, controlling an information processing apparatus recording AV stream data on a recording medium, execute a generating step of generating reproduction specifying information made up of first information indicating a main reproducing path and second information specifying a subsidiary reproducing path reproduced in synchronism with the main reproducing path, and a recording step of recording the AV stream data and the reproduction specifying information on the recording medium.

The present invention also provides an information processing apparatus for reproducing AV stream data from a recording medium, including reproducing means for reproducing, from the recording medium, reproduction specifying information made up of first information indicating a main reproducing path and second information specifying a subsidiary reproducing path reproduced in synchronism with the main reproducing path, and control means for controlling the outputting of the AV stream data based on the reproduced reproduction specifying information.

The present invention also includes an information processing method for reproducing AV stream data from a recording medium, including a reproducing step of reproducing, from the recording medium, reproduction specifying information made up of first information indicating a main reproducing path and second information specifying a subsidiary reproducing path reproduced in synchronism with the main reproducing path, and a controlling step of controlling the outputting of the AV stream data based on the reproduced reproduction specifying information.
The present invention also provides a recording medium having recorded thereon a computer-readable program for an information processing apparatus reproducing AV stream data from a recording medium, in which the program includes a reproducing step of reproducing, from the recording medium, reproduction specifying information made up of first information indicating a main reproducing path and second information specifying a subsidiary reproducing path reproduced in synchronism with the main reproducing path, and a controlling step of controlling the outputting of the AV stream data based on the reproduced reproduction specifying information.

The present invention also provides a program for having a computer, controlling an information processing apparatus reproducing AV stream data from a recording medium, execute a reproducing step of reproducing, from the recording medium, the reproduction specifying information made up of first information indicating a main reproducing path and second information specifying a subsidiary reproducing path reproduced in synchronism with the main reproducing path, and a controlling step of controlling the outputting of the AV stream data based on the reproduced reproduction specifying information.

The recording medium according to the present invention has recorded thereon reproduction specifying information made up of first information indicating a main reproducing path and second information specifying a subsidiary reproducing path reproduced in synchronism with the main reproducing path.
In the information recording and/or reproducing method and apparatus, the program for a recording medium, the program, and the recording medium, according to the present invention, one of a first table, describing the relation of correspondence between a presentation time stamp and an address in the AV stream data of the corresponding access unit, and a second table, describing the relation of correspondence between an arrival time stamp, derived from the arrival time point of a transport packet, and an address in the AV stream data of the corresponding transport packet, is recorded depending on the recording method.

In the information processing method and apparatus, the program for a recording medium, and the program, according to the present invention, one of a first table, describing the relation of correspondence between a presentation time stamp and an address in the AV stream data of the corresponding access unit, and a second table, describing the relation of correspondence between an arrival time stamp, derived from the arrival time point of a transport packet, and an address in the AV stream data of the corresponding transport packet, is reproduced from the recording medium, which has the first table or the second table recorded thereon depending on the recording method, and the outputting is controlled accordingly.

In the information processing method and apparatus, the program for a recording medium, the program, and the second recording medium, according to the present invention, the reproduction specifying information, comprised of the first information indicating the main reproducing path and the second information indicating the subsidiary reproducing path reproduced in synchronism with the main reproducing path, is recorded.
In the information processing method and apparatus, the program for a recording medium, and the program, according to the present invention, the reproduction specifying information, comprised of the first information indicating the main reproducing path and the second information indicating the subsidiary reproducing path reproduced in synchronism with the main reproducing path, is reproduced, and the outputting is controlled accordingly.

Other objects, features and advantages of the present invention will become more apparent from reading the description of the embodiments of the present invention with reference to the drawings.

Brief Description of the Drawings

Fig.1 shows a configuration of an embodiment of a recording and/or reproducing apparatus according to the present invention.
Fig.2 illustrates the data format of data recorded on a recording medium by a recording and/or reproducing apparatus 1.
Fig.3 illustrates the Real PlayList and the Virtual PlayList.
Figs.4A, 4B and 4C illustrate the creation of the Real PlayList.
Figs.5A, 5B and 5C illustrate deletion of the Real PlayList.
Figs.6A and 6B illustrate assemble editing.
Fig.7 illustrates provision of a sub path in the Virtual PlayList.
Fig.8 illustrates the changing of the playback sequence of the PlayList.
Fig.9 illustrates a mark on the PlayList and a mark on the Clip.
Fig.10 illustrates a menu thumbnail.
Fig.11 illustrates a mark added to the PlayList.
Fig.12 illustrates a mark added to the Clip.
Fig.13 illustrates the relation between the PlayList, the Clip and the thumbnail file.
Fig.14 illustrates a directory structure.
Fig.15 illustrates a syntax of info.dvr.
Fig.16 shows a syntax of DVRVolume.
Fig.17 shows a syntax of ResumeVolume.
Fig.18 shows a syntax of UIAppInfoVolume.
Fig.19 shows a table of character set values.
Fig.20 shows a syntax of TableOfPlayList.
Fig.21 shows another syntax of TableOfPlayList.
Fig.22 shows a syntax of MakersPrivateData.
Fig.23 shows a syntax of xxxx.rpls and yyyy.vpls.
Figs.24A to 24C illustrate the PlayList.
Fig.25 shows a syntax of PlayList.
Fig.26 shows a table of PlayList_type.
Fig.27 shows a syntax of UIAppInfoPlayList.
Figs.28A to 28C illustrate flags in the UIAppInfoPlayList syntax shown in Fig.27.
Fig.29 illustrates a PlayItem.
Fig.30 illustrates a PlayItem.
Fig.31 illustrates a PlayItem.
Fig.32 shows a syntax of the PlayItem.
Fig.33 illustrates IN-time.
Fig.34 illustrates OUT-time.
Fig.35 shows a table of ConnectionCondition.
Figs.36A to 36D illustrate ConnectionCondition.
Fig.37 illustrates BridgeSequenceInfo.
Fig.38 shows a syntax of BridgeSequenceInfo.
Fig.39 illustrates SubPlayItem.
Fig.40 shows a syntax of SubPlayItem.
Fig.41 shows a table of Mark_type.
Fig.42 shows a syntax of PlayListMark.
Fig.43 shows a table of Mark_type.
Fig.44 illustrates Mark_time_stamp.
Fig.45 shows a syntax of zzzzz.clip.
Fig.46 shows a syntax of ClipInfo.
Fig.47 shows a table of Clip_stream_type.
Fig.48 illustrates offset_SPN.
Fig.49 illustrates offset_SPN.
Figs.50A, 50B illustrate the STC domain.
Fig.51 illustrates STCInfo.
Fig.52 shows a syntax of STCInfo.
Fig.53 illustrates ProgramInfo.
Fig.54 shows a syntax of ProgramInfo.
Fig.55 shows a syntax of VideoCodingInfo.
Fig.56 shows a table of video_format.
Fig.57 shows a table of frame_rate.
Fig.58 shows a table of display_aspect_ratio.
Fig.59 shows a syntax of AudioCodingInfo.
Fig.60 shows a table of audio_coding.
Fig.61 shows a table of audio_component_type.
Fig.62 shows a table of sampling_frequency.
Fig.63 illustrates CPI.
Fig.64 illustrates CPI.
Fig.65 shows a syntax of CPI.
Fig.66 shows a table of CPI_type.
Fig.67 illustrates the video EP map.
Fig.68 illustrates the EP map.
Fig.69 illustrates the EP map.
Fig.70 shows a syntax of the EP map.
Fig.71 shows a table of EP_type values.
Fig.72 shows a syntax of EP_map_for_one_stream_PID.
Fig.73 illustrates the TU map.
Fig.74 shows a syntax of the TU map.
Fig.75 shows a syntax of ClipMark.
Fig.76 shows a table of Mark_type.
Fig.77 shows a table of Mark_time_stamp.
Fig.78 shows a syntax of menu.thmb and mark.thmb.
Fig.79 shows the syntax of a thumbnail.
Fig.80 shows a table of thumbnail_picture_format.
Figs.81A and 81B illustrate tn_block.
Fig.82 illustrates a structure of a transport stream of DVR MPEG-2.
Fig.83 shows a recorder model of a transport stream of DVR MPEG-2.
Fig.84 shows a player model of a transport stream of DVR MPEG-2.
Fig.85 shows the syntax of a source packet.
Fig.86 shows the syntax of TP_extra_header.
Fig.87 shows a table of a copy permission indicator.
Fig.88 illustrates seamless connection.
Fig.89 illustrates seamless connection.
Fig.90 illustrates seamless connection.
Fig.91 illustrates seamless connection.
Fig.92 illustrates seamless connection.
Fig.93 illustrates audio overlap.
Fig.94 illustrates seamless connection employing a BridgeSequence.
Fig.95 illustrates seamless connection not employing a BridgeSequence.
Fig.96 shows a DVR STD model.
Fig.97 is a timing chart for decoding and display.
Fig.98 shows the syntax of a PlayList file.
Fig.99 shows the syntax of UIAppInfoPlayList in the PlayList file of Fig.98.
Fig.100 shows the syntax of PlayList() in the PlayList file of Fig.98.
Fig.101 shows the syntax of SubPlayItem.
Fig.102 is a flowchart for illustrating the method for forming the Real PlayList.
Fig.103 is a flowchart for illustrating the method for forming the Virtual PlayList.
Fig.104 is a flowchart for illustrating the method for reproducing the PlayList.
Fig.105 is a flowchart for illustrating the method for reproducing a sub path of the PlayList.
Fig.106 is a flowchart for illustrating the method for forming PlayListMark.
Fig.107 is a flowchart for illustrating the locating reproducing method employing PlayListMark.
Fig.108 illustrates a medium.

Best Mode for Carrying out the Invention

Referring to the drawings, an embodiment of the present invention will be explained in detail. Fig.1 shows a typical inner structure of a recording and/or reproducing apparatus 1 embodying the present invention. First, the structure of a recording unit 2, configured for recording signals input from outside, is explained. The recording and/or reproducing apparatus 1 is configured for being fed with and recording analog or digital data.

Analog video signals and analog audio signals are fed to terminals 11, 12, respectively. The video signals, input to the terminal 11, are output to an analysis unit 14 and to an AV encoder 15. The audio signals, input to the terminal 12, are likewise output to the analysis unit 14 and to the AV encoder 15. The analysis unit 14 extracts feature points, such as scene changes, from the input video and audio signals.

The AV encoder 15 encodes the input video and audio signals and outputs an encoded video stream (V), an encoded audio stream (A) and the system information (S), such as AV synchronization, to a multiplexer 16. The encoded video stream is a video stream encoded e.g., in accordance with the MPEG (Moving Picture Experts Group) 2 system, whilst the encoded audio stream is e.g., an audio stream encoded in accordance with the MPEG1 system or an audio stream encoded in accordance with the Dolby AC3 (trademark) system. The multiplexer 16 multiplexes the input video and audio streams, based on the input system information, to output a multiplexed stream through a switch 17 to a multiplexed stream analysis unit 18 and to a source packetizer 19. The multiplexed stream is e.g., an MPEG-2 transport stream or an MPEG-2 program stream.
The source packetizer 19 encodes the input multiplexed stream into an AV stream composed of source packets in accordance with an application format of a recording medium 100 on which to record the stream. The AV stream is processed in an ECC (error correction and coding) unit 20 and a modulation unit 21, with appendage of ECC codes and with modulation, before being output to a write unit 22, which then writes (records) an AV stream file based on control signals output by a controller 23. The transport stream, such as a digital television broadcast, input from a digital interface or a digital television tuner, is input to a terminal 13. There are two recording systems for recording the transport stream input to the terminal 13, one being a transparent recording system and the other being a system in which recording is preceded by re-encoding aimed at lowering e.g., the recording bit rate. The recording system command information is input from a terminal 24, as a user interface, to the controller 23.
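As an illustrative sketch of the packetizing step just described: each 188-byte transport packet is prepended with a 4-byte TP_extra_header, yielding a 192-byte source packet. The exact bit layout below (a 2-bit copy permission indicator followed by a 30-bit arrival time stamp) is an assumption modeled on the source packet and TP_extra_header syntaxes of Figs.85 to 87, described later, not a normative definition.

```python
import struct

TS_PACKET_SIZE = 188       # one MPEG-2 transport packet
SOURCE_PACKET_SIZE = 192   # TP_extra_header (4 bytes) + transport packet

def make_source_packet(ts_packet: bytes, arrival_time_stamp: int,
                       copy_permission_indicator: int = 0) -> bytes:
    """Prepend a TP_extra_header to one transport packet.

    Assumed header layout: 2-bit copy_permission_indicator followed by a
    30-bit arrival_time_stamp, packed big-endian into 4 bytes.
    """
    if len(ts_packet) != TS_PACKET_SIZE:
        raise ValueError("transport packet must be 188 bytes")
    header = ((copy_permission_indicator & 0x3) << 30) \
             | (arrival_time_stamp & 0x3FFFFFFF)
    return struct.pack(">I", header) + ts_packet
```

The arrival time stamp recorded here is the same quantity from which the TU map, discussed later, derives its time-to-address relation.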
In the transparent recording of the input transport stream, a transport stream, input to the terminal 13, is output through the switch 17 to the multiplexed stream analysis unit 18 and to the source packetizer 19. The ensuing processing of recording an AV stream on a recording medium is the same as that of encoding and recording analog input audio and video signals, as described above, and hence is not explained here for simplicity. If the input transport stream is re-encoded and subsequently recorded, the transport stream, input to the terminal 13, is fed to a demultiplexer 26, which demultiplexes the input transport stream to extract a video stream (V), an audio stream (A) and the system information (S). Of the streams (information), as extracted by the demultiplexer 26, the video stream is output to an AV decoder 27, whilst the audio stream and the system information are output to the multiplexer 16. The AV decoder 27 decodes the input video stream to output the reproduced video signal to the AV encoder 15, which re-encodes it. The audio stream and the system information, output from the demultiplexer 26 and input to the multiplexer 16, and the video stream, output by the AV encoder 15, are multiplexed, based on the input system information, and output to the multiplexed stream analysis unit 18 and to the source packetizer 19 through the switch 17, as a multiplexed stream. The ensuing processing of recording an AV stream on a recording medium is the same as that of encoding and recording analog input audio and video signals, as described above, and hence is not explained here for simplicity.
The recording and/or reproducing apparatus 1 of the present embodiment records a file of the AV stream on the recording medium 100, while also recording the application database information which accounts for the file. The input information to the controller 23 is the feature information for the moving picture from the analysis unit 14, the feature information of the AV stream from the multiplexed stream analysis unit 18 and the user command information input at the terminal 24. The feature information of the moving picture, supplied from the analysis unit 14, is generated by the analysis unit 14 when the AV encoder 15 encodes video signals. The analysis unit 14 analyzes the contents of the input video and audio signals to generate the information pertinent to the pictures characteristic of the input moving picture signals (clip mark). This information is the information indicating a picture of characteristic clip mark points, such as program start points, scene change points, commercial (CM) start and end points, or title or telop (superimposed text) in input video signals, and also includes a thumbnail of the picture and the information pertinent to stereo/monaural switching points and muted portions of audio signals. The above picture indicating information is fed through the controller 23 to the multiplexer 16. When multiplexing an encoded picture specified as a clip mark by the controller 23, the multiplexer 16 returns the information for specifying the encoded picture on the AV stream to the controller 23. Specifically, this information is the PTS (presentation time stamp) of a picture or the address information on the AV stream of an encoded version of the picture. The controller 23 stores the sorts of the feature pictures and the information for specifying the encoded pictures on the AV stream in association with each other.
The feature information of the AV stream from the multiplexed stream analysis unit 18 is the information pertinent to the encoding information of the AV stream to be recorded, and is generated by the multiplexed stream analysis unit 18. For example, the feature information includes the time stamp and address information of the I-pictures in the AV stream, discontinuous point information of system time clocks, encoding parameters of the AV stream and change point information of the encoding parameters in the AV stream. When transparently recording the transport stream, input from the terminal 13, the multiplexed stream analysis unit 18 detects the pictures of the aforementioned clip marks from the input transport stream, and generates the information for specifying a picture designated by the clip mark and its type. The user designation information from the terminal 24 is the information specifying a playback domain, designated by the user, character letters for explaining the contents of the playback domain, or the information such as bookmarks or resuming points set by the user for his or her favorite scenes. Based on the aforementioned input information, the controller 23 creates a database of the AV stream (Clip), a database of a group (PlayList) of playback domains (PlayItem) of the AV stream, management information of the recorded contents of the recording medium 100 (info.dvr) and the information on thumbnail pictures. Similarly to the AV stream, the application database information, constructed from the above information, is processed in the ECC unit 20 and the modulation unit 21 and input to the write unit 22, which then records a database file on the recording medium 100. The above-described application database information will be explained subsequently in detail.
When the AV stream file recorded on the recording medium 100 (a file of picture data and speech data) and the application database information, thus recorded on the recording medium 100, are reproduced by a reproducing unit 3, the controller 23 first commands a readout unit 28 to read out the application database information from the recording medium 100. The readout unit 28 reads out the application database information from the recording medium 100 and sends it, through demodulation and error correction processing by a demodulating unit 29 and an ECC decoder 30, to the controller 23. Based on the application database information, the controller 23 outputs a list of PlayLists recorded on the recording medium 100 to the user interface of the terminal 24. The user selects the PlayList, desired to be reproduced, from the list of PlayLists. The information pertinent to the PlayList, specified to be reproduced, is input to the controller 23. The controller 23 commands the readout unit 28 to read out the AV stream file necessary in reproducing the PlayList. In accordance with the command, the readout unit 28 reads out the corresponding AV stream from the recording medium 100 to output the read-out AV stream to the demodulating unit 29. The AV stream, thus input to the demodulating unit 29, is demodulated by preset processing and output, through the processing by the ECC decoder 30, to a source depacketizer 31. The source depacketizer 31 converts the AV stream of the application format, read out from the recording medium 100 and processed in a preset fashion, into a stream processable by the demultiplexer 26.
The demultiplexer 26 outputs the video stream (V), the audio stream (A) and the system information (S), such as AV synchronization, forming the playback domain (PlayItem) of the AV stream specified by the controller 23, to the AV decoder 27. The AV decoder 27 decodes the video stream and the audio stream to output the playback video signal and the playback audio signal to associated terminals 32, 33, respectively. If fed from the terminal 24, as a user interface, with the information instructing random access playback or special playback, the controller 23 determines the readout position of the AV stream from the recording medium 100, based on the contents of the database (Clip) of the AV stream, to command the readout unit 28 to read out the AV stream. If the PlayList as selected by the user is to be reproduced as from a preset time point, the controller 23 commands the readout unit 28 to read out data from an I-picture having a time stamp closest to the specified time point. When the user has selected a certain clip mark from indexing points or scene change points for the program stored in the ClipMark in the Clip Information, as when the user selects a certain picture from a list of thumbnail pictures, as demonstrated on a user interface, of the indexing points or scene change points stored in the ClipMark, the controller 23 determines the AV stream readout position from the recording medium 100 to command the readout unit 28 to read out the AV stream. That is, the controller 23 commands the readout unit 28 to read out data from an I-picture having an address closest to the address on the AV stream which has stored the picture selected by the user. The readout unit 28 reads out data from the specified address.
The read-out data is processed by the demodulating unit 29, the ECC decoder 30 and the source depacketizer 31 so as to be supplied to the demultiplexer 26 and decoded by the AV decoder 27 to reproduce the AV data indicated by the address of the mark point picture. If the user has commanded fast forward playback, the controller 23 commands the readout unit 28 to sequentially read out the I-picture data in the AV stream in succession based on the database (Clip) of the AV stream. The readout unit 28 reads out the data of the AV stream from a specified random access point. The so read-out data is reproduced through processing by various components on the downstream side. The case in which the user edits the AV stream recorded on the recording medium 100 is now explained. If it is desired to specify a playback domain for the AV stream recorded on the recording medium 100, for example, to create a playback route of reproducing a portion sung by a singer A from a song program A, and subsequently reproducing a portion sung by the same singer A from another song program B, the information pertinent to a beginning point (IN-point) and an end point (OUT-point) of the playback domain is input to the controller 23 from the terminal 24 as a user interface. The controller 23 creates a database of the group (PlayList) of playback domains (PlayItem) of the AV streams. When the user desires to erase a portion of the AV stream recorded on the recording medium 100, the information pertinent to the IN-point and the OUT-point of the erasure domain is input to the controller 23, which then modifies the database of the PlayList so as to refer to only the needed AV streams. The controller 23 also commands the write unit 22 to erase an unneeded stream portion of the AV stream.
The case in which the user desires to specify playback domains of an AV stream recorded on the recording medium to create a new playback route, and to interconnect the respective playback domains in a seamless fashion, is now explained. In such case, the controller 23 creates a database of a group (PlayList) of the playback domains (PlayItem) of the AV stream and undertakes to partially re-encode and re-multiplex the video stream in the vicinity of the junction points of the playback domains. The picture information at the IN-point and that at the OUT-point of a playback domain are input from the terminal 24 to the controller 23. The controller 23 commands the readout unit 28 to read out the data needed to reproduce the pictures at the IN-point and at the OUT-point. The readout unit 28 reads out the data from the recording medium 100. The data so read out is output through the demodulating unit 29, the ECC decoder 30 and the source depacketizer 31 to the demultiplexer 26.
The controller 23 analyzes the data input to the demultiplexer 26 to determine the re-encoding method for the video stream (change of picture_coding_type and assignment of the quantity of encoding bits for re-encoding) and the re-multiplexing system, and sends that information to the AV encoder 15 and to the multiplexer 16. The demultiplexer 26 then separates the input stream into the video stream (V), the audio stream (A) and the system information (S). The video stream may be classed into data input to the AV decoder 27 and data input to the multiplexer 16. The former is data needed for re-encoding, and is decoded by the AV decoder 27, with the decoded picture being then re-encoded by the AV encoder 15 and thereby caused to become a video stream. The latter is data copied from the original stream without re-encoding. The audio stream and the system information are directly input to the multiplexer 16. The multiplexer 16 multiplexes the input streams, based on the information input from the controller 23, to output a multiplexed stream, which is processed by the ECC unit 20 and the modulation unit 21 so as to be sent to the write unit 22. The write unit 22 records the AV stream on the recording medium 100 based on the control signals supplied from the controller 23. The application database information and the operations based on this information, such as playback and editing, are hereinafter explained. Fig.2 shows the structure of an application format having two layers, that is PlayList and Clip, for AV stream management. The Volume Information manages all Clips and PlayLists in the disc. Here, one AV stream and the ancillary information thereof, paired together, is deemed to be an object, and is termed a Clip. The AV stream file is termed a Clip AV stream file, with the ancillary information being termed the Clip Information file.
One Clip AV stream file stores data corresponding to an MPEG-2 transport stream arranged in a structure prescribed by the application format. By and large, a file is treated as a byte string. The contents of the Clip AV stream file are expanded on the time axis, with entry points in the Clip (I-picture) being mainly specified on the time basis. When a time stamp of an access point to a preset Clip is given, the Clip Information file is useful in finding the address information at which to start data readout in the Clip AV stream file. Referring to Fig.3, PlayList is now explained, which is provided for a user to select a playback domain desired to be viewed from the Clip and to edit the playback domain readily. One PlayList is a set of playback domains in the Clip. One playback domain in a preset Clip is termed PlayItem and is represented by a pair of an IN-point and an OUT-point on the time axis. So, the PlayList is formed by a set of plural PlayItems. The PlayList is classified into two types, one of which is Real PlayList and the other of which is Virtual PlayList. The Real PlayList co-owns stream portions of the Clip it is referencing. That is, the Real PlayList takes up in the disc the data capacity corresponding to a stream portion of the Clip it is referencing and, when Real PlayList is erased, the data of the stream portion of the Clip it is referencing is also erased.
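The two-layer Clip/PlayList model described above can be sketched as simple data structures; the class and field names below are illustrative only and are not part of the application format's syntax.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Clip:
    """One AV stream file plus its ancillary Clip Information."""
    name: str
    duration: float  # presentation length on the time axis, in seconds

@dataclass
class PlayItem:
    """One playback domain: an IN-point/OUT-point pair on a Clip's time axis."""
    clip: Clip
    in_time: float
    out_time: float

@dataclass
class PlayList:
    """An ordered set of PlayItems; a Real PlayList co-owns Clip data,
    whereas a Virtual PlayList merely references it."""
    items: List[PlayItem] = field(default_factory=list)
    is_real: bool = False

    def duration(self) -> float:
        # Total presentation time is the sum of the PlayItem domains.
        return sum(i.out_time - i.in_time for i in self.items)
```

For instance, a Virtual PlayList referencing two domains of one Clip would be `PlayList([PlayItem(clip, 10.0, 70.0), PlayItem(clip, 100.0, 130.0)])`, with a total duration of 90 seconds.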
The Virtual PlayList is not co-owning Clip data. Therefore, even if the Virtual PlayList is changed or erased, the contents of the Clip are in no way changed. The editing of the Real PlayList is now explained. Fig.4A shows the creation of a Real PlayList: if the AV stream is recorded as a new Clip, a Real PlayList which references the entire Clip is newly created. Fig.4B shows the division of the Real PlayList, that is the operation of dividing the Real PlayList at a desired point to split the Real PlayList into two Real PlayLists. This division operation is performed when two programs are managed in one Clip managed by a sole PlayList and when the user intends to re-register or re-record the programs as separate individual programs. This operation does not lead to alteration of the Clip contents, that is to division of the Clip itself. Fig.4C shows the combining operation of the Real PlayLists, which is the operation of combining two Real PlayLists into one new Real PlayList. This combining operation is performed such as when the user desires to re-register two programs as a sole program. This operation does not lead to alteration of the Clip contents, that is to combining of the Clip itself into one. Fig.5A shows deletion of the entire Real PlayList. If the operation of erasing the entire preset Real PlayList is performed, the associated stream portion of the Clip referenced by the deleted Real PlayList is also deleted. Fig.5B shows partial deletion of the Real PlayList. If a desired portion of the Real PlayList is deleted, the associated PlayItem is altered to reference only the needed Clip stream portion. The corresponding stream portion of the Clip is deleted. Fig.5C shows the minimizing of the Real PlayList. It is an operation of causing the PlayItem associated with the Real PlayList to reference only the stream portion of the Clip needed for the Virtual PlayList. The corresponding stream portion of the Clip not needed for the Virtual PlayList is deleted.
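The minimizing operation shown in Fig.5C can be viewed as an interval-union computation: the stream portions of a Clip still referenced by some PlayList are merged, and anything outside the merged ranges becomes deletable. The following is a hedged sketch of that idea; the function name and the list-of-pairs representation are illustrative, not part of the format.

```python
def needed_portions(referenced_ranges):
    """Merge the (in_time, out_time) ranges of a Clip that PlayLists still
    reference. Stream portions outside the merged ranges are the ones that
    may be deleted when a Real PlayList is minimized."""
    merged = []
    for start, end in sorted(referenced_ranges):
        if merged and start <= merged[-1][1]:
            # Overlapping or abutting range: extend the last merged interval.
            merged[-1] = (merged[-1][0], max(merged[-1][1], end))
        else:
            merged.append((start, end))
    return merged
```

For example, `needed_portions([(50, 80), (10, 30), (25, 40)])` yields `[(10, 40), (50, 80)]`, so the Clip portions before 10, between 40 and 50, and after 80 are no longer needed.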
If the Real PlayList is changed by the above-described operations such that the stream portion of the Clip referenced by the Real PlayList is deleted, there is a possibility that a Virtual PlayList employing the deleted Clip is present, such that problems may be produced in the Virtual PlayList due to the deleted Clip. In order to prevent this from occurring, such a message which runs: "If there exists the Virtual PlayList referencing the stream portion of the Clip the Real PlayList is referencing, and the Real PlayList is deleted, the Virtual PlayList itself is deleted - is it all right?" is displayed for the user in response to the user's operation of deletion, by way of confirmation or alarming, after which the processing for deletion is executed or cancelled subject to the user's commands. Alternatively, the minimizing operation for the Real PlayList is performed in place of deleting the Virtual PlayList. The operations for the Virtual PlayList are now explained. If an operation is performed for the Virtual PlayList, the contents of the Clip are not changed. Figs.6A and 6B show assemble editing (IN-OUT editing). It is an operation of creating a PlayItem of the playback domain the user has desired to view, to create a Virtual PlayList. The seamless connection between PlayItems is supported by the application format, as later explained. If there exist two Real PlayLists 1, 2, and Clips 1, 2 associated with the respective Real PlayLists, and the user specifies a preset domain in the Real PlayList 1 (domain from IN1 to OUT1: PlayItem 1) as a playback domain, and also specifies, as the domain to be reproduced next, a preset domain in the Real PlayList 2 (domain from IN2 to OUT2: PlayItem 2) as a playback domain, as shown in Fig.6A, a sole Virtual PlayList made up of the PlayItem 1 and the PlayItem 2 is prepared, as shown in Fig.6B. The re-editing of the Virtual PlayList is now explained.
The re-editing may be enumerated by alteration of the IN- or OUT-points in the Virtual PlayList, insertion or appendage of new PlayItems to the Virtual PlayList and deletion of PlayItems in the Virtual PlayList. The Virtual PlayList itself may also be deleted. Fig.7 shows audio dubbing (post recording) to the Virtual PlayList. It is an operation of registering the audio post recording to the Virtual PlayList as a sub path. This audio post recording is supported by the application format. An additional audio stream is added as a sub path to the AV stream of the main path of the Virtual PlayList. Common to the Real PlayList and the Virtual PlayList is the operation of changing (moving) the playback sequence of the PlayLists shown in Fig.8. This operation is an alteration of the playback sequence of the PlayLists in the disc (volume) and is supported by the TableOfPlayList as defined in the application format, as will be explained subsequently with reference to e.g., Fig.20. This operation does not lead to alteration of the Clip contents. The mark (Mark) is now explained. The mark is provided for specifying a highlight or characteristic time in the Clip and in the PlayList, as shown in Fig.9. The mark added to the Clip is termed the ClipMark. The ClipMark is e.g., a program indexing point or a scene change point for specifying a characteristic scene ascribable to contents in the AV stream. The ClipMark is generated by e.g., the analysis unit 14 of Fig.1. When the PlayList is reproduced, the mark of the Clip referenced by the PlayList may be referenced and used. The mark appended to the PlayList is termed the PlayListMark (play list mark). The PlayListMark is e.g., a bookmark point or a resuming point as set by the user. The setting of a mark to the Clip and to the PlayList is by adding a time stamp indicating the mark time point to the mark list. On the other hand, mark deletion is removing the time stamp of the mark from the mark list.
Consequently, the AV stream is in no way changed by mark setting or by mark deletion. As another format of the ClipMark, a picture referenced by the ClipMark may be specified on the address basis in the AV stream. Mark setting on the Clip is by adding the address basis information indicating the picture of the mark point to the mark list. On the other hand, mark deletion is removing the address basis information indicating the mark point picture from the mark list. Consequently, the AV stream is in no way changed by mark setting or by mark deletion.
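Since marks live in a mark list rather than in the AV stream itself, setting and deleting a mark amount to list operations on time stamps (or, in the alternative format, on address-basis entries). The class and method names in this sketch are assumptions for illustration, not part of the format.

```python
from bisect import insort

class MarkList:
    """Marks kept separate from the AV stream: setting or deleting a
    mark edits only this list, never the stream itself."""

    def __init__(self):
        self._stamps = []   # mark time stamps (e.g. PTS values), kept sorted

    def set_mark(self, stamp):
        # Add the time stamp of the mark point to the mark list.
        if stamp not in self._stamps:
            insort(self._stamps, stamp)

    def delete_mark(self, stamp):
        # Remove the time stamp of the mark from the mark list.
        self._stamps.remove(stamp)

    def marks(self):
        return list(self._stamps)
```

The same structure serves for both a PlayListMark list (bookmark or resuming points) and a ClipMark list (indexing or scene change points); only the origin of the entries differs.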
A thumbnail is now explained. The thumbnail is a still picture added to the Volume, the PlayList and the Clip. There are two sorts of the thumbnail, one of them being a thumbnail as a representative picture indicating the contents. This is mainly used in a main picture in order for the user to select what he or she desires to view by acting on a cursor, not shown. Another sort of the thumbnail is a picture indicating a scene pointed to by a mark. The Volume and the respective PlayLists need to own representative pictures. The representative picture of the Volume is presupposed to be used for initially demonstrating a still picture representing the disc contents when the disc is set in position in the recording and/or reproducing apparatus 1. It is noted that the disc means the recording medium 100, which is presupposed to be of a disc shape. The representative picture of the PlayList is presupposed to be used as a still picture for representing the PlayList contents. As the representative picture of the PlayList, it may be contemplated to use the initial picture of the PlayList as the thumbnail (representative picture). However, the leading picture at the playback time of 0 is not necessarily an optimum picture representing the contents. So, the user is allowed to set an optional picture as a thumbnail of the PlayList. The two sorts of thumbnails, that is the thumbnail as a representative picture indicating the Volume and the thumbnail as a representative picture indicating the PlayList, are termed menu thumbnails. Since the menu thumbnails are demonstrated frequently, these thumbnails need to be read out at an elevated speed from the disc. Thus, it is efficient to store the totality of the menu thumbnails in a sole file. It is unnecessary for the menu thumbnails to be pictures extracted from the moving pictures in the volume; they may instead be pictures captured from a personal computer or a digital still camera, as shown in Fig.10.
On the other hand, the Clip and the PlayList need to be marked with plural marks, whilst the pictures of the mark points need to be readily viewable in order to grasp the contents of the mark positions. The picture indicating such a mark point is termed a mark thumbnail. Therefore, the picture which is the original of the mark thumbnail is mainly an extracted mark point picture rather than a picture captured from outside. Fig.11 shows the relation between the mark affixed to the PlayList and the mark thumbnail, whilst Fig.12 shows the relation between the mark affixed to the Clip and the mark thumbnail. In distinction from the menu thumbnail, the mark thumbnail is used in e.g., a sub-menu for representing details of the PlayList, while it is not requested to be read out in a short access time. So, whenever a thumbnail is required, the recording and/or reproducing apparatus 1 opens a file and reads out a portion of the file, and there is no problem presented even if file opening and reading out a portion of the file by the recording and/or reproducing apparatus 1 takes some time. For decreasing the number of files present in a volume, it is preferred for the totality of the mark thumbnails to be stored in one file. While the PlayList may own one menu thumbnail and plural mark thumbnails, the user is not required to select the Clip directly (usually, the Clip is selected through the PlayList), and hence there is no necessity of providing menu thumbnails for the Clip. Fig.13 shows the relation between the menu thumbnails, the mark thumbnails, the PlayLists and the Clips. In the menu thumbnail file are filed menu thumbnails provided from one PlayList to another. In the menu thumbnail file is also contained a volume thumbnail representing the contents of data recorded on the disc. In the mark thumbnail file are filed mark thumbnails created from one PlayList to another and from one Clip to another. The CPI (Characteristic Point Information) is hereinafter explained.
The CPI is data contained in the Clip Information file and is used mainly for finding the data address in the Clip AV stream file at which to start the data readout when a time stamp of the access point to the Clip is afforded. In the present embodiment two sorts of the CPI are used, one of them being the EP map and the other being the TU map. The EP map is a list of entry point (EP) data extracted from the elementary stream and the transport stream. This has the address information used to find the sites of the entry points in the AV stream at which to start the decoding. One EP data is made up of a pair of a presentation time stamp (PTS) and a data address in the AV stream of the access unit associated with the PTS. The EP map is used mainly for two purposes. First, it is used for finding the data address in the AV stream of the access unit referenced by a PTS in the PlayList. Second, the EP map is used for fast forward playback or fast reverse playback. If, in recording the input AV stream by the recording and/or reproducing apparatus 1, the syntax of the stream can be analyzed, the EP map is created and recorded on the disc. The TU map has a list of time unit (TU) data which is derived from the arrival time points of the transport packets input through a digital interface. This affords the relation between the arrival-time-based time and the data address in the AV stream. When the recording and/or reproducing apparatus 1 records an input AV stream, and the syntax of the stream cannot be analyzed, a TU map is created and recorded on the disc. The STCInfo stores the discontinuous point information in the AV stream file which stores the MPEG-2 transport stream. When the AV stream has discontinuous points of STC, the same PTS values may appear in the AV stream file. Thus, if a time point in the AV stream is specified on the PTS basis, the PTS of the access point is insufficient to specify the point.
Moreover, an index of the continuous STC domain containing the PTS is required. In this format, the continuous STC domain and its index are termed the STC-sequence and the STC_sequence_id, respectively. The STC-sequence information is defined by the STCInfo of the Clip Information file. The STC_sequence_id is used in an AV stream file having the EP_map and is optional in an AV stream file having the TU_map. The programs are each a collection of elementary streams which co-own a sole system time base for synchronized reproduction of these streams.

It is useful for a reproducing apparatus (the recording and/or reproducing apparatus 1 of Fig.1) to know the contents of an AV stream prior to its decoding. These contents include, for example, the PID values of the transport packets transmitting an audio or video elementary stream, and the types of the video or audio components, such as HDTV video or an MPEG-2 AAC audio stream. This information is useful for creating a menu screen for illustrating to the user the contents of the PlayList referencing the AV stream. It is similarly useful for setting the initial states of the AV decoder and the demultiplexer of the respective apparatus. For this reason, the Clip Information file owns ProgramInfo for illustrating the program contents.

Program contents may change within the AV stream file in which the MPEG-2 transport stream is stored. For example, the PID of the transport packet transmitting the video elementary stream may be changed, or the component type of the video stream may be changed from SDTV to HDTV. The ProgramInfo stores the information on the change points of program contents in the AV stream file. A domain of the AV stream file in which the program contents remain constant is termed a program-sequence. The program-sequence is used in an AV stream file having the EP_map and is optional in an AV stream file having the TU_map.

The present embodiment defines the self-encoding stream format (SESF).
The SESF is used when encoding analog input signals, or when decoding digital input signals and subsequently encoding the decoded signals, into an MPEG-2 transport stream.
The SESF defines an elementary stream pertinent to the MPEG-2 transport stream and the AV stream. When the recording and/or reproducing apparatus 1 encodes and records input signals in the SESF, an EP_map is created and recorded on the disc.

A digital broadcast stream uses one of the following systems for recording on the recording medium 100. First, the digital broadcast stream is transcoded into an SESF stream. In this case, the recorded stream must conform to the SESF, and an EP_map must be prepared and recorded on the disc. Alternatively, the elementary streams forming the digital broadcast stream are transcoded to new elementary streams and re-multiplexed to a new transport stream conforming to the stream format prescribed by the organization standardizing that digital broadcast stream. In this case, too, an EP_map must be created and recorded on the disc. For example, assume that the input stream is an MPEG-2 transport stream conforming to the ISDB (the standard appellation of digital BS broadcasting in Japan), containing an HDTV video stream and an MPEG AAC audio stream. The HDTV video stream is transcoded to an SDTV video stream, and this SDTV video stream and the original AAC audio stream are re-multiplexed to a transport stream. Both the SDTV stream and the resulting transport stream need to conform to the ISDB format.

Another system of recording the digital broadcast stream on the recording medium 100 is to make a transparent recording of the input transport stream, that is, to record the input transport stream unchanged, in which case an EP_map is formulated and recorded on the disc. Alternatively, the input transport stream is recorded transparently, that is, unchanged, in which case a TU_map is created and recorded on the disc.

The directories and the files are hereinafter explained. The recording and/or reproducing apparatus 1 is hereinafter described as a DVR (digital video recording) system. Fig.14 shows a typical directory structure on the disc.
As shown in Fig.14, the directories of the DVR disc may be enumerated as a root directory including the "DVR" directory, and the "DVR" directory including the "PLAYLIST" directory, the "CLIPINF" directory, the "M2TS" directory and the "DATA" directory. Although directories other than these may be created below the root directory, they are discounted in the application format of the present embodiment.

Below the "DVR" directory are stored all files and directories prescribed by the DVR application format. The "DVR" directory includes four directories. Below the "PLAYLIST" directory are placed the database files of the Real PlayLists and Virtual PlayLists; this directory may exist even in a state devoid of PlayLists. Below the "CLIPINF" directory is placed the Clip database; this directory, too, may exist in a state devoid of AV stream files. In the "DATA" directory are stored files of data broadcasts, such as digital TV broadcasts.

The "DVR" directory stores the following files. An "info.dvr" file is created below the DVR directory to store the comprehensive information of the application layer. Below the DVR directory there must be a sole info.dvr, whose filename is fixed to "info.dvr". The "menu.thmb" file stores the information pertinent to the menu thumbnails. Below the DVR directory there must be 0 or 1 menu thumbnail file, whose filename is fixed to "menu.thmb"; if there is no menu thumbnail, this file may not exist. The "mark.thmb" file stores the information pertinent to the mark thumbnail pictures. Below the DVR directory there must be 0 or 1 mark thumbnail file, whose filename is fixed to "mark.thmb"; if there is no mark thumbnail, this file may not exist.

The "PLAYLIST" directory stores two sorts of PlayList files, namely Real PlayLists and Virtual PlayLists. An "xxxxx.rpls" file stores the information pertinent to one Real PlayList; one such file is created for each Real PlayList.
The filename is "xxxxx.rpls", where "xxxxx" denotes five numerical figures from 0 to 9. The file extension must be "rpls". A "yyyyy.vpls" file stores the information pertinent to one Virtual PlayList. One file with the filename "yyyyy.vpls" is created for each Virtual PlayList, where "yyyyy" denotes five numerical figures from 0 to 9. The file extension must be "vpls".

The "CLIPINF" directory stores one file in association with each AV stream file. A "zzzzz.clpi" file is the Clip Information file corresponding to one AV stream file (Clip AV stream file or Bridge-Clip AV stream file). The filename is "zzzzz.clpi", where "zzzzz" denotes five numerical figures from 0 to 9. The file extension must be "clpi".
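The directory layout and the filename rules above are mechanical enough to check in a few lines. A sketch, with helper names that are illustrative rather than part of the format:

```python
import os
import re

# Subdirectories required below "DVR" (Fig.14), and the filename rules
# described in the text: five decimal digits plus a fixed extension.
DVR_SUBDIRS = ("PLAYLIST", "CLIPINF", "M2TS", "DATA")
FILE_RULES = {
    "real_playlist":    re.compile(r"^\d{5}\.rpls$"),
    "virtual_playlist": re.compile(r"^\d{5}\.vpls$"),
    "clip_info":        re.compile(r"^\d{5}\.clpi$"),
    "av_stream":        re.compile(r"^\d{5}\.m2ts$"),
}

def is_valid_dvr_tree(root):
    """True if root/DVR contains the four prescribed subdirectories."""
    return all(os.path.isdir(os.path.join(root, "DVR", s)) for s in DVR_SUBDIRS)

def classify(filename):
    """Return the kind of DVR file the name denotes, or None if it fits no rule."""
    for kind, pattern in FILE_RULES.items():
        if pattern.match(filename):
            return kind
    return None
```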
The "M2TS" directory stores the AV stream files. A "zzzzz.m2ts" file is an AV stream file handled by the DVR system; it is a Clip AV stream file or a Bridge-Clip AV stream file. The filename is "zzzzz.m2ts", where "zzzzz" denotes five numerical figures from 0 to 9. The file extension must be "m2ts".

The "DATA" directory stores data transmitted by data broadcasting; this data may, for example, be XML or MPEG files.

The syntax and the semantics of each directory (file) are now explained. Fig.15 shows the syntax of the "info.dvr" file. The "info.dvr" file is made up of three objects, namely DVRVolume(), TableOfPlayLists() and MakersPrivateData().

The syntax of info.dvr shown in Fig.15 is explained. The TableOfPlayLists_Start_address indicates the leading address of the TableOfPlayLists() in terms of the relative number of bytes from the leading byte of the "info.dvr" file; the relative number of bytes is counted beginning from 0. The MakersPrivateData_Start_address indicates the leading address of the MakersPrivateData(), in terms of the relative number of bytes from the leading byte of the "info.dvr" file; the relative number of bytes is again counted from 0. The padding_word is inserted in accordance with the syntax of "info.dvr"; N1 and N2 are arbitrary positive integers, and each padding word may assume an arbitrary value.

The DVRVolume() stores the information stating the contents of the volume (disc). Fig.16 shows the syntax of the DVRVolume(), which is now explained. The version_number is four character letters indicating the version number of the DVRVolume(); it must be encoded as "0045" in accordance with ISO 646. Length is a 32-bit unsigned integer indicating the number of bytes from directly after the length field to the trailing end of the DVRVolume(). The ResumeVolume() stores the filename of the Real PlayList or Virtual PlayList reproduced last in the Volume.
However, the playback position at which the user interrupted playback of the Real PlayList or Virtual PlayList is stored in the resume-mark defined in the PlayListMark() (see Figs.42 and 43).

Fig.17 shows the syntax of the ResumeVolume(), which is now explained. The valid_flag is a 1-bit flag indicating that the resume_PlayList_name field is valid or invalid when set to 1 or 0, respectively. The 10-byte field of resume_PlayList_name indicates the filename of the Real PlayList or Virtual PlayList to be resumed.

The UIAppInfoVolume in the syntax of the DVRVolume(), shown in Fig.16, stores parameters of the user interface application concerning the Volume. Fig.18 shows the syntax of the UIAppInfoVolume, the semantics of which are now explained. The 8-bit field of character_set indicates the method for encoding the character letters encoded in the Volume_name field; the encoding method corresponds to the values shown in Fig.19. The 8-bit field of name_length indicates the byte length of the Volume name indicated in the Volume_name field. The Volume_name field indicates the appellation of the Volume. The number of bytes equal to the name_length, counted from the left of the field, are the valid characters and indicate the Volume appellation; the values next following these valid character letters may be any values.

The Volume_protect_flag is a flag indicating whether or not the contents of the Volume can be shown to the user without limitations. If this flag is set to 1, the contents of the Volume are allowed to be presented (reproduced) to the user only in case the user has succeeded in correctly inputting the PIN number (password). If this flag is set to 0, the contents of the Volume are allowed to be presented to the user even if no PIN number is input.
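The Volume_protect_flag rule just stated reduces to a small predicate. A sketch with invented function and parameter names; how the PIN is stored and entered lies outside the format:

```python
def may_present_volume(volume_protect_flag, entered_pin, correct_pin):
    """Volume contents may be presented when the flag is 0, or when the
    flag is 1 and the user has entered the correct four-digit PIN."""
    if volume_protect_flag == 0:
        return True
    return entered_pin == correct_pin
```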
If, when the user has inserted a disc into the player, this flag is set to 0, or the flag is set to 1 and the user has succeeded in correctly inputting the PIN number, the recording and/or reproducing apparatus 1 demonstrates a list of the PlayLists in the disc. The limitations on reproduction of the respective PlayLists are irrelevant to the Volume_protect_flag and are indicated by the playback_control_flag defined in the UIAppInfoPlayList. The PIN is made up of four numerical figures from 0 to 9, each of which is coded in accordance with ISO/IEC 646.

The ref_thumbnail_index field indicates the information of a thumbnail picture added to the Volume. If the ref_thumbnail_index field is of a value other than 0xFFFF, a thumbnail picture is added to the Volume; the thumbnail picture is stored in the menu.thmb file and is referenced using the value of ref_thumbnail_index in that file. If the ref_thumbnail_index field is 0xFFFF, it indicates that no thumbnail picture has been added to the Volume.

The TableOfPlayLists() in the info.dvr syntax shown in Fig.15 is explained. The TableOfPlayLists() stores the filenames of the PlayLists (Real PlayLists and Virtual PlayLists). All PlayList files recorded in the Volume are contained in the TableOfPlayLists(), which indicates the default playback sequence of the PlayLists in the Volume. Fig.20 shows the syntax of the TableOfPlayLists(), which is now explained. The version_number is four character letters indicating the version number of the TableOfPlayLists(); it must be encoded as "0045" in accordance with ISO 646. Length is an unsigned 32-bit integer indicating the number of bytes of the TableOfPlayLists() from directly after the length field to the trailing end of the TableOfPlayLists(). The 16-bit field of number_of_PlayLists indicates the number of loops of the for-loop containing the PlayList_file_name.
This numerical figure must be equal to the number of PlayLists recorded in the Volume. The 10-byte field of PlayList_file_name indicates the filename of each PlayList.

Fig.21 shows another configuration of the syntax of the TableOfPlayLists(). The syntax shown in Fig.21 is the syntax shown in Fig.20 with the UIAppInfoPlayList contained in it. By such a structure including the UIAppInfoPlayList, it becomes possible to create a menu picture simply on reading out the TableOfPlayLists. The following explanation is premised on the use of the syntax shown in Fig.20.

The MakersPrivateData in the info.dvr file shown in Fig.15 is explained. The MakersPrivateData is provided to permit the maker of the recording and/or reproducing apparatus 1 to insert private data of the maker in the MakersPrivateData() for the special applications of the different companies. The private data of each maker has a standardized maker_ID for identifying the maker who has defined it. The MakersPrivateData() may contain one or more maker_IDs. If a preset maker intends to insert private data, and the private data of a different maker is already contained in the MakersPrivateData(), the new private data is added to the MakersPrivateData() without erasing the pre-existing private data. Thus, in the present embodiment, private data of plural makers can be contained in one MakersPrivateData().

Fig.22 shows the syntax of the MakersPrivateData, which is now explained. The version_number is four character letters indicating the version number of the MakersPrivateData(); it must be encoded as "0045" in accordance with ISO 646. Length is an unsigned 32-bit integer indicating the number of bytes of the MakersPrivateData() from directly after the length field to the trailing end of the MakersPrivateData().
The mpd_blocks_start_address indicates the leading end address of the first mpd_block() in terms of the relative number of bytes from the leading byte of the MakersPrivateData(). The number_of_maker_entries is a 16-bit unsigned integer affording the number of entries of maker private data included in the MakersPrivateData(); there must not be two or more maker private data entries having the same maker_ID value in the MakersPrivateData(). The mpd_block_size is a 16-bit unsigned integer affording the size of one mpd_block in terms of 1024 bytes as a unit; for example, mpd_block_size = 1 indicates that the size of one mpd_block is 1024 bytes. The number_of_mpd_blocks is a 16-bit unsigned integer affording the number of mpd_blocks contained in the MakersPrivateData(). The maker_ID is a 16-bit unsigned integer identifying the maker of the DVR system which has created the maker private data; the value encoded to the maker_ID is specified by the licensor. The maker_model_code is a 16-bit unsigned integer indicating the model number code of the DVR system which has created the maker private data; the value encoded to the maker_model_code is set by the maker who has received the license of the format.

The start_mpd_block_number is a 16-bit unsigned integer indicating the number of the mpd_block at which the maker private data begins. The leading end of the maker private data must be aligned with the leading end of an mpd_block; the start_mpd_block_number corresponds to the variable j in the for-loop of the mpd_block. The mpd_length is a 32-bit unsigned integer indicating the size of the maker private data. The mpd_block is the area in which maker private data is stored; all of the mpd_blocks in the MakersPrivateData() must be of the same size.

The Real PlayList file and the Virtual PlayList file, in other words xxxxx.rpls and yyyyy.vpls, are explained.
Fig.23 shows the syntax of xxxxx.rpls (Real PlayList) and yyyyy.vpls (Virtual PlayList), which are of the same syntax structure. Each of xxxxx.rpls and yyyyy.vpls is made up of three objects, namely PlayList(), PlayListMark() and MakersPrivateData(). The PlayListMark_Start_address indicates the leading address of the PlayListMark(), in terms of the relative number of bytes from the leading end of the PlayList file as a unit; the relative number of bytes is counted from zero. The MakersPrivateData_Start_address indicates the leading address of the MakersPrivateData(), in terms of the relative number of bytes from the leading end of the PlayList file as a unit; the relative number of bytes is counted from zero. The padding_word is inserted in accordance with the syntax of the PlayList file, with N1 and N2 being arbitrary positive integers; each padding word may assume an arbitrary value.

The PlayList, explained briefly above, will now be further explained. Every playback domain of every Clip except Bridge-Clips must be referenced by one of the PlayLists in the disc. Also, two or more Real PlayLists must not overlap the playback domains shown by their PlayItems in the same Clip.

Reference is made to Figs.24A, 24B and 24C. For all Clips there exist corresponding Real PlayLists, as shown in Fig.24A. This rule is observed even after the editing operation has come to a close, as shown in Fig.24B. Therefore, all Clips can be viewed by referencing one of the Real PlayLists. Referring to Fig.24C, the playback domain of a Virtual PlayList must be contained in the playback domains referenced by the Real PlayLists and in the Bridge-Clip playback domains. There must not be present in the disc a Bridge-Clip not referenced by any Virtual PlayList.

The Real PlayList, containing the list of PlayItems, must not contain any SubPlayItem.
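The two Real PlayList rules above, every Clip playback domain referenced and no overlap of playback domains within the same Clip, can be expressed as a check over toy data. The representation here (a PlayList as a list of (clip, IN_time, OUT_time) triples) is an illustration, not the file format:

```python
def check_real_playlists(clips, playlists):
    """clips: list of clip names; playlists: list of Real PlayLists, each a
    list of (clip_name, in_time, out_time) PlayItems. Returns True only if
    every clip is referenced and no two domains overlap in the same clip."""
    referenced = {item[0] for pl in playlists for item in pl}
    if not set(clips) <= referenced:
        return False
    domains = {}  # clip name -> list of (in, out) intervals seen so far
    for pl in playlists:
        for clip, t_in, t_out in pl:
            for a, b in domains.get(clip, []):
                if t_in < b and a < t_out:  # the two intervals intersect
                    return False
            domains.setdefault(clip, []).append((t_in, t_out))
    return True
```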
The Virtual PlayList contains the PlayItem list and, if the CPI_type contained in the PlayList() is the EP_map type and the PlayList_type is 0 (a PlayList containing video and audio), the Virtual PlayList may contain one SubPlayItem. In the PlayList() of the present embodiment, the SubPlayItem is used only for audio post-recording. The number of SubPlayItems owned by one Virtual PlayList must be 0 or 1.

The PlayList is hereinafter explained. Fig.25 shows the PlayList syntax, which is now explained. The version_number is four character letters indicating the version number of the PlayList(); it is encoded as "0045" in accordance with ISO 646. Length is a 32-bit unsigned integer indicating the total number of bytes of the PlayList() from directly after the length field to the trailing end of the PlayList(). The PlayList_type, one example of which is shown in Fig.26, is an 8-bit field indicating the PlayList type. The CPI_type is a 1-bit flag indicating the value of the CPI type of the Clips referenced by the PlayItem() and by the SubPlayItem(); the CPI types defined in the CPIs of all Clips referenced by one PlayList must be of the same value. The number_of_PlayItems is a 16-bit field indicating the number of PlayItems present in the PlayList. The PlayItem_id corresponding to a given PlayItem() is defined by the sequence in which that PlayItem() appears in the for-loop containing the PlayItem(); the PlayItem_id begins with 0. The number_of_SubPlayItems is a 16-bit field indicating the number of SubPlayItems in the PlayList; this value is 0 or 1. An additional audio stream path (audio stream path) is a sort of sub path.

The UIAppInfoPlayList of the PlayList syntax shown in Fig.25 is explained. The UIAppInfoPlayList stores parameters of the user interface application concerning the PlayList. Fig.27 shows the syntax of the UIAppInfoPlayList, which is now explained.
The character_set is an 8-bit field indicating the method for encoding the character letters encoded in the PlayList_name field; the encoding method corresponds to the values in the table shown in Fig.19. The name_length is an 8-bit field indicating the byte length of the PlayList name indicated in the PlayList_name field. The PlayList_name field shows the appellation of the PlayList. The number of bytes equal to the name_length, counted from the left of the field, are the valid characters and indicate the PlayList appellation; the values next following these valid character letters may be any values.

The record_time_and_date is a 56-bit field storing the date and time at which the PlayList was recorded. This field is 14 numerical figures for year/month/day/hour/minute/second, encoded in binary coded decimal (BCD). For example, 2001/12/23 01:02:03 is encoded as "0x20011223010203". The duration is a 24-bit field indicating the total replay time of the PlayList in terms of hours/minutes/seconds as a unit. This field is six numerical figures encoded in BCD; for example, 01:45:30 is encoded as "0x014530". The valid_period is a 32-bit field indicating the valid period of the PlayList. This field is eight numerical figures encoded in 4-bit BCD. The valid_period is used by the recording and/or reproducing apparatus 1, for example, when a PlayList whose valid period has lapsed is to be automatically erased. For example, 2001/05/07 is encoded as "0x20010507".

The maker_ID is a 16-bit unsigned integer indicating the maker of the DVR player (recording and/or reproducing apparatus 1) which was the last to update the PlayList; the value encoded to maker_ID is assigned by the licensor of the DVR format. The maker_code is a 16-bit unsigned integer indicating the model number of the DVR player which was the last to update the PlayList.
The value encoded to the maker_code is determined by the maker who has received the license of the DVR format. If the playback_control_flag is set to 1, the PlayList is reproduced only when the user has successfully entered the PIN number. If this flag is set to 0, the user may view the PlayList without the necessity of inputting the PIN number.
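The BCD encodings quoted above (01:45:30 encoded as 0x014530, 2001/05/07 as 0x20010507) pack one decimal digit per 4-bit nibble. A minimal sketch of the packing and unpacking:

```python
def to_bcd(digits):
    """Pack a string of decimal digits into binary coded decimal,
    one digit per 4-bit nibble, most significant digit first."""
    value = 0
    for d in digits:
        value = (value << 4) | int(d)
    return value

def from_bcd(value, n_digits):
    """Unpack n_digits decimal digits from a BCD-coded integer."""
    return "".join(str((value >> 4 * (n_digits - 1 - i)) & 0xF)
                   for i in range(n_digits))
```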
If the write_protect_flag is set to 1, the contents of the PlayList must not be erased nor changed, except for the write_protect_flag itself. If this flag is set to 0, the user is free to erase or change the PlayList. If this flag is set to 1, the recording and/or reproducing apparatus 1 demonstrates a message requesting re-confirmation by the user before the user proceeds to erase, edit or overwrite the PlayList.

There may exist a Real PlayList whose write_protect_flag is set to 0 alongside a Virtual PlayList which references a Clip of that Real PlayList and whose write_protect_flag is set to 1. If the user is desirous of erasing the Real PlayList, the recording and/or reproducing apparatus 1 issues an alarm to the user as to the presence of the aforementioned Virtual PlayList, or "minimizes" the Real PlayList, before erasing it.

If the is_played_flag is set to 1, as shown in Fig.28B, it indicates that the PlayList has been reproduced at least once since it was recorded; if it is set to 0, it indicates that the PlayList has not been reproduced even once since it was recorded. Archive is a 2-bit field indicating whether the PlayList is an original or a copy, as shown in Fig.28C. The ref_thumbnail_index field indicates the information of a thumbnail picture representative of the PlayList. If the ref_thumbnail_index field is of a value other than 0xFFFF, a thumbnail picture representative of the PlayList is added to the PlayList, with the thumbnail picture being stored in the menu.thmb file; the picture is referenced using the value of ref_thumbnail_index in the menu.thmb file. If the ref_thumbnail_index field is 0xFFFF, no thumbnail picture representative of the PlayList is added to the PlayList.

The PlayItem is hereinafter explained.
One PlayItem() basically contains the following data: a Clip_Information_file_name for specifying the filename of the Clip; IN_time and OUT_time, paired together to specify the playback domain of the Clip; the STC_sequence_id referenced by IN_time and OUT_time in case the CPI_type defined in the PlayList() is the EP_map type; and Connection_condition, indicating the connection condition between the previous PlayItem and the current PlayItem.

If a PlayList is made up of two or more PlayItems, these PlayItems are arrayed in a row, without temporal gap or overlap, on the global time axis of the PlayList. If the CPI_type defined in the PlayList() is the EP_map type and the current PlayItem does not have a BridgeSequence(), the IN_time and OUT_time pair must indicate the same time on the STC continuous domain as that specified by the STC_sequence_id. Such an instance is shown in Fig.29.

Fig.30 shows the case in which the CPI_type defined by the PlayList() is the EP_map type and the current PlayItem has a BridgeSequence(); here the rules now explained are applied. The OUT_time of the PlayItem previous to the current PlayItem, shown as OUT_time1, indicates the time in the Bridge-Clip specified in the BridgeSequenceInfo() of the current PlayItem. This OUT_time must obey the encoding limitations which will be explained subsequently. The IN_time of the current PlayItem, shown as IN_time2, indicates the time in the Bridge-Clip specified in the BridgeSequenceInfo() of the current PlayItem. This IN_time also must obey the encoding limitations as later explained. The OUT_time of the current PlayItem, shown as OUT_time2, indicates the time on the STC continuous domain specified by the STC_sequence_id of the current PlayItem.

If the CPI_type of the PlayList() is the TU_map type, the IN_time and OUT_time of the PlayItem, paired together, indicate times on the same Clip AV stream, as shown in Fig.31. The PlayItem syntax is as shown in Fig.32.
As to the syntax of the PlayItem shown in Fig.32, the field of Clip_Information_file_name indicates the filename of the Clip Information file. The Clip_stream_type defined by the ClipInfo() of this Clip Information file must indicate a Clip AV stream. The STC_sequence_id is an 8-bit field indicating the STC_sequence_id of the continuous STC domain referenced by the PlayItem; if the CPI_type specified in the PlayList() is the TU_map type, this 8-bit field has no meaning and is set to 0. IN_time is a 32-bit field used to store the playback start time of the PlayItem; the semantics of IN_time differ with the CPI_type defined in the PlayList(), as shown in Fig.33. OUT_time is a 32-bit field used to store the playback end time of the PlayItem; the semantics of OUT_time differ with the CPI_type defined in the PlayList(), as shown in Fig.34. Connection_condition is a 2-bit field indicating the connection condition between the previous PlayItem and the current PlayItem, as shown in Fig.35. Figs.36A to 36D illustrate the various states of Connection_condition shown in Fig.35.
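The rule that PlayItems line up without temporal gap or overlap on the PlayList's global time axis can be sketched with a toy representation in which each PlayItem contributes its playback duration (OUT_time minus IN_time); the helper names are invented:

```python
def global_intervals(durations):
    """Place PlayItems on the global time axis by accumulating durations."""
    intervals, t = [], 0
    for d in durations:
        intervals.append((t, t + d))
        t += d
    return intervals

def is_gapless(intervals):
    """True if each interval starts exactly where the previous one ends,
    i.e. no temporal gap and no overlap between consecutive PlayItems."""
    return all(prev[1] == cur[0] for prev, cur in zip(intervals, intervals[1:]))
```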
The BridgeSequenceInfo is explained with reference to Fig.37. The BridgeSequenceInfo is ancillary information of the current PlayItem and includes the following information. First, it includes a Bridge_Clip_Information_file_name specifying the Bridge-Clip AV stream file and the corresponding Clip Information file (Fig.45). It also includes the address of a source packet on the Clip AV stream referenced by the previous PlayItem; next to this source packet is connected the first source packet of the Bridge-Clip AV stream. This address is termed the RSPN_exit_from_previous_Clip. It further includes the address of a source packet on the Clip AV stream referenced by the current PlayItem; ahead of this source packet is connected the last source packet of the Bridge-Clip AV stream file. This address is termed the RSPN_enter_to_current_Clip.

In Fig.37, RSPN_arrival_time_discontinuity indicates the address of a source packet in the Bridge-Clip AV stream where there is a discontinuous point on the arrival time base. This address is defined in the ClipInfo() (Fig.46).

Fig.38 shows the syntax of the BridgeSequenceInfo. Turning to the syntax of the BridgeSequenceInfo shown in Fig.38, the field of Bridge_Clip_Information_file_name indicates the filename of the Clip Information file corresponding to the Bridge-Clip AV stream file. The Clip_stream_type defined in the ClipInfo() of this Clip Information file must indicate 'Bridge-Clip AV stream'.
The 32-bit field of RSPN_exit_from_previous_Clip is the relative address of a source packet on the Clip AV stream referenced by the previous PlayItem; next to this source packet is connected the first source packet of the Bridge-Clip AV stream file. The RSPN_exit_from_previous_Clip has a size based on the source packet number as a unit, and is counted, with the value of the offset_SPN defined in the ClipInfo() as an initial value, from the first source packet of the Clip AV stream file referenced by the previous PlayItem. The 32-bit field of RSPN_enter_to_current_Clip is the relative address of a source packet on the Clip AV stream referenced by the current PlayItem; ahead of this source packet is connected the last source packet of the Bridge-Clip AV stream file. The RSPN_enter_to_current_Clip also has a size based on the source packet number as a unit, and is counted, with the value of the offset_SPN defined in the ClipInfo() as an initial value, from the first source packet of the Clip AV stream file referenced by the current PlayItem.

The SubPlayItem is explained with reference to Fig.39. The use of SubPlayItem() is permitted only if the CPI_type of the PlayList() is the EP_map type. In the present embodiment, the SubPlayItem is used only for audio post-recording. The SubPlayItem() includes the following data. First, it includes a Clip_Information_file_name for specifying the Clip referenced by the sub path in the PlayList. It also includes SubPath_IN_time and SubPath_OUT_time for specifying the sub path playback domain in the Clip. Additionally, it includes sync_PlayItem_id and sync_start_PTS_of_PlayItem for specifying the time at which the sub path starts reproduction on the time axis of the main path. The Clip AV stream referenced by the sub path must not contain STC discontinuous points (discontinuous points of the system time base). The clocks of the audio samples of the Clip used by the sub path are locked to the clocks of the audio samples of the main path.
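The relative source packet numbers (RSPNs) described above translate into byte positions once the source packet size is fixed. A sketch under the assumption of 192-byte source packets (a 188-byte transport packet plus a 4-byte header, as is common for this kind of stream file); the helper name is invented:

```python
SOURCE_PACKET_SIZE = 192  # assumption: 188-byte TS packet + 4-byte extra header

def rspn_to_byte_offset(rspn, offset_spn):
    """Byte offset of a source packet in the stream file, where the file's
    first source packet carries the number offset_SPN (from ClipInfo())."""
    index = rspn - offset_spn
    if index < 0:
        raise ValueError("RSPN precedes the first source packet of the file")
    return index * SOURCE_PACKET_SIZE
```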
Fig.40 shows the syntax of the SubPlayItem, which is now explained. The field of Clip_Information_file_name indicates the filename of the Clip Information file of the Clip used by the sub path in the PlayList. The Clip_stream_type defined in this ClipInfo() must indicate a Clip AV stream. An 8-bit field of SubPath_type indicates the sub path type; here, only '0x00' is set, as shown in Fig.41, while the other values are reserved for future use. The 8-bit field of sync_PlayItem_id indicates the PlayItem_id of the PlayItem containing the time of playback start of the sub path on the time axis of the main path; the value of the PlayItem_id corresponding to a given PlayItem is defined in the PlayList() (Fig.25). A 32-bit field of sync_start_PTS_of_PlayItem denotes the time of playback start of the sub path on the time axis of the main path, and holds the upper 32 bits of the PTS (presentation time stamp) on the PlayItem referenced by the sync_PlayItem_id. The 32-bit field of SubPath_IN_time stores the playback start time of the sub path; SubPath_IN_time denotes the upper 32 bits of the 33-bit PTS corresponding to the first presentation unit in the sub path. The 32-bit field of SubPath_OUT_time stores the playback end time of the sub path; SubPath_OUT_time indicates the upper 32 bits of the value of Presentation_end_TS calculated by the following equation:

Presentation_end_TS = PTS_out + AU_duration

where PTS_out is the 33-bit PTS corresponding to the last presentation unit of the SubPath, and AU_duration is the 90 kHz based display period of that last presentation unit.

Next, the PlayListMark() in the syntax of xxxxx.rpls and yyyyy.vpls shown in Fig.23 is explained. The mark information pertinent to the PlayList is stored in the PlayListMark. Fig.42 shows the syntax of the PlayListMark.
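The Presentation_end_TS equation above can be worked through numerically. The 33-bit PTS arithmetic and the upper-32-bit extraction look like this; the sample values are invented:

```python
PTS_WRAP = 1 << 33  # PTS values are 33 bits wide, so sums wrap modulo 2**33

def sub_path_out_time(pts_out, au_duration):
    """Presentation_end_TS = PTS_out + AU_duration (mod 2**33);
    SubPath_OUT_time stores the upper 32 of the 33 bits."""
    presentation_end_ts = (pts_out + au_duration) % PTS_WRAP
    return presentation_end_ts >> 1
```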
Turning to the syntax of the PlayListMark shown in Fig.42, version_number is four character letters indicating the version number of this PlayListMark(). The version_number must be encoded to "0045" in accordance with ISO 646. Length is an unsigned 32-bit integer indicating the number of bytes of PlayListMark() from directly after the length field to the trailing end of the PlayListMark(). The number_of_PlayList_marks is a 16-bit unsigned integer indicating the number of marks stored in the PlayListMark. The number_of_PlayList_marks may be zero. The mark_type is an 8-bit field indicating the mark type and is encoded in accordance with the table shown in Fig.43. A 32-bit field of mark_time_stamp stores a time stamp indicating the point specified by the mark. The semantics of the mark_time_stamp differ with the CPI_type defined in the PlayList(), as shown in Fig.44. The PlayItem_id is an 8-bit field specifying the PlayItem where the mark is put. The value of PlayItem_id corresponding to a preset PlayItem is defined in the PlayList() (see Fig.25). An 8-bit field of character_set shows the encoding method of the character letters encoded in the mark_name field. The encoding method corresponds to the values shown in Fig.19. The 8-bit field of name_length indicates the byte length of the mark name shown in the mark_name field. The mark_name field denotes the mark name. The number of bytes corresponding to name_length from the left of this field is the effective character letters and denotes the mark name. In the mark_name field, the values next following these effective character letters may be arbitrary. The field of ref_thumbnail_index denotes the information of the thumbnail picture added to the mark. If the field of ref_thumbnail_index is not 0xFFFF, a thumbnail picture is added to its mark, with the thumbnail picture being stored in the mark.thmb file.
This picture is referenced in the mark.thmb file using the value of ref_thumbnail_index, as explained subsequently. If the ref_thumbnail_index field is 0xFFFF, it indicates that no thumbnail picture is added to the mark. The Clip Information file is now explained. The zzzzz.clpi (Clip Information file) is made up of six objects, as shown in Fig.45. These are ClipInfo(), STC_Info(), ProgramInfo(), CPI(), ClipMark() and MakersPrivateData(). For the AV stream (Clip AV stream or Bridge-Clip AV stream) and the corresponding Clip Information file, the same string of numerals "zzzzz" is used. The syntax of zzzzz.clpi (Clip Information file) shown in Fig.45 is now explained. The ClipInfo_Start_address indicates the leading end address of ClipInfo() with the relative number of bytes from the leading end byte of the zzzzz.clpi file as a unit, the relative number of bytes being counted from zero. The STC_Info_Start_address indicates the leading end address of STC_Info() in the same manner. The ProgramInfo_Start_address indicates the leading end address of ProgramInfo() in the same manner. The CPI_Start_address indicates the leading end address of CPI() in the same manner. The ClipMark_Start_address indicates the leading end address of ClipMark() in the same manner. The MakersPrivateData_Start_address indicates the leading end address of MakersPrivateData() with the relative number of bytes from the leading end byte of the zzzzz.clpi file as a unit. In each case, the relative number of bytes is counted from zero.
The padding_word is inserted in accordance with the syntax of the zzzzz.clpi file. N1, N2, N3, N4 and N5 must be zero or optional positive integers. The respective padding words may also assume optional values. The ClipInfo is now explained. Fig.46 shows the syntax of ClipInfo. In the ClipInfo() is stored the attribute information of the corresponding AV stream file (Clip AV stream or Bridge-Clip AV stream file). Turning to the syntax of the ClipInfo shown in Fig.46, version_number is the four character letters indicating the version number of this ClipInfo(). The version_number must be encoded to "0045" in accordance with ISO 646. Length is a 32-bit unsigned integer indicating the number of bytes of ClipInfo() from directly at the back of the length field to the trailing end of the ClipInfo(). An 8-bit field of Clip_stream_type indicates the type of the AV stream corresponding to the Clip Information file, as shown in Fig.47. The stream types of the respective AV streams will be explained subsequently. The 32-bit field of offset_SPN gives an offset value of the source packet number of the first source packet of the AV stream file (Clip AV stream or the Bridge-Clip AV stream). When the AV stream file is first recorded on the disc, this offset_SPN must be zero. Referring to Fig.48, when the beginning portion of the AV stream file is erased by editing, offset_SPN may assume a value other than 0. In the present embodiment, the relative source packet number (relative address) referencing the offset_SPN is frequently described in the form RSPN_xxx, where xxx stands for a specific name, such as RSPN_EP_start. The relative source packet number is sized with the source packet number as a unit and is counted from the first source packet number of the AV stream file with the value of the offset_SPN as the initial value.
The number of source packets from the first source packet of the AV stream file to the source packet referenced by the relative source packet number (SPN_xxx) is calculated by the following equation: SPN_xxx = RSPN_xxx - offset_SPN. Fig.48 shows an instance in which offset_SPN is 4. TS_recording_rate is a 24-bit unsigned integer, which affords an input/output bit rate required for the AV stream to the DVR drive (write unit 22) or from the DVR drive (readout unit 28). The record_time_and_date is a 56-bit field for storing the date and time of recording of the AV stream corresponding to the Clip and is an encoded representation of year/month/day/hour/minute/second in 4-bit binary coded decimal (BCD) for 14 numerical figures. For example, 2001/12/23 01:02:03 is encoded to "0x20011223010203". The duration is a 24-bit field indicating the total playback time of the Clip by hour/minute/second based on arrival time clocks. This field is six numerical figures encoded in 4-bit binary coded decimal (BCD). For example, 01:45:30 is encoded to "0x014530". A flag time_controlled_flag indicates the recording mode of the AV stream file. If this time_controlled_flag is 1, it indicates that the recording mode is such a mode in which the file size is proportionate to the time elapsed since recording, such that the condition shown by the following equation is met: TS_average_rate*192/188*(t - start_time) - α <= size_clip(t) <= TS_average_rate*192/188*(t - start_time) + α, where TS_average_rate is the average bit rate of the transport stream of the AV stream file expressed in bytes/second. In the above equation, t denotes the time in seconds, while start_time is the time point when the first source packet of the AV stream file was recorded. The size_clip(t) is the size of the AV stream file at time t, the unit being 10*192 bytes, and α is a constant dependent on TS_average_rate. If time_controlled_flag is set to 0, it indicates that the recording mode is not the mode of controlling the recording such that the time lapse of recording is proportionate to the file size of the AV stream.
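The time_controlled recording condition above can be expressed as a simple predicate. The following is an illustrative sketch only; the function name is hypothetical, and for simplicity the sizes are taken in bytes rather than in the 10*192-byte unit of size_clip(t).

```python
# Sketch of the time_controlled_flag condition described above:
# the file size must stay within ±alpha of TS_average_rate*192/188*(t - start_time),
# where TS_average_rate is in bytes/second and t, start_time are in seconds.

def is_time_controlled(size_clip_bytes: float, t: float, start_time: float,
                       ts_average_rate: float, alpha: float) -> bool:
    """Check the bound; 192/188 accounts for the 4-byte TP_extra_header per packet."""
    expected = ts_average_rate * 192 / 188 * (t - start_time)
    return expected - alpha <= size_clip_bytes <= expected + alpha
```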
For example, the input transport stream is recorded in a transparent fashion. If time_controlled_flag is set to 1, the 24-bit field of TS_average_rate indicates the value of TS_average_rate used in the above equation. If time_controlled_flag is set to 0, this field has no meaning and must be set to 0. For example, a variable bit rate transport stream is encoded by the following sequence: first, the transport rate is set to the value of TS_recording_rate; the video stream is encoded with a variable bit rate; and the transport packets are encoded intermittently, without employing null packets. The 32-bit field of RSPN_arrival_time_discontinuity is the relative address of a site where arrival time base discontinuities are produced on the Bridge-Clip AV stream file. The RSPN_arrival_time_discontinuity is sized with the source packet number as a unit and is counted, with the value of offset_SPN defined in the ClipInfo() as an initial value, from the first source packet of the Bridge-Clip AV stream file. An absolute address in the Bridge-Clip AV stream file is calculated based on the aforementioned equation: SPN_xxx = RSPN_xxx - offset_SPN. The 144-bit field of reserved_for_system_use is reserved for the system. If the is_format_identifier_valid flag is 1, it indicates that the format_identifier field is valid. If the is_original_network_ID_valid flag is 1, it indicates that the original_network_ID field is valid. If the is_transport_stream_ID_valid flag is 1, it indicates that the transport_stream_ID field is valid. If the is_service_ID_valid flag is 1, it indicates that the service_ID field is valid. If the is_country_code_valid flag is 1, it indicates that the country_code field is valid. The 32-bit field of format_identifier indicates the value of the format_identifier owned by a registration descriptor (defined in ISO/IEC 13818-1) in the transport stream.
The 16-bit field of original_network_ID indicates the value of the original_network_ID defined in the transport stream. The 16-bit field of service_ID denotes the value of the service_ID defined in the transport stream. The 24-bit field of country_code shows a country code defined by ISO 3166. Each character code is encoded by ISO 8859-1. For example, Japan is represented as "JPN" and is encoded to "0x4A 0x50 0x4E". The stream_format_name is 15 character codes of ISO 646 showing the name of the format organization affording the stream definitions of the transport stream. An invalid byte in this field has the value '0xFF'. The format_identifier, original_network_ID, transport_stream_ID, service_ID, country_code and stream_format_name indicate the service provider of the transport stream. This allows recognizing encoding limitations on the audio or video streams and the stream definitions of private data streams other than the audio/video streams or of the SI (service information). This information can be used to check whether the decoder is able to decode the stream. If such decoding is possible, the information may be used to initialize the decoder system before starting the decoding. The STC_Info is now explained. The time domain in the MPEG-2 transport stream not containing STC discontinuous points (discontinuous points of the system time base) is termed the STC_sequence. In the Clip, the STC_sequence is specified by the value of STC_sequence_id. Figs.50A and 50B illustrate a continuous STC domain. The same STC values never appear in the same STC_sequence, although the maximum time length of a Clip is limited, as explained subsequently. Therefore, the same PTS values also never appear in the same STC_sequence. If the AV stream contains N STC discontinuous points, where N > 0, the Clip system time base is split into (N+1) STC_sequences. The STC_Info stores the address of the site where STC discontinuities (system time base discontinuities) are produced.
As explained with reference to Fig.51, RSPN_STC_start indicates that address; the (k+1)st STC_sequence begins at the time point of arrival of the source packet referenced by the (k+1)st RSPN_STC_start and ends at the time point of arrival of the last source packet before the next STC_sequence begins. Fig.52 shows the syntax of the STC_Info. Turning to the syntax of the STC_Info shown in Fig.52, version_number is four character letters indicating the version number of the STC_Info(). The version_number must be encoded to "0045" in accordance with ISO 646. Length is a 32-bit unsigned integer indicating the number of bytes of STC_Info() from directly after this length field to the end of STC_Info(). If the CPI_type of CPI() indicates the TU_map type, 0 may be set in this length field. If the CPI_type of CPI() indicates the EP_map type, the num_of_STC_sequences must be of a value not less than 1. An 8-bit unsigned integer of num_of_STC_sequences indicates the number of STC_sequences in the Clip. This value indicates the number of the for-loops next following the field. The STC_sequence_id corresponding to the preset STC_sequence is defined by the order in which the RSPN_STC_start corresponding to the STC_sequence appears in the for-loop containing the RSPN_STC_start. The STC_sequence_id commences at 0. The 32-bit field of RSPN_STC_start indicates the address at which the STC_sequence commences on the AV stream file. RSPN_STC_start denotes an address where system time base discontinuities are produced in the AV stream file. The RSPN_STC_start may also be the relative address of the source packet having the first PCR of the new system time base in the AV stream. The RSPN_STC_start is of a size based on the source packet number as a unit and is counted from the first source packet of the AV stream file with the offset_SPN value defined in ClipInfo() as an initial value. In this AV stream file, the absolute address is calculated by the above mentioned equation, that is, SPN_xxx = RSPN_xxx - offset_SPN. The ProgramInfo in the syntax of zzzzz.clpi shown in Fig.45 is now explained with reference to Fig.53.
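Under these definitions, the STC_sequence containing a given source packet can be found by comparing its relative address against the ascending list of RSPN_STC_start values. The following is a minimal sketch; the function name is an assumption, and the first RSPN_STC_start is assumed to reference the start of the file.

```python
# Sketch: find the STC_sequence_id for a source packet, given the
# RSPN_STC_start values from STC_Info() in ascending order.
# STC_sequence_id k covers packets from rspn_stc_start[k] (inclusive)
# up to rspn_stc_start[k+1] (exclusive).

from bisect import bisect_right

def stc_sequence_id(rspn: int, rspn_stc_start: list[int]) -> int:
    """Return the STC_sequence_id whose domain contains the packet at rspn."""
    return bisect_right(rspn_stc_start, rspn) - 1
```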
The time domain having the following features in the Clip is termed the program_sequence: the value of PCR_PID is not changed; the number of video elementary streams is not changed; the PID values of the respective video streams are not changed; the encoding information defined by their VideoCodingInfo is not changed; the number of audio elementary streams is not changed; the PID values of the respective audio streams are not changed; and the encoding information defined by their AudioCodingInfo is not changed. The program_sequence has only one system time base at the same time point. The program_sequence has a sole PMT at the same time point. The ProgramInfo() stores the address of the site where the program_sequence commences. RSPN_program_sequence_start indicates that address. Fig.54 illustrates the syntax of ProgramInfo. Turning to the ProgramInfo shown in Fig.54, version_number is four character letters indicating the version number of ProgramInfo(). The version_number must be encoded to "0045" in accordance with ISO 646. Length is a 32-bit unsigned integer indicating the number of bytes of ProgramInfo() from directly at the back of this length field to the end of ProgramInfo(). If the CPI_type of CPI() indicates the TU_map type, this length field may be set to 0. If the CPI_type of CPI() indicates the EP_map type, the number_of_program_sequences must be of a value not less than 1. An 8-bit unsigned integer of number_of_program_sequences denotes the number of program_sequences in the Clip. This value indicates the number of for-loops next following this field. If the program_sequence in the Clip is not changed, 1 must be set in the number_of_program_sequences. A 32-bit field of RSPN_program_sequence_start is the relative address where the program_sequence commences on the AV stream.
RSPN_program_sequence_start is sized with the source packet number as a unit and is counted, with the value of offset_SPN defined in the ClipInfo() as an initial value, from the first source packet of the AV stream file. An absolute address in the AV stream file is calculated by: SPN_xxx = RSPN_xxx - offset_SPN. The values of RSPN_program_sequence_start in the for-loop of the syntax must appear in the rising order. A 16-bit field of PCR_PID denotes the PID of the transport packet containing a PCR field effective for the program_sequence. An 8-bit field of number_of_audios indicates the number of for-loops containing audio_stream_PID and AudioCodingInfo(). A 16-bit field of video_stream_PID indicates the PID of the transport packet containing a video stream effective for its program_sequence. The VideoCodingInfo() next following this field must explain the contents of the video stream referenced by its video_stream_PID. A 16-bit field of audio_stream_PID indicates the PID of the transport packet containing an audio stream effective for its program_sequence. The AudioCodingInfo() next following this field must explain the contents of the audio stream referenced by its audio_stream_PID. The order in which the values of video_stream_PID appear in the for-loop of the syntax must be equal to the sequence of PID encoding of the video streams in the PMT effective for the program_sequence. Additionally, the order in which the values of audio_stream_PID appear in the for-loop of the syntax must be equal to the sequence of PID encoding of the audio streams in the PMT effective for the program_sequence. Fig.55 shows the syntax of VideoCodingInfo in the syntax of the ProgramInfo shown in Fig.54. Turning to the syntax of the VideoCodingInfo shown in Fig.55, an 8-bit field of video_format indicates the video format corresponding to video_stream_PID in ProgramInfo(), as shown in Fig.56. Referring to Fig.57, an 8-bit field of frame_rate indicates the video frame rate corresponding to the video_stream_PID in ProgramInfo().
An 8-bit field of display_aspect_ratio indicates the video display aspect ratio corresponding to video_stream_PID in ProgramInfo(). Fig.59 shows the syntax of AudioCodingInfo in the syntax of ProgramInfo shown in Fig.54. Turning to the syntax of the AudioCodingInfo shown in Fig.59, an 8-bit field of audio_format indicates the audio encoding method corresponding to audio_stream_PID in ProgramInfo(), as shown in Fig.60. An 8-bit field of audio_component_type indicates the audio component type corresponding to audio_stream_PID in ProgramInfo(), as shown in Fig.61, whilst an 8-bit field of sampling_frequency indicates the audio sampling frequency corresponding to audio_stream_PID in ProgramInfo(), as shown in Fig.62. The CPI (Characteristic Point Information) in the syntax of zzzzz.clpi shown in Fig.45 is now explained. The CPI is used for correlating the time information in the AV stream with the address in its file. The CPI is of two types, namely EP_map and TU_map. In Fig.63, if the CPI_type in CPI() is the EP_map type, its CPI() contains the EP_map. In Fig.64, if the CPI_type in CPI() is the TU_map type, its CPI() contains the TU_map. One AV stream has one EP_map or one TU_map. If the AV stream is an SESF transport stream, the corresponding Clip must own an EP_map. Fig.65 shows the syntax of CPI. Turning to the syntax of CPI shown in Fig.65, the version_number is four character letters indicating the version number of this CPI(). The version_number must be encoded to "0045" in accordance with ISO 646. Length is a 32-bit unsigned integer indicating the number of bytes as from directly after this length field to the trailing end of the CPI(). The CPI_type is a 1-bit flag and indicates the CPI type of the Clip, as shown in Fig.66. The EP_map in the CPI syntax shown in Fig.65 is now explained. There are two types of the EP_map, that is, an EP_map for a video stream and an EP_map for an audio stream. The EP_map_type in the EP_map differentiates between these EP_map types.
If the Clip contains one or more video streams, the EP_map for the video stream must be used. If the Clip does not contain a video stream but contains one or more audio streams, the EP_map for the audio stream must be used. The EP_map for a video stream is explained with reference to Fig.67. The EP_map for the video stream has the data stream_PID, PTS_EP_start and RSPN_EP_start. The stream_PID shows the PID of the transport packet transmitting a video stream. The PTS_EP_start indicates the PTS of an access unit beginning from the sequence header of the video stream. The RSPN_EP_start indicates the address of the source packet including the first byte of the access unit referenced by the PTS_EP_start in the AV stream. A sub table, termed EP_map_for_one_stream_PID(), is created for each video stream transmitted by transport packets having the same PID. If plural video streams exist in the Clip, the EP_map may contain plural EP_map_for_one_stream_PID()s. The EP_map for an audio stream has the data stream_PID, PTS_EP_start and RSPN_EP_start. The stream_PID shows the PID of a transport packet transmitting an audio stream. The PTS_EP_start shows the PTS of an access unit in the audio stream. The RSPN_EP_start indicates the address of the source packet containing the first byte of the access unit referenced by the PTS_EP_start in the AV stream. The sub table termed EP_map_for_one_stream_PID() is likewise created for each audio stream transmitted by transport packets having the same PID. If there exist plural audio streams in the Clip, the EP_map may contain plural EP_map_for_one_stream_PID()s. Turning to the relation between the EP_map and the STC_Info, one EP_map_for_one_stream_PID() is created in one table irrespective of discontinuous points in the STC. Comparison of the value of the RSPN_EP_start to the value of RSPN_STC_start defined in STC_Info() reveals the boundary of the data of the EP_map belonging to the respective STC_sequences (see Fig.68).
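An EP_map of this form supports time-to-address conversion: to start playback at a given PTS, a player finds the last entry whose PTS_EP_start does not exceed the target, then converts its RSPN_EP_start to an absolute packet number. A minimal sketch under the stated definitions, assuming the entries are sorted by PTS; the function and parameter names are illustrative.

```python
# Sketch: look up the entry point for a target PTS in one
# EP_map_for_one_stream_PID, then convert the relative address to an
# absolute source packet number via SPN = RSPN_EP_start - offset_SPN.

from bisect import bisect_right

def find_entry_point(target_pts: int, pts_ep_start: list[int],
                     rspn_ep_start: list[int], offset_spn: int) -> int:
    """Return the absolute SPN of the last entry point at or before target_pts."""
    i = bisect_right(pts_ep_start, target_pts) - 1
    if i < 0:
        raise ValueError("target PTS precedes the first entry point")
    return rspn_ep_start[i] - offset_spn
```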
The EP_map must have one EP_map_for_one_stream_PID for a continuous stream range transmitted by the same PID. In the case shown in Fig.69, program#1 and program#3 have the same video PID, but the data range is not continuous, so that an EP_map_for_one_stream_PID must be provided for each program. Fig.70 shows the EP_map syntax. By way of explanation of the EP_map syntax shown in Fig.70, the EP_type is a 4-bit field and shows the EP_map entry point type, as shown in Fig.71. The EP_type shows the semantics of the data fields next following this field. If the Clip includes one or more video streams, the EP_type must be set to 0 ('video'). Alternatively, if the Clip contains no video stream but contains one or more audio streams, then the EP_type must be set to 1 ('audio'). The 16-bit field of number_of_stream_PIDs indicates the number of times of loops of the for-loop having number_of_stream_PIDs in the EP_map() as a variable. The 16-bit field of stream_PID(k) indicates the PID of the transport packet transmitting the number k elementary stream (video or audio stream) referenced by EP_map_for_one_stream_PID(num_EP_entries(k)). If the EP_type is 0 ('video'), its elementary stream must be a video stream. If the EP_type is equal to 1 ('audio'), its elementary stream must be an audio stream. The 16-bit field of num_EP_entries(k) indicates the number of entries of the EP_map_for_one_stream_PID(num_EP_entries(k)) for stream_PID(k). The EP_map_for_one_stream_PID_Start_address(k) is a 32-bit field which indicates the relative address position at which the EP_map_for_one_stream_PID(num_EP_entries(k)) begins in the EP_map(). This value is indicated by the size as from the first byte of the EP_map(). The padding_word must be inserted in accordance with the EP_map() syntax. X and Y must be optional positive integers. The respective padding words may assume any optional values. Fig.72 shows the syntax of EP_map_for_one_stream_PID.
By way of explanation of the syntax of the EP_map_for_one_stream_PID shown in Fig.72, the semantics of the 32-bit field of PTS_EP_start differs with the EP_type defined by EP_map(). If the EP_type is equal to 0 ('video'), this field has the upper 32 bits of the 33-bit precision PTS of the access unit beginning with a sequence header of the video stream.
If the EP_type is equal to 1 ('audio'), this field has the upper 32 bits of the 33-bit precision PTS of the access unit of the audio stream. The semantics of the 32-bit field of RSPN_EP_start also differs with the EP_type defined in EP_map(). If the EP_type is equal to 0 ('video'), this field indicates the relative address of the source packet including the first byte of the sequence header of the access unit referenced by the PTS_EP_start in the AV stream. Alternatively, if the EP_type is equal to 1 ('audio'), this field indicates the relative address of the source packet containing the first byte in the audio stream of the access unit referenced by the PTS_EP_start in the AV stream. RSPN_EP_start is of a size which is based on the source packet number as a unit, and is counted from the first source packet of the AV stream file, with the value of the offset_SPN, defined in ClipInfo(), as an initial value. The absolute address in the AV stream file is calculated by SPN_xxx = RSPN_xxx - offset_SPN. It is noted that the values of the RSPN_EP_start in the syntax must appear in the rising order. The TU_map is now explained with reference to Fig.73. The TU_map forms a time axis based on the source packet arrival time clock (the clock of the arrival time base). This time axis is termed TU_map_time_axis. The point of origin of TU_map_time_axis is indicated by offset_time in the TU_map(). TU_map_time_axis is divided into preset units as from offset_time, this unit being termed the time_unit.
In each time_unit in the AV stream, the address on the AV stream file of the source packet in the first complete form is stored in the TU_map. This address is termed RSPN_time_unit_start. The time at which the k-th (k >= 0) time_unit begins on the TU_map_time_axis is termed TU_start_time(k). This value is calculated based on the following equation: TU_start_time(k) = offset_time + k*time_unit_size. It is noted that TU_start_time(k) has a precision of 45 kHz. Fig.74 shows the syntax of TU_map. By way of explanation of the TU_map syntax shown in Fig.74, the 32-bit field of offset_time gives the offset time relative to TU_map_time_axis. This value indicates the offset time relative to the first time_unit in the Clip. The offset_time is of a size based on the 45 kHz clock, derived from the 27 MHz precision arrival time clocks, as a unit. If the AV stream is to be recorded as a new Clip, the offset_time must be set to 0. The 32-bit field of time_unit_size affords the size of the time_unit, and is based on the 45 kHz clock, derived from the 27 MHz precision arrival time clocks, as a unit. Preferably, the time_unit_size is not longer than one second (time_unit_size <= 45000). The 32-bit field of number_of_time_unit_entries indicates the number of entries of the time_unit stored in the TU_map(). The 32-bit field of RSPN_time_unit_start indicates the relative address of the site in the AV stream at which each time_unit begins. RSPN_time_unit_start is of a size based on the source packet number as a unit and is counted, with the value of offset_SPN defined in ClipInfo() as an initial value, from the first source packet of the AV stream file. The absolute address in the AV stream file is calculated by SPN_xxx = RSPN_xxx - offset_SPN. It is noted that the values of RSPN_time_unit_start in the for-loop of the syntax must appear in the rising order. If there is no source packet in the number (k+1) time_unit, the number (k+1) RSPN_time_unit_start must be equal to the number k RSPN_time_unit_start.
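The TU_map thus maps a time on the arrival time base to an address: given a target time on TU_map_time_axis, the index of the containing time_unit follows directly from TU_start_time(k) = offset_time + k*time_unit_size. A minimal sketch under these definitions; the function names are illustrative.

```python
# Sketch: convert a 45 kHz time on TU_map_time_axis to an absolute
# source packet number using TU_map data.

def tu_start_time(k: int, offset_time: int, time_unit_size: int) -> int:
    """Start time of the k-th time_unit, in 45 kHz units."""
    return offset_time + k * time_unit_size

def spn_for_time(target_time: int, offset_time: int, time_unit_size: int,
                 rspn_time_unit_start: list[int], offset_spn: int) -> int:
    """Absolute SPN of the first complete source packet in the containing time_unit."""
    k = (target_time - offset_time) // time_unit_size
    k = max(0, min(k, len(rspn_time_unit_start) - 1))  # clamp to stored entries
    return rspn_time_unit_start[k] - offset_spn
```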
By way of explanation of the ClipMark in the syntax of zzzzz.clpi shown in Fig.45, the ClipMark is the mark information pertinent to the Clip and is stored in the ClipMark. This mark is not set by the user, but is set by the recorder (recording and/or reproducing apparatus 1). Fig.75 shows the ClipMark syntax. By way of explanation of the ClipMark syntax shown in Fig.75, the version_number is four character letters indicating the version number of this ClipMark. The version_number must be encoded to "0045" in accordance with ISO 646. Length is a 32-bit unsigned integer indicating the number of bytes of the ClipMark() as from directly after the length field to the trailing end of the ClipMark(). The number_of_Clip_marks is a 16-bit unsigned integer indicating the number of marks stored in the ClipMark and may be equal to 0. The mark_type is an 8-bit field indicating the mark type and is encoded in accordance with the table shown in Fig.76. The mark_time_stamp is a 32-bit field and stores the time stamp indicating the point specified by the mark. The semantics of mark_time_stamp differs with the CPI_type in the PlayList(), as shown in Fig.77. As to the 8-bit field of STC_sequence_id, if the CPI_type in CPI() indicates the EP_map type, this field indicates the STC_sequence_id of the continuous STC domain in which mark_time_stamp is placed. If the CPI_type in CPI() indicates the TU_map type, this 8-bit field has no meaning and is set to 0. The 8-bit field of character_set indicates the encoding method of the character letters encoded in the mark_name field. The encoding method corresponds to the values shown in Fig.19. The 8-bit field of name_length indicates the byte length of the mark name shown in the mark_name field. This mark_name field indicates the mark name. The number of bytes corresponding to name_length from the left of this field is the effective character letters and denotes the mark name. In the mark_name field, the values next following these effective character letters may be arbitrary.
The field of ref_thumbnail_index indicates the information of the thumbnail picture appended to the mark. If the ref_thumbnail_index field is of a value different from 0xFFFF, a thumbnail picture is added to its mark, with the thumbnail picture being stored in the mark.thmb file. This picture is referenced using the value of ref_thumbnail_index in the mark.thmb file. If the ref_thumbnail_index field is of a value equal to 0xFFFF, a thumbnail picture is not appended to its mark. The MakersPrivateData has already been explained with reference to Fig.22 and hence is not explained here specifically.
Next, the thumbnail information is explained. A thumbnail picture is stored in a menu.thmb file or in a mark.thmb file. These files are of the same syntax structure and own a sole Thumbnail(). The menu.thmb file stores pictures representing the respective PlayLists. The totality of the menu thumbnails are stored in the sole menu.thmb file. The mark.thmb file stores a mark thumbnail picture, that is, a picture representing a mark point. The totality of the mark thumbnails corresponding to the totality of the PlayLists and Clips are stored in the sole mark.thmb file. Since the thumbnails are frequently added or deleted, the operations of addition and partial deletion must be executable readily and speedily. For this reason, Thumbnail() has a block structure. The picture data is divided into plural portions, each of which is stored in one tn_block. One picture data is stored in consecutive tn_blocks. In the string of tn_blocks, there may exist a tn_block not in use. The byte length of a sole thumbnail picture is variable. Fig.78 shows the syntax of menu.thmb and mark.thmb, and Fig.79 shows the syntax of Thumbnail in the syntax of menu.thmb and mark.thmb shown in Fig.78. By way of explanation of the syntax of Thumbnail shown in Fig.79, version_number is four character letters denoting the version number of this Thumbnail(). The version_number must be encoded to "0045" in accordance with ISO 646. Length is a 32-bit unsigned integer indicating the number of bytes of Thumbnail() as from directly at the back of the length field up to the trailing end of Thumbnail(). The tn_block_start_address is a 32-bit unsigned integer indicating the leading end byte address of the first tn_block, in terms of the relative number of bytes as from the leading end byte of Thumbnail() as a unit. The number of relative bytes is counted from 0. The number_of_thumbnails is a 16-bit unsigned integer which gives the number of entries of the thumbnail pictures contained in Thumbnail().
The tn_block_size is a 16-bit unsigned integer which gives the size of one tn_block, in terms of 1024 bytes as a unit. If, for example, tn_block_size = 1, it indicates that the size of one tn_block is 1024 bytes. The number_of_tn_blocks is a 16-bit unsigned integer indicating the number of entries of tn_blocks in this Thumbnail(). The thumbnail_index is a 16-bit unsigned integer indicating the index number of the thumbnail picture represented by the thumbnail information for one for-loop beginning from the thumbnail_index field. The value 0xFFFF must not be used as the thumbnail_index. This thumbnail_index is referenced by the ref_thumbnail_index in UIAppInfoVolume(), UIAppInfoPlayList(), PlayListMark() and ClipMark(). The thumbnail_picture_format is an 8-bit unsigned integer representing the picture format of the thumbnail picture and assumes a value shown in Fig.80. In the table, DCF and PNG are allowed only in menu.thmb. The mark thumbnail must assume the value "0x00" (MPEG-2 Video I-picture). The picture_data_size is a 32-bit unsigned integer indicating the byte length of the thumbnail picture in terms of bytes as a unit. The start_tn_block_number is a 16-bit unsigned integer indicating the tn_block number of the tn_block where the data of the thumbnail picture begins. The leading end of the thumbnail picture data must coincide with the leading end of the tn_block. The tn_block number begins from 0 and is relevant to the value of the variable k in the for-loop of tn_block. The x_picture_length is a 16-bit unsigned integer indicating the number of pixels in the horizontal direction of a frame of the thumbnail picture. The y_picture_length is a 16-bit unsigned integer indicating the number of pixels in the vertical direction of a frame of the thumbnail picture. The tn_block is an area in which to store the thumbnail picture. All the tn_blocks in the Thumbnail() are of the same size (fixed length), the size being defined by tn_block_size. Figs.81A and 81B schematically show how the thumbnail picture data are stored in the tn_blocks.
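The block bookkeeping described above reduces to simple arithmetic: a picture of picture_data_size bytes occupies consecutive tn_blocks starting at start_tn_block_number. A minimal sketch of that arithmetic, with hypothetical function names:

```python
# Sketch: tn_block arithmetic for the Thumbnail() structure described above.
# tn_block_size is in units of 1024 bytes; a picture occupies consecutive blocks.

def tn_blocks_needed(picture_data_size: int, tn_block_size: int) -> int:
    """Number of fixed-length tn_blocks needed to hold picture_data_size bytes."""
    block_bytes = tn_block_size * 1024
    return -(-picture_data_size // block_bytes)  # ceiling division

def picture_byte_offset(start_tn_block_number: int, tn_block_size: int,
                        tn_block_start_address: int) -> int:
    """Byte offset of a picture's first tn_block from the start of Thumbnail()."""
    return tn_block_start_address + start_tn_block_number * tn_block_size * 1024
```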
If, as shown in Figs.81A and 81B, the thumbnail picture begins at the leading end of a tn_block and is of a size exceeding one tn_block, it is stored using the next following tn_block. By so doing, data with a variable length can be managed as fixed-length data, so that editing operations such as deletion can be coped with by simpler processing. An AV stream file is now explained. The AV stream file is stored in the "M2TS" directory (Fig.14). There are two types of the AV stream file, namely the Clip AV stream file and the Bridge-Clip AV stream file. Both AV streams must be of the structure of the DVR MPEG-2 transport stream file as hereinafter defined. First, the DVR MPEG-2 transport stream is explained. The structure of the DVR MPEG-2 transport stream is shown in Fig.82. The AV stream file has the structure of a DVR MPEG-2 transport stream. The DVR MPEG-2 transport stream is made up of an integer number of Aligned units. The size of the Aligned unit is 6144 bytes (2048*3 bytes). The Aligned unit begins from the first byte of the source packet. The source packet is 192 bytes long. One source packet is comprised of TP_extra_header and a transport packet. TP_extra_header is 4 bytes long, with the transport packet being 188 bytes long. One Aligned unit is made up of 32 source packets. The last Aligned unit in the DVR MPEG-2 transport stream is also made up of 32 source packets. Therefore, the DVR MPEG-2 transport stream ends at a boundary of an Aligned unit. If the number of the transport packets of the input transport stream recorded on a disc is not a multiple of 32, source packets having a null packet (transport packet of PID = 0x1FFF) must be used to fill the last Aligned unit. The file system must not use excess information in the DVR MPEG-2 transport stream. Fig.83 shows a recorder model of the DVR MPEG-2 transport stream. The recorder shown in Fig.83 is a conceptual model for prescribing the recording process. The DVR MPEG-2 transport stream obeys this model. 
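The Aligned-unit arithmetic above can be sketched as follows. The function name is illustrative; it computes how many null-packet source packets are needed to pad the recorded stream out to an Aligned unit boundary:

```python
NULL_PID = 0x1FFF          # PID of the null transport packet
SOURCE_PACKET_SIZE = 192   # 4-byte TP_extra_header + 188-byte transport packet
ALIGNED_UNIT_PACKETS = 32  # 32 source packets * 192 bytes = 6144 bytes


def null_packets_needed(n_transport_packets):
    """Number of null-packet source packets required so that the stream
    ends on an Aligned unit boundary (a multiple of 32 source packets)."""
    remainder = n_transport_packets % ALIGNED_UNIT_PACKETS
    return 0 if remainder == 0 else ALIGNED_UNIT_PACKETS - remainder
```

For instance, a recording of 100 transport packets needs 28 null-packet source packets so that its last Aligned unit also contains 32 source packets.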
The input timing of the MPEG-2 transport stream is now explained. The input MPEG-2 transport stream is a full transport stream or a partial transport stream. The input MPEG-2 transport stream must obey ISO/IEC 13818-1 or ISO/IEC 13818-9. The number-i byte of the MPEG-2 transport stream is input simultaneously at time t(i) to the T-STD (transport stream system target decoder provided for in ISO/IEC 13818-1) and to the source packetizer. Rpk is an instantaneous maximum value of the input rate of the transport packet. A 27 MHz PLL 52 generates a 27 MHz clock frequency. The 27 MHz clock frequency is locked to the value of the program clock reference (PCR) of the MPEG-2 transport stream. An arrival time clock counter 53 counts the pulses of the 27 MHz frequency. Arrival_time_clock(i) is the count value of the arrival time clock counter at time t(i). A source packetizer 54 appends TP_extra_header to the totality of the transport packets to create source packets. Arrival_time_stamp indicates the time when the first byte of the transport packet reaches both the T-STD and the source packetizer. Arrival_time_stamp(k) is a sampled value of arrival_time_clock(k) as represented by the following equation: arrival_time_stamp(k) = arrival_time_clock(k) % 2^30, where k denotes the first byte of the transport packet. If the time separation between two neighboring transport packets is 2^30/27000000 sec (about 40 sec) or longer, the difference of the arrival_time_stamp of the two transport packets should be set to 2^30/27000000 sec. The recorder is provided for such case. A smoothing buffer 55 smoothes the bitrate of the input transport stream. The smoothing buffer must not overflow. Rmax is the output bitrate of the source packet from the smoothing buffer when the smoothing buffer is not null. If the smoothing buffer is null, the output bitrate of the smoothing buffer is 0. Next, the parameters of the recorder model of the DVR MPEG-2 transport stream are explained. 
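The sampling rule for arrival_time_stamp can be written directly from the equation above (a minimal sketch; the 27 MHz clock value and the wrap period follow from the text):

```python
WRAP = 1 << 30          # arrival_time_stamp keeps the low 30 bits of the clock
CLOCK_HZ = 27_000_000   # 27 MHz arrival time clock


def arrival_time_stamp(arrival_time_clock):
    """arrival_time_stamp(k) = arrival_time_clock(k) % 2^30."""
    return arrival_time_clock % WRAP


# The stamp wraps around every 2^30 / 27000000 s, i.e. just under 40 s,
# which is why packet separations of that length or more are clamped.
wrap_period_sec = WRAP / CLOCK_HZ
```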
The value of Rmax is given by TS_recording_rate as defined in ClipInfo() associated with the AV stream file. This value may be calculated from the following equation: Rmax = TS_recording_rate * 192/188, where the value of TS_recording_rate is of a size in bytes/second. If the input transport stream is an SESF transport stream, Rpk must be equal to TS_recording_rate as defined in ClipInfo() associated with the AV stream file. If the input transport stream is not an SESF transport stream, reference may be made, for this value, to values defined e.g. in a descriptor of the MPEG-2 transport stream, such as maximum_bitrate_descriptor or partial_transport_stream_descriptor. If the input transport stream is an SESF transport stream, the smoothing buffer size is 0. If the input transport stream is not an SESF transport stream, reference may be made to values defined in the descriptors of the MPEG-2 transport stream, such as, for example, the values defined in the smoothing_buffer_descriptor, short_smoothing_buffer_descriptor or in the partial_transport_stream_descriptor. For the recorder and the player (reproducing apparatus), a buffer of sufficient size needs to be provided. The default buffer size is 1536 bytes. Next, a player model of the DVR MPEG-2 transport stream is explained. Fig.84 shows a player model of the DVR MPEG-2 transport stream. This is a conceptual model for prescribing the reproduction process. The DVR MPEG-2 transport stream obeys this model. A 27 MHz X-tal 61 generates the frequency of 27 MHz. The error range of the 27 MHz frequency must be ±30 ppm (27000000 ± 810 Hz). The arrival time clock counter 62 is a binary counter for counting the pulses of the frequency of 27 MHz. Arrival_time_clock(i) is the count value of the arrival time clock counter at time t(i). In the smoothing buffer 64, Rmax is the input bitrate of the source packet to the smoothing buffer when the smoothing buffer is not full. If the smoothing buffer is full, the input bitrate to the smoothing buffer is 0. 
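The Rmax relation can be checked numerically (a sketch; the function name is illustrative): multiplying the 188-byte transport packet rate by 192/188 accounts for the 4-byte TP_extra_header added to each packet.

```python
def rmax_from_ts_recording_rate(ts_recording_rate):
    """Rmax = TS_recording_rate * 192/188: scales the 188-byte transport
    packet rate (in bytes/second) up to the 192-byte source packet rate."""
    return ts_recording_rate * 192 / 188
```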
By way of explaining the output timing of the MPEG-2 transport stream, if the arrival_time_stamp of the current source packet is equal to the 30 bits on the LSB side of arrival_time_clock(i), the transport packet of the source packet is removed from the smoothing buffer. Rpk is an instantaneous maximum value of the transport packet rate. Overflow of the smoothing buffer is not allowed. The parameters of the player model of the DVR MPEG-2 transport stream are the same as those of the recorder model of the DVR MPEG-2 transport stream described above. Fig.85 shows the syntax of the source packet. Transport_packet() is an MPEG-2 transport packet provided for in ISO/IEC 13818-1. The syntax of TP_extra_header in the syntax of the source packet shown in Fig.85 is shown in Fig.86. By way of explaining the syntax of TP_extra_header shown in Fig.86, copy_permission_indicator is an integer representing the copying limitation of the payload of the transport packet. The copying limitation may be copy free, no more copy, copy once or copying prohibited. Fig.87 shows the relation between the value of copy_permission_indicator and the mode it designates. Copy_permission_indicator is appended to the totality of transport packets. If the input transport stream is recorded using the IEEE 1394 digital interface, the value of copy_permission_indicator may be associated with the value of the EMI (encryption mode indicator). If the input transport stream is recorded without employing the IEEE 1394 digital interface, the value of copy_permission_indicator may be associated with the value of the CCI embedded in the transport packet. If an analog signal input is self-encoded, the value of copy_permission_indicator may be associated with the value of CGMS-A of the analog signal. Arrival_time_stamp is an integer having a value as specified by arrival_time_stamp in the following equation: arrival_time_stamp(k) = arrival_time_clock(k) % 2^30. 
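Since TP_extra_header is 4 bytes carrying the copy_permission_indicator and the 30-bit arrival_time_stamp, it can be packed as sketched below. The exact bit positions and the indicator-to-mode mapping are given in Figs.86 and 87 rather than in the text, so the layout here (a 2-bit indicator above the 30-bit stamp) is an assumption:

```python
import struct


def pack_tp_extra_header(copy_permission_indicator, arrival_time_stamp):
    """Pack a 4-byte TP_extra_header.

    Assumed layout: 2-bit copy_permission_indicator occupying the top bits
    of a big-endian 32-bit word, above the 30-bit arrival_time_stamp.
    """
    assert 0 <= copy_permission_indicator < 4       # four copy modes (Fig.87)
    assert 0 <= arrival_time_stamp < (1 << 30)      # 30-bit stamp
    word = (copy_permission_indicator << 30) | arrival_time_stamp
    return struct.pack(">I", word)
```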
By way of defining the Clip AV stream, the Clip AV stream must have a structure of the DVR MPEG-2 transport stream defined as described above. Arrival_time_clock(i) must increase continuously in the Clip AV stream. Even if there exists a discontinuous point of the system time base (STC base) in the Clip AV stream, arrival_time_clock(i) of the Clip AV stream must increase continuously. The maximum value of the difference of the arrival_time_clock(i) between the beginning and the end of the Clip AV stream must be 26 hours. This limitation guarantees that, if there is no discontinuous point of the system time base (STC base) in the MPEG-2 transport stream, a PTS (presentation time stamp) of the same value never appears in the Clip AV stream. The MPEG-2 system standard provides that the PTS has a wraparound period of 2^33/90000 sec (about 26.5 hours). By way of defining the Bridge-Clip AV stream, the Bridge-Clip AV stream must have a structure of the DVR MPEG-2 transport stream defined as described above. The Bridge-Clip AV stream must include one discontinuous point of the arrival time base. The transport streams ahead of and at back of the discontinuous point of the arrival time base must obey the encoding limitations and the DVR-STD as later explained. The present embodiment supports video-audio seamless connection between the PlayItems being edited. Seamless connection between PlayItems guarantees "continuous data supply" to the player/decoder and "seamless decoding processing". The "continuous data supply" is the capability of guaranteeing data supply to the decoder at a bitrate necessary to prevent buffer underflow. In order to enable data to be read out from the disc while the real-time properties of the data are assured, data is to be stored in terms of a continuous block of a sufficiently large size as a unit. 
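The 26-hour limit can be checked against the PTS wraparound arithmetic cited above (90 kHz PTS clock, 33-bit counter):

```python
PTS_CLOCK_HZ = 90_000  # the PTS is counted in 90 kHz ticks

# A 33-bit PTS wraps every 2^33 / 90000 seconds, about 26.5 hours, so a
# Clip whose arrival time clock spans at most 26 hours can never repeat a
# PTS value as long as its STC has no discontinuous point.
pts_wrap_period_hours = (1 << 33) / PTS_CLOCK_HZ / 3600
```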
The "seamless decoding processing" means the capability of the player of displaying the audio-video data recorded on the disc without producing a pause or gap in the playback output of the decoder. The AV stream referenced by the seamlessly connected PlayItems is now explained. Whether or not seamless display of the previous PlayItem and the current PlayItem is guaranteed may be verified from the connection_condition field defined in the current PlayItem. There are two methods for seamless connection of PlayItems, that is, a method employing Bridge-Clip and a method not employing Bridge-Clip. Fig.88 shows the relation between the previous PlayItem and the current PlayItem in case of employing Bridge-Clip. In Fig.88, the stream data read out by the player is shown shaded. In Fig.88, TS1 is made up of shaded stream data of Clip1 (Clip AV stream) and shaded stream data of Bridge-Clip previous to RSPN_arrival_time_discontinuity. The shaded stream data of Clip1 of TS1 is stream data from the address of the stream required for decoding the presentation unit corresponding to IN_time of the previous PlayItem (shown as IN_time1 in Fig.88) up to the source packet referenced by RSPN_exit_from_previous_Clip. The shaded stream data prior to RSPN_arrival_time_discontinuity of Bridge-Clip contained in TS1 is stream data as from the first source packet of Bridge-Clip up to the source packet directly previous to the source packet referenced by RSPN_arrival_time_discontinuity. In Fig.88, TS2 is made up of shaded stream data of Clip2 (Clip AV stream) and shaded stream data of Bridge-Clip subsequent to RSPN_arrival_time_discontinuity. The shaded stream data as from RSPN_arrival_time_discontinuity of Bridge-Clip contained in TS2 is stream data from the source packet referenced by RSPN_arrival_time_discontinuity to the last source packet of Bridge-Clip. 
The shaded stream data of Clip2 of TS2 is stream data from the source packet referenced by RSPN_enter_to_current_Clip to the address of the stream required for decoding the presentation unit corresponding to OUT_time of the current PlayItem (shown as OUT_time2 in Fig.88). Fig.89 shows the relation between the previous PlayItem and the current PlayItem in case of not employing Bridge-Clip. In this case, the stream data read out by the player is shown shaded. In Fig.89, TS1 is made up of shaded stream data of Clip1 (Clip AV stream). The shaded stream data of Clip1 of TS1 is data beginning at the address of the stream necessary for decoding the presentation unit corresponding to IN_time of the previous PlayItem (shown as IN_time1 in Fig.89) as far as the last source packet of Clip1. In Fig.89, TS2 is shaded stream data of Clip2 (Clip AV stream). The shaded stream data of Clip2 of TS2 is stream data beginning at the first source packet of Clip2 as far as the address of the stream necessary for decoding the presentation unit corresponding to OUT_time of the current PlayItem (shown as OUT_time2 in Fig.89). In Figs.88 and 89, TS1 and TS2 are continuous streams of source packets. Next, the stream provisions of TS1 and TS2 and the connection conditions therebetween are scrutinized. First, the encoding limitations for seamless connection are scrutinized. By way of limitations on the encoding structure of a transport stream, the number of programs contained in each of TS1 and TS2 must be 1. The number of video streams contained in each of TS1 and TS2 must be 1. The number of audio streams contained in each of TS1 and TS2 must be 2 or less. The numbers of the audio streams contained in TS1 and TS2 must be equal to each other. It is also possible for elementary streams or private streams other than those depicted above to be contained in TS1 and/or TS2. The limitations on the video bitstream are now explained. Fig.90 shows a typical seamless connection indicated by a picture display sequence. 
In order for a video stream to be displayed seamlessly in the vicinity of a junction point, unneeded pictures displayed at back of OUT_time1 (OUT_time of Clip1) and ahead of IN_time2 (IN_time of Clip2) must be removed by a process of re-encoding the partial stream of the Clip in the vicinity of the junction point. Fig.91 shows an embodiment of realizing seamless connection using BridgeSequence. The video stream of Bridge-Clip previous to RSPN_arrival_time_discontinuity is comprised of an encoded video stream up to the picture corresponding to OUT_time1 of Clip1 of Fig.90. This video stream is connected to the video stream of the previous Clip1 and is re-encoded to form one elementary stream conforming to the MPEG-2 standard. The video stream of Bridge-Clip subsequent to RSPN_arrival_time_discontinuity is made up of an encoded video stream subsequent to the picture corresponding to IN_time2 of Clip2 of Fig.90. The decoding of this video stream can be started correctly, and this video stream is connected to the next following Clip2 video stream. Re-encoding is made such that a sole continuous elementary stream conforming to the MPEG-2 standard will be formed. For creating the Bridge-Clip, several pictures in general need to be re-encoded, whilst the other pictures can be copied from the original Clips.
Fig.92 shows an embodiment of realizing seamless connection without employing BridgeSequence in the embodiment shown in Fig.90. The Clip1 video stream is comprised of an encoded video stream as far as the picture corresponding to OUT_time1 of Fig.90 and is re-encoded so as to give an elementary stream conforming to the MPEG-2 standard. In similar manner, the video stream of Clip2 is made up of encoded bitstreams subsequent to the picture associated with IN_time2 of Clip2 of Fig.90. These encoded bitstreams are already re-encoded to give a sole continuous elementary stream conforming to the MPEG-2 standard. By way of explaining the encoding limitations of the video stream, the frame rates of the video streams of TS1 and TS2 must be equal to each other. The video stream of TS1 must be terminated at sequence_end_code. The video stream of TS2 must commence at a sequence header, a GOP header and an I-picture. The video stream of TS2 must commence at a closed GOP. The video presentation units defined in a bitstream (frames or fields) must be continuous with a junction point in-between. No gap of the fields or frames is allowed to exist at junction points. In case of encoding employing 3-2 pulldown, it may be necessary to rewrite the "top_field_first" and "repeat_first_field" flags. Alternatively, local re-encoding may be made to prevent field gaps from being produced. By way of explaining the encoding limitations on the audio bitstream, the audio sampling frequency of TS1 and that of TS2 must be equal to each other. The audio encoding method of TS1 and that of TS2 (for example, MPEG-1 Layer 2, AC-3, SESF LPCM and AAC) must be equal to each other. By way of explaining the encoding limitations on the MPEG-2 transport stream, the last audio frame of the audio stream of TS1 must contain audio samples having a display timing equal to the display end time of the last display picture of TS1. 
The first audio frame of the audio stream of TS2 must contain an audio sample having a display timing equal to the display start timing of the first display picture of TS2. At a junction point, no gap may be allowed to exist in the sequence of the audio presentation units. As shown in Fig.93, there may be an overlap defined by the length of the audio presentation units of less than two audio frames. The first packet transmitting an elementary stream of TS2 must be a video packet. The transport stream at the junction point must obey the DVR-STD which will be explained subsequently. By way of explaining the limitations on the Clip and Bridge-Clip, no discontinuities in the arrival time base are allowed to exist in TS1 or in TS2. The following limitations are applied only to the case of employing the Bridge-Clip. The Bridge-Clip AV stream has a sole discontinuous point in the arrival time base only at the junction point of the last source packet of TS1 and the first source packet of TS2. RSPN_arrival_time_discontinuity defined in ClipInfo() represents the address of the discontinuous point, which must represent the address referencing the first source packet of TS2. The source packet referenced by RSPN_exit_from_previous_Clip defined in BridgeSequenceInfo() may be any source packet in Clip1. It is unnecessary for this source packet to be at a boundary of an Aligned unit. The source packet referenced by RSPN_enter_to_current_Clip defined in BridgeSequenceInfo() may be any source packet in Clip2. It is unnecessary for this source packet to be at a boundary of an Aligned unit. By way of explaining the limitations on PlayItem, the OUT_time of the previous PlayItem (OUT_time1 shown in Fig.89) must represent the display end time of the last video presentation unit of TS1. The IN_time of the current PlayItem (IN_time2 shown in Figs.88 and 89) must represent the display start time of the first video presentation unit of TS2. 
By way of explaining the limitations on data allocation in case of employing Bridge-Clip, by referring to Fig.94, seamless connection must be made so as to guarantee continuous data supply by the file system. This must be realized by arranging the Bridge-Clip AV stream, connecting to Clip1 (Clip AV stream file) and Clip2 (Clip AV stream file), such as to satisfy the data allocation prescriptions. RSPN_exit_from_previous_Clip must be selected so that the stream portion of Clip1 (Clip AV stream file) previous to RSPN_exit_from_previous_Clip will be arranged in a continuous area not less than one half fragment. The data length of the Bridge-Clip AV stream must be selected so that the data will be arranged in a continuous area not less than one half fragment. RSPN_enter_to_current_Clip must be selected so that the stream portion of Clip2 (Clip AV stream file) subsequent to RSPN_enter_to_current_Clip will be arranged in a continuous area not less than one half fragment. By way of explaining the data allocation limitations in case of seamless connection not employing Bridge-Clip, by referring to Fig.95, seamless connection must be made so as to guarantee continuous data supply by the file system. This must be realized by arranging the last portion of Clip1 (Clip AV stream file) and the first portion of Clip2 (Clip AV stream file) so that the provisions on data allocation will be met. The last stream portion of Clip1 (Clip AV stream file) must be arranged in a continuous area not less than one half fragment. The first stream portion of Clip2 (Clip AV stream file) must be arranged in a continuous area not less than one half fragment. Next, the DVR-STD is explained. The DVR-STD is a conceptual model for modeling the decoding processing in the generation and verification of the DVR MPEG-2 transport stream. 
The DVR-STD is also a conceptual model for modeling the decoding processing in the generation and verification of the AV streams referenced by two PlayItems seamlessly connected to each other as described above. Fig.96 shows the DVR-STD model. The model shown in Fig.96 includes, as a constituent element, the DVR MPEG-2 transport stream player model. The notation of n, TBn, MBn, EBn, TBsys, Bsys, Rxn, Rbxn, Rxsys, Dn, Dsys, On and Pn(k) is the same as that defined in the T-STD of ISO/IEC 13818-1, wherein n is the index number of an elementary stream and TBn is the transport buffer of the elementary stream n.
MBn is the multiplexing buffer of the elementary stream n and exists only for the video stream. EBn is the elementary stream buffer of the elementary stream n and is present only for the video stream. TBsys is the main buffer in the system target decoder for the system information of the program being decoded. Rxn is the transmission rate with which data is removed from TBn. Rbxn is the transmission rate with which the PES packet payload is removed from MBn and is present only for the video stream. Rxsys is the transmission rate with which data is removed from TBsys. Dn is the decoder of the elementary stream n. Dsys is the decoder pertinent to the system information of the program being decoded. On is the re-ordering buffer of the video stream n. Pn(k) is the number-k presentation unit of the elementary stream n. The decoding process for the DVR-STD is explained. During the time a sole DVR MPEG-2 transport stream is being reproduced, the timing of inputting the transport packets to TB1, TBn or TBsys is determined by the arrival_time_stamp of the source packet. The prescriptions for the buffering operation of TB1, MB1, EB1, TBn, Bn, TBsys and Bsys are the same as those of the T-STD provided for in ISO/IEC 13818-1, while the prescriptions for the decoding and display operations are also the same as those of the T-STD provided for in ISO/IEC 13818-1. The decoding process during the time the seamlessly connected PlayLists are being reproduced is now explained. Here, the reproduction of the two AV streams referenced by the seamlessly connected PlayItems is explained. In the following explanation, the reproduction of TS1 and TS2, shown for example in Fig.88, is explained. TS1 and TS2 are a previous stream and a current stream, respectively. Fig.97 shows a timing chart for the inputting, decoding and display of transport packets when transferring from a given AV stream (TS1) to the next AV stream seamlessly connected thereto (TS2). 
During the transfer from a given AV stream (TS1) to the next AV stream seamlessly connected thereto (TS2), the time axis of the arrival time base of TS2 (indicated by ATC2 in Fig.97) is not the same as the time axis of the arrival time base of TS1 (indicated by ATC1 in Fig.97). Moreover, the time axis of the system time base of TS2 (indicated by STC2 in Fig.97) is not the same as the time axis of the system time base of TS1 (indicated by STC1 in Fig.97). The video display is required to be continuous seamlessly; however, there may be overlap in the display time of the presentation units. The input timing to the DVR-STD is explained. During the time until time T1, that is, until the inputting of the last video packet of TS1 to the TB1 of the DVR-STD, the input timing to the buffers TB1, TBn or TBsys of the DVR-STD is determined by the arrival_time_stamp of the arrival time base of TS1. The remaining packets of TS1 must be input to the buffers TBn or TBsys of the DVR-STD at a bitrate of TS_recording_rate (TS1). TS_recording_rate (TS1) is the value of TS_recording_rate defined in ClipInfo() corresponding to Clip1. The time the last byte of TS1 is input to the buffer is time T2. So, during the time between time T1 and time T2, the arrival_time_stamp of the source packet is disregarded. If N1 is the number of bytes of the transport packets of TS1 next following the last video packet of TS1, the time DT1 from time T1 until time T2 is the time necessary for the N1 bytes to be input completely at a bitrate of TS_recording_rate (TS1), and is calculated in accordance with the following equation: DT1 = T2 - T1 = N1/TS_recording_rate (TS1). During the time from time T1 until time T2, both the values of Rxn and Rxsys are changed to the value of TS_recording_rate (TS1). Except for this rule, the buffering operation is the same as that of the T-STD. At time T2, the arrival time clock counter is reset to the value of the arrival_time_stamp of the first source packet of TS2. 
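The DT1 relation above can be sketched as (illustrative function name):

```python
def dt1_seconds(n1_bytes, ts_recording_rate_bytes_per_sec):
    """DT1 = T2 - T1 = N1 / TS_recording_rate(TS1): the time needed to
    input the N1 bytes following the last video packet of TS1 at the
    Clip1 recording rate, in bytes/second."""
    return n1_bytes / ts_recording_rate_bytes_per_sec
```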
The input timing to the buffers TB1, TBn or TBsys of the DVR-STD is determined by the arrival_time_stamp of the source packet of TS2. Both Rxn and Rxsys are changed to the values defined in the T-STD. By way of explaining the additional audio buffering and system data buffering, the audio decoder and the system decoder need to have an additional buffering amount (a data amount equivalent to one second) in addition to the buffer amount defined in the T-STD in order to allow the input data of the domain from time T1 to time T2. By way of explaining the video presentation timing, the display of the video presentation units must be continuous, that is, devoid of gaps, through the junction point. It is noted that STC1 is the time axis of the system time base of TS1 (indicated as STC1 in Fig.97), while STC2 is the time axis of the system time base of TS2 (shown at STC2 in Fig.97; correctly, STC2 begins at the time the first PCR of TS2 has been input to the T-STD).
The offset between STC1 and STC2 is determined as follows: if PTS1end is the PTS on STC1 corresponding to the last video presentation unit of TS1, PTS2start is the PTS on STC2 corresponding to the first video presentation unit of TS2, and Tpp is the display time period of the last video presentation unit of TS1, the offset STC_delta between the two system time bases is calculated in accordance with the following equation: STC_delta = PTS1end + Tpp - PTS2start. By way of explanation of the audio presentation timing, there may be overlap in the display timing of the audio presentation units, with the overlap being from 0 to less than 2 audio frames (see "audio overlap" shown in Fig.97). The indication as to which of the audio samples is to be selected, and the re-synchronization of the display of the audio presentation units to the corrected time base at back of the junction point, are set on the player. By way of explaining the system time clock of the DVR-STD, the last audio presentation unit of TS1 is displayed at time T5. The system time clock may be overlapped between time T2 and time T5. During this time domain, the DVR-STD switches the system time clock between the value of the old time base (STC1) and the value of the new time base (STC2). The value of STC2 may be calculated in accordance with the following equation: STC2 = STC1 - STC_delta. The buffering continuity is now explained. STC11video_end is the value of the STC on the system time base STC1 when the last byte of the last video packet of TS1 reaches TB1 of the DVR-STD. STC22video_start is the value of the STC on the system time base STC2 when the first byte of the first video packet of TS2 reaches TB1 of the DVR-STD. STC21video_end is the value of STC11video_end calculated as a value on the system time base STC2, in accordance with the following equation: STC21video_end = STC11video_end - STC_delta. 
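The offset and the time-base mapping above can be expressed as follows (a sketch; all values are in 90 kHz ticks and the function names are illustrative):

```python
def stc_delta(pts1_end, tpp, pts2_start):
    """STC_delta = PTS1end + Tpp - PTS2start: the offset between the two
    system time bases across the junction point."""
    return pts1_end + tpp - pts2_start


def on_stc2(stc1_value, delta):
    """Map a value on time base STC1 onto STC2: STC2 = STC1 - STC_delta.
    The same mapping gives STC21video_end from STC11video_end."""
    return stc1_value - delta
```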
In order to obey the DVR-STD, the following two conditions must be met. First, the arrival timing of the first video packet of TS2 at TB1 must satisfy the following inequality: STC22video_start > STC21video_end + ΔT1. If it is necessary to re-encode and/or re-multiplex the partial stream of Clip1 and/or Clip2 in such a manner that the above inequality will be satisfied, this re-encoding or re-multiplexing is performed as appropriate. Second, the inputting of the video packets from TS1 followed by the inputting of the video packets from TS2 on the time axis of the system time base, mapped from STC1 and STC2 onto the same time axis, must not overflow or underflow the video buffer. If the above syntax, data structures and rules are used as a basis, the contents of the data recorded on the recording medium or the reproduction information can be managed properly, so as to enable the user to confirm the contents of the data recorded on the recording medium at the time of reproduction and to reproduce desired data extremely readily. In the above-described embodiment, the MPEG-2 transport stream is taken as an example of the multiplexed stream. This, however, is merely exemplary, such that the MPEG-2 program stream or the DSS transport stream used in the DirecTV Service (trademark) of the USA may also be used as the multiplexed stream. Fig.98 shows a modification of the PlayList file. The marked difference between the syntax of Fig.98 and that of Fig.99 is the place where UIAppInfoPlayList() is stored. In the embodiment of Fig.98, in which UIAppInfoPlayList() is outside the PlayList(), the future information expansion of UIAppInfoPlayList() may be achieved rather easily. Version_number is four figures indicating the version number of this PlayList file. PlayList_start_address indicates the leading address of PlayList() in terms of the number of relative bytes from the leading end of the PlayList file as a unit. The number of relative bytes is counted from 0. 
PlayListMark_start_address indicates the leading address of PlayListMark() in terms of the number of relative bytes from the leading byte of the PlayList file as a unit. The number of relative bytes is counted from 0. MakersPrivateData_start_address denotes the leading address of MakersPrivateData() in terms of the number of relative bytes from the leading byte of the PlayList file as a unit. The number of relative bytes is counted from 0.
Fig.99 shows the syntax of UIAppInfoPlayList() in the PlayList file of Fig.98. PlayList_service_type indicates the type of the PlayList file, an example of which is shown in Fig.26. The PlayList_service_type may have the same meaning as that of the service type shown in digital TV broadcasts. For example, in the BS broadcast in Japan, there are three service types, namely the TV service, the audio service and the data broadcast service. The value representing the service type of the program contained in the Clip AV stream used by the PlayList is set in PlayList_service_type. PlayList_character_set denotes the encoding method for the character letters encoded in the channel_name, PlayList_name and PlayList_detail fields, while also denoting the encoding method for the character letters encoded in the mark_name field in PlayListMark(). Channel_number denotes the broadcast channel number or service number as selected by the user when recording the PlayList. If plural PlayLists are combined into one PlayList, channel_number denotes a representative value of the PlayList. If this field is set to 0xFFFF, the field has no meaning. Channel_name_length denotes the byte length of the channel name indicated in the channel_name field. This field has a value not larger than 20. Channel_name denotes the name of the service or the broadcast channel as selected by the user when the PlayList is recorded. The number of bytes indicated by channel_name_length, from the left of this field, represents the effective character letters and denotes the above-mentioned name. The remaining bytes next following these effective character letters may be set to any arbitrary value. If plural PlayLists are combined into one PlayList, this field indicates the name representative of the PlayList. PlayList_name_length denotes the byte length of the PlayList name indicated in the PlayList_name field. PlayList_name shows the name of the PlayList. 
The number of bytes indicated by PlayList_name_length, from the left of this field, represents the effective character letters and denotes the above-mentioned name. The remaining bytes in this field next following these effective character letters may be set to any optional value. PlayList_detail_length denotes the byte length of the detailed information of the PlayList indicated in the PlayList_detail field. This field has a value not larger than 1200. PlayList_detail denotes the text illustrating the detailed information of the PlayList. The number of bytes indicated by PlayList_detail_length, from the left of this field, is the effective character letters. The remaining bytes in this field next following these effective character letters may be set to any optional value. The meaning of this syntax field is otherwise the same as that of the field of the same name shown in Fig.27. Fig.100 shows the syntax of PlayList() in the PlayList file of Fig.98. This syntax is basically the same as the embodiment of Fig.25, except that the present syntax lacks UIAppInfoPlayList().
Fig.101 shows a modification of the syntax of SubPlayItem. The present syntax differs from the embodiment of Fig.40 in that STC_sequence_id has been added. STC_sequence_id indicates the STC_sequence_id of the STC referenced by SubPath_IN_time and SubPath_OUT_time, which are used for identifying the reproduction domain on the AV stream file corresponding to Clip_Information_file_name. SubPath_IN_time and SubPath_OUT_time denote times on the same STC continuous domain specified by STC_sequence_id. By adding STC_sequence_id to the SubPlayItem, the AV stream file referenced by the SubPlayItem is allowed to have an STC discontinuous point. The syntax fields otherwise have the same meaning as the fields of the same name shown in Fig.40. Fig.102 shows the flowchart illustrating the method for forming a Real PlayList. Reference is had to the block diagram of the recording and/or reproducing apparatus shown in Fig.1. At step S11, the controller 23 records a Clip AV stream. At step S12, the controller 23 checks whether or not the EP_map of the AV stream can be prepared. If the result of the check at step S12 is YES, the controller 23 proceeds to step S13 to form the EP_map. If otherwise, the controller 23 proceeds to step S14 to form the TU_map. At step S15, the controller 23 then sets the CPI_type of the PlayList. At step S16, the controller 23 forms the PlayList() comprised of a PlayItem covering the possible reproduction range of the Clip in its entirety. If the CPI_type is EP_map type, the time information is set on the PTS basis. If there is an STC discontinuous point in the Clip, and the PlayList() is made up of two or more PlayItems, the connection_condition between PlayItems is also determined. If the CPI_type is TU_map type, the time information is set on the arrival time basis. At step S17, the controller 23 forms UIAppInfoPlayList(). At step S18, the controller 23 forms PlayListMark. At step S19, the controller 23 forms MakerPrivateData. At step S20, the controller 23 forms the Real PlayList file.
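The Real PlayList formation flow of Fig.102 (steps S11 to S20) can be sketched as follows. This is a schematic reconstruction only: the dictionary layout stands in for the actual file format, and the function name is hypothetical.

```python
def make_real_playlist(clip, can_make_ep_map):
    """Illustrative sketch of the Real PlayList creation flow (steps S11-S20)."""
    # S12-S14: choose the characteristic point information table for the Clip.
    cpi_type = "EP_map" if can_make_ep_map else "TU_map"
    # EP_map -> time information on the PTS basis; TU_map -> arrival time basis.
    time_basis = "PTS" if cpi_type == "EP_map" else "arrival_time"
    return {
        "CPI_type": cpi_type,                                     # S15
        "PlayList": [{"clip": clip, "time_basis": time_basis}],   # S16: whole Clip
        "UIAppInfoPlayList": {},                                  # S17
        "PlayListMark": [],                                       # S18
        "MakerPrivateData": {},                                   # S19
    }                                                             # S20: one file

rp = make_real_playlist("clip0001", can_make_ep_map=False)  # non-cognizant case
```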
So, one Real PlayList file is formed whenever a Clip AV stream is newly recorded. Fig.103 is a flowchart illustrating the method for forming a Virtual PlayList. At step S31, one Real PlayList recorded on the disc is specified through the user interface. From the reproduction range of that Real PlayList, the reproduction domain specified by IN and OUT points is specified through the user interface. If the CPI_type is EP_map type, the reproduction domain is set on the PTS basis. If the CPI_type is TU_map type, the reproduction domain is set on the arrival time basis. At step S32, the controller 23 checks whether or not the entire operation of specifying the reproduction range by the user has been finished. If the user selects a domain to be reproduced next to the specified reproduction domain, the controller 23 reverts to step S31. If the entire operation of specifying the reproduction range by the user has come to a close, the controller 23 proceeds to step S33. At step S33, the connection condition (connection_condition) between two consecutively reproduced reproduction domains is determined by the user through the user interface or by the controller 23. If, at step S34, the CPI_type is EP_map type, the user specifies the sub-path information (post-recording audio information). This step is omitted if the sub-path is not formed by the user. At step S35, the controller 23 forms PlayList() based on the reproduction range information specified by the user and on the connection_condition. At step S36, the controller 23 forms UIAppInfoPlayList(). At step S37, the controller 23 forms PlayListMark. At step S38, the controller 23 forms MakerPrivateData. At step S39, the controller 23 forms the Virtual PlayList file. In this manner, one Virtual PlayList file is formed for each group of reproduction domains which are selected from the reproduction range of the Real PlayLists recorded on the disc and which the user desires to view.
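The Virtual PlayList formation flow of Fig.103 (steps S31 to S39) can likewise be sketched; again the structures are hypothetical stand-ins, not the actual syntax, and the sub-path restriction to EP_map follows step S34:

```python
def make_virtual_playlist(domains, cpi_type, sub_path=None):
    """Sketch of Virtual PlayList creation (steps S31-S39).
    `domains` is an assumed list of (IN_point, OUT_point, connection_condition)
    triples, one per reproduction domain chosen by the user (S31-S33)."""
    time_basis = "PTS" if cpi_type == "EP_map" else "arrival_time"
    play_items = []
    for i, (in_point, out_point, cc) in enumerate(domains):
        play_items.append({
            "IN_time": in_point, "OUT_time": out_point,
            "time_basis": time_basis,
            # S33: connection condition with the previously reproduced domain.
            "connection_condition": cc if i > 0 else None,
        })
    playlist = {"CPI_type": cpi_type, "PlayList": play_items,   # S35
                "UIAppInfoPlayList": {}, "PlayListMark": [],    # S36, S37
                "MakerPrivateData": {}}                         # S38
    if cpi_type == "EP_map" and sub_path is not None:
        playlist["SubPlayItem"] = sub_path   # S34: sub-paths only with EP_map
    return playlist                          # S39: written out as one file
```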
Fig.104 is a flowchart explaining the method of reproducing a PlayList. At step S51, the controller 23 acquires the information of the Info.dvr, Clip Information file, PlayList file and thumbnail file, forms a GUI picture presenting a list of the PlayLists recorded on the disc, and displays the GUI picture so formed through the user interface.
At step S52, the controller 23 presents the information explaining each PlayList on the GUI picture, based on the UIAppInfoPlayList in the respective PlayLists. At step S53, the user commands reproduction of one PlayList from the GUI picture through the user interface. If the CPI_type is EP_map type, the controller 23 at step S54 acquires, from the STC_sequence_id and the PTS of IN_time, the number of the source packet having an entry point temporally previous and closest to IN_time. If the CPI_type is TU_map type, the controller 23 acquires, from the IN_time of the current PlayItem, the number of the source packet at which the time unit temporally previous and closest to IN_time begins. At step S55, the controller 23 reads out data of the AV stream from the source packet number acquired at the above step and routes the data so read out to the AV decoder 27. If, at step S56, there is a PlayItem temporally previous to the current PlayItem, the controller 23 performs the connection processing of display between the previous PlayItem and the current PlayItem in accordance with the connection_condition. If, at step S57, the CPI_type is EP_map type, the AV decoder 27 is commanded to start display as from the picture of the PTS of IN_time. If the CPI_type is TU_map type, the AV decoder 27 is commanded to start display as from the picture of the stream subsequent to IN_time. At step S58, the controller 23 commands the AV decoder 27 to continue decoding the AV stream.
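The source-packet lookup at step S54 can be sketched as below. The (time, source packet number) table is a hypothetical stand-in for the EP_map entry points (keyed by PTS) or the TU_map time units (keyed by arrival time); in both cases the player wants the entry temporally previous and closest to IN_time.

```python
import bisect

def find_start_packet(table, in_time):
    """Sketch of step S54 of Fig.104: locate the source packet number from
    which reading starts. `table` is an assumed list of
    (time, source_packet_number) pairs sorted by time."""
    times = [t for t, _ in table]
    # Index of the entry temporally previous and closest to in_time.
    i = bisect.bisect_right(times, in_time) - 1
    return table[max(i, 0)][1]
```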
If the CPI_type is EP_map type, the controller 23 at step S59 checks whether or not the picture currently displayed is the picture of the PTS of OUT_time. If the CPI_type is TU_map type, the controller 23 checks whether or not the stream currently decoded is past OUT_time. If the result of the check at step S59 is NO, the controller 23 proceeds to step S60. At step S60, the controller 23 displays the current picture and then reverts to step S58. If the result of the check at step S59 is YES, the controller 23 proceeds to step S61. At step S61, the controller 23 checks whether or not the current PlayItem is the last PlayItem in the PlayList. If the result of the check is NO, the controller 23 reverts to step S54; if otherwise, the PlayList reproduction is finished. Fig.105 is a flowchart illustrating the method of reproducing the SubPath of a PlayList. The sub-path reproducing method of Fig.105 is used only if the CPI_type of the PlayList is EP_map. The processing of this flowchart is performed simultaneously with the processing subsequent to step S54 in the reproduction of the PlayList of Fig.104. Meanwhile, it is presupposed that the AV decoder 27 is able to decode two audio streams simultaneously. At step S71, the controller 23 acquires the information of the SubPlayItem. At step S72, the controller 23 acquires the number of the source packet having an entry point temporally previous and closest to SubPath_IN_time. At step S73, the controller 23 reads out data of the AV stream of the sub-path from the source packet number having the above entry point and routes the data so read out to the AV decoder 27. At step S74, the controller 23 commands the AV decoder 27 to start displaying the sub-path audio when the reproduction of the main path reaches the picture specified by sync_PlayItem_id and sync_start_PTS_of_PlayItem. At step S75, the AV decoder 27 continues decoding the AV stream of the sub-path.
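The sub-path timing just described (start when the main path reaches sync_start_PTS_of_PlayItem at step S74, stop at SubPath_OUT_time at step S76) can be sketched as follows; the linear mapping between the main-path and sub-path time axes is an assumption made for illustration:

```python
def subpath_display_window(sync_start_pts, sub_in_time, sub_out_time):
    """Sketch of the sub-path audio window of Fig.105. Returns a predicate
    telling whether the sub-path audio is presented at a given main-path PTS;
    names mirror the syntax fields, the mapping itself is assumed."""
    def is_subpath_audible(main_pts):
        # Assumed: the sub-path plays from SubPath_IN_time at the same rate,
        # starting when the main path reaches sync_start_PTS_of_PlayItem.
        sub_pts = sub_in_time + (main_pts - sync_start_pts)
        return sync_start_pts <= main_pts and sub_pts <= sub_out_time
    return is_subpath_audible

audible = subpath_display_window(sync_start_pts=1000, sub_in_time=0, sub_out_time=500)
```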
At step S76, the controller 23 checks whether or not the PTS of the sub-path currently displayed is SubPath_OUT_time. If the result of the check is NO, the controller 23 proceeds to step S77, where it continues displaying the sub-path, and then reverts to step S75. If, at step S76, the PTS of the sub-path currently displayed is SubPath_OUT_time, the display of the sub-path is finished. The main path and the sub-path of the one PlayList file commanded to be reproduced by the user are thus reproduced, as shown in Figs.104 and 105. Fig.106 shows a flowchart illustrating the method for forming a PlayListMark. Reference is made to the block diagram of the recording and/or reproducing apparatus of Fig.1. At step S91, the controller 23 acquires the information of the Info.dvr, Clip Information file, PlayList file and thumbnail file, forms a GUI picture presenting a list of the PlayLists recorded on the disc, and displays the GUI picture so formed through the user interface.
At step S92, the user commands the controller 23, through the user interface, to reproduce one PlayList. At step S93, the controller 23 causes reproduction of the specified PlayList to be started (see Fig.104). At step S94, the user commands the controller 23, through the user interface, to set a mark at a favorite scene. If, at step S95, the CPI_type is EP_map, the controller 23 acquires the PTS of the mark and the PlayItem_id of the PlayItem to which it belongs. If the CPI_type is TU_map, the controller 23 acquires the arrival time of the mark point. At step S96, the controller 23 stores the mark information in the PlayListMark(). At step S97, the controller 23 records the PlayList file on the recording medium 100. Fig.107 is a flowchart illustrating the method of locating reproduction employing the PlayListMark. Reference is made to the block diagram of the recording and/or reproducing apparatus 1 of Fig.1. At step S111, the controller 23 acquires the information of the Info.dvr, Clip Information file, PlayList file and thumbnail file, forms a GUI picture presenting a list of the PlayLists recorded on the disc (recording medium 100), and displays the GUI picture so formed through the user interface. At step S112, the user commands the controller 23, through the user interface, to reproduce one PlayList.
At step S113, the controller 23 causes a list of thumbnails, generated from the pictures referenced by the PlayListMark, to be displayed on the GUI through the user interface. At step S114, the user specifies the mark point of the reproduction start point through the user interface. At step S115, if the CPI_type is EP_map type, the controller 23 acquires the PTS of the mark and the PlayItem_id of the PlayItem to which it belongs; if the CPI_type is TU_map type, the controller 23 acquires the ATS (Arrival Time Stamp) of the mark. If the CPI_type is EP_map type, the controller 23 at step S116 acquires the STC_sequence_id of the AV stream referenced by the PlayItem specified by the PlayItem_id. At step S117, if the CPI_type is EP_map type, the controller 23 causes the AV stream to be input to the decoder based on the PTS of the mark and on the STC_sequence_id. Specifically, the controller 23 performs processing similar to that at step S55, using the PTS of the mark and the STC_sequence_id. If the CPI_type is TU_map type, the controller 23 causes the AV stream to be input to the decoder based on the ATS of the mark. Specifically, the controller performs processing similar to that at steps S54 and S55 of Fig.104, using the ATS. If, at step S118, the CPI_type is EP_map type, the controller 23 causes display to be started as from the picture of the PTS of the mark point. If the CPI_type is TU_map type, the controller 23 causes display to be started as from a picture following the ATS of the mark point.
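What the controller acquires before locating playback at a mark (steps S114 to S117) can be sketched as follows; the mark and PlayItem structures are hypothetical stand-ins for the PlayListMark and PlayItem syntax:

```python
def mark_locate_params(cpi_type, mark, play_items):
    """Sketch of steps S115-S116 of Fig.107: gather what is needed to feed
    the decoder at step S117. Structures are assumed, field names follow
    the text."""
    if cpi_type == "EP_map":
        # S115/S116: the mark's PTS plus the STC_sequence_id of the AV stream
        # referenced by the PlayItem the mark belongs to.
        item = play_items[mark["PlayItem_id"]]
        return {"PTS": mark["mark_time_stamp"],
                "STC_sequence_id": item["STC_sequence_id"]}
    # TU_map: the mark's arrival time stamp (ATS) alone suffices.
    return {"ATS": mark["mark_time_stamp"]}
```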
In this manner, the user selects e.g. a start point of a favorite scene from the PlayList. The start point so selected is managed by the recorder (the controller 23 of the recording and/or reproducing apparatus 1) in the PlayListMark. Moreover, the user selects the reproduction start point from the list of the mark points stored in the PlayListMark, so that the player begins reproduction at that start point, as shown in Fig.107. If the above syntax, data structure and rules are used as a basis, the contents of the data recorded on the recording medium, or the reproduction information, can be managed properly, enabling the user to confirm the contents of the data recorded on the recording medium at the time of reproduction, or to reproduce desired data, extremely readily. Even if the position of an I-picture cannot be analyzed, AV streams of different formats can be recorded, reproduced and managed using a common application program (software), subject to employing the TU_map. If the AV stream is recorded on the recording medium as its contents (I-picture positions) are analyzed (cognizant recording), the EP_map is used, whereas, if the AV stream is directly recorded on the recording medium without analyzing its contents (non-cognizant recording), the TU_map is used. So, the AV data can be recorded, reproduced and managed using a common application program. Similarly, if scrambled AV data is descrambled and analyzed before being recorded on the recording medium, the EP_map is used, whereas, if the scrambled AV data is directly recorded on the recording medium without descrambling (without analysis), the TU_map is used. By so doing, the AV data may be recorded, reproduced and managed using the common application program. Moreover, since the EP_map type and the TU_map type can be described in the PlayList() as the CPI_type, the EP_map may be used if the I-picture position can be analyzed, whereas, if the I-picture position cannot be analyzed, the TU_map may be used.
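The selection rule described above reduces to a single flag. A minimal sketch, assuming a boolean that records whether the stream contents (I-picture positions) were analyzed at recording time:

```python
def select_cpi_type(contents_analyzed):
    """Sketch of the CPI_type selection rule: cognizant recording
    (I-picture positions analyzed, stream descrambled) -> EP_map;
    non-cognizant recording (stream recorded as-is) -> TU_map."""
    return "EP_map" if contents_analyzed else "TU_map"
```

With this one flag recorded in the PlayList() as the CPI_type, a common application program can handle both kinds of AV stream data in a unified fashion.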
By so doing, the AV stream data recorded with analysis of the I-picture positions and the AV stream data recorded without analysis of the I-picture positions can be managed in a unified fashion by a common program, simply by setting the corresponding flag. Moreover, the PlayList file and the Clip Information file are recorded separately, so that, if the contents of a given PlayList or Clip are changed by e.g. editing, it is unnecessary to change any file irrelevant to the changed file. The result is that file contents can be changed readily, reducing the time needed for such change or recording. In addition, if only the Info.dvr is read out first to present the disc recording contents on the user interface, and only the PlayList file commanded to be reproduced by the user and the relevant Clip Information file are then read out from the disc, the user queuing time may be reduced. If the totality of the PlayList files or Clip Information files were collected into one file for recording, the file size would become bulky, so the time involved in changing the file contents for recording would be appreciably longer than when the respective files are recorded separately. The present invention overcomes this deficiency. The above-described sequence of operations may be executed not only by hardware but also by software. If the sequence of operations is to be executed by software, the program forming the software is installed from a recording medium onto a computer built into dedicated hardware, or onto a general-purpose personal computer, such as that of Fig.38, capable of executing various functions based on a variety of programs installed therein.
As shown in Fig.108, the recording medium is constituted not only by a package medium distributed, separately from the computer, for furnishing the program to the user, such as a magnetic disc 221 (inclusive of a floppy disc) carrying the program thereon, an optical disc 222 (inclusive of a CD-ROM (Compact Disc-Read Only Memory) or a DVD (Digital Versatile Disc)), a magneto-optical disc 223 (inclusive of a Mini-Disc), or a semiconductor memory 224, but also by a ROM 202 carrying the program or a hard disc included in a memory 208, furnished to the user as built into the computer. In the present specification, the steps of the program furnished by the medium include not only processing executed chronologically in accordance with the sequence indicated, but also processing performed not chronologically but in parallel or separately. Additionally, in the present specification, the term system means an entire apparatus comprised of plural component devices.
Industrial Applicability

In the information processing method and apparatus, the program for a recording medium, the program, and the recording medium according to the present invention, one of a first table stating the relation of correspondence between the presentation time stamp and the address in the AV stream data of the corresponding access unit and a second table stating the relation of correspondence between the arrival time stamp derived from the arrival time point of the transport packet and the address in the AV stream data of the corresponding transport packet is recorded depending on the recording method. In the information processing method and apparatus, the program for a recording medium, and the program according to the present invention, one of a first table stating the relation of correspondence between the presentation time stamp and the address in the AV stream data of the corresponding access unit and a second table stating the relation of correspondence between the arrival time stamp derived from the arrival time point of the transport packet and the address in the AV stream data of the corresponding transport packet, as recorded on a recording medium depending on the recording method, is reproduced from the recording medium to control the output. In the information processing method and apparatus, the program for a recording medium, the program, and the second recording medium according to the present invention, the reproduction specifying information comprised of the first information specifying the main reproducing path and the second information specifying the subsidiary reproducing path reproduced in synchronism with the main reproducing path is recorded.
In the information processing method and apparatus, the program for a recording medium, and the program according to the present invention, the reproduction specifying information comprised of the first information specifying the main reproducing path and the second information specifying the subsidiary reproducing path is reproduced from the recording medium to control the output accordingly. So, in any instance, an AV stream capable of high-speed reproduction and an AV stream incapable of high-speed reproduction can be managed in common, whilst post-recording also becomes possible.

Claims (27)

1. An information processing apparatus for recording AV stream data on a recording medium, comprising: first generating means for generating a first table describing the relation of correspondence between presentation time stamp and an address in said AV stream data of a corresponding access unit, or a second table describing the relation of correspondence between arrival time stamp derived from the arrival time point of a transport packet and an address in said AV stream data of a corresponding transport packet; selection means for selecting one of said first table and the second table depending on a recording method; and first recording means for recording the selected table on said recording medium along with said AV stream data.
2. The information processing apparatus according to claim 1 wherein said first table is the EP_map; and wherein said second table is the TU_map.
3. The information processing apparatus according to claim 1 wherein said selection means selects said second table in case of non-cognizant recording.
4. The information processing apparatus according to claim 1 wherein said selection means selects said first table in case of self encoding recording.
5. The information processing apparatus according to claim 1 wherein said selection means selects said first table in case of cognizant recording.
6. The information processing apparatus according to claim 1 further comprising: second generating means for generating the reproduction specifying information specifying the reproduction of said AV stream data; and second recording means for recording said reproduction specifying information generated by said second generating means on said recording medium; said reproduction specifying information including the identification information for specifying whether the time information of the reproduction domain of said AV stream data is to be represented on the presentation time basis or on the arrival time basis.
7. The information processing apparatus according to claim 6 wherein, if said first table is recorded along with said AV stream data, said reproduction specifying information expresses the time information of the reproduction domain of said AV stream data on the presentation time basis; and wherein if said second table is recorded along with said AV stream data, said reproduction specifying information expresses the time information of the reproduction domain of said AV stream data on the arrival time basis.
8. An information processing method for recording AV stream data on a recording medium, comprising: a generating step of generating a first table describing the relation of correspondence between presentation time stamp and an address in said AV stream data of a corresponding access unit, or a second table describing the relation of correspondence between arrival time stamp derived from the arrival time point of a transport packet and an address in said AV stream data of a corresponding transport packet; a selecting step of selecting one of said first table and the second table depending on a recording method; and a recording step of recording the selected table on said recording medium along with said AV stream data.
9. A recording medium having recorded thereon a computer-readable program for an information processing apparatus recording AV stream data on a recording medium, said program including a generating step of generating a first table describing the relation of correspondence between presentation time stamp and an address in said AV stream data of a corresponding access unit, or a second table describing the relation of correspondence between arrival time stamp derived from the arrival time point of a transport packet and an address in said AV stream data of a corresponding transport packet; a selecting step of selecting one of said first table and the second table depending on a recording method; and a recording step of recording the selected table on said recording medium along with said AV stream data.
10. A program for having a computer, controlling an information processing apparatus recording AV stream data on a recording medium, execute a generating step of generating a first table describing the relation of correspondence between presentation time stamp and an address in said AV stream data of a corresponding access unit, or a second table describing the relation of correspondence between arrival time stamp derived from the arrival time point of a transport packet and an address in said AV stream data of a corresponding transport packet; a selecting step of selecting one of said first table and the second table depending on a recording method; and a recording step of recording the selected table on said recording medium along with said AV stream data.
11. An information processing apparatus for reproducing AV stream data from a recording medium, comprising: reproducing means for reproducing one of a first table describing the relation of correspondence between presentation time stamp and an address in said AV stream data of a corresponding access unit and a second table describing the relation of correspondence between arrival time stamp derived from the arrival time point of a transport packet and an address in said AV stream data of a corresponding transport packet, from said recording medium, which has said first table or said second table recorded thereon depending on a recording method; and 120 control means for controlling the outputting of said AV stream data based on the reproduced table.
12. An information processing method for reproducing AV stream data from a recording medium, comprising: a reproducing step of reproducing one of a first table describing the relation of correspondence between presentation time stamp and an address in said AV stream data of a corresponding access unit and a second table describing the relation of correspondence between arrival time stamp derived from the arrival time point of a transport packet and an address in said AV stream data of a corresponding transport packet, from said recording medium, which has said first table or said second table recorded thereon depending on a recording method; and a controlling step of controlling the outputting of said AV stream data based on the reproduced table.
13. A recording medium having recorded thereon a computer-readable program for an information processing apparatus reproducing AV stream data from a recording medium, said program including a reproducing step of reproducing one of a first table describing the relation of correspondence between presentation time stamp and an address in said AV stream data of a corresponding access unit and a second table describing the relation of correspondence between arrival time stamp derived from the arrival time point of a transport packet and an address in said AV stream data of a corresponding transport packet, from said recording medium, which has said first table or said second table recorded thereon depending on a recording method; and a controlling step of controlling the outputting of said AV stream data based on the reproduced table.
14. A program for having a computer, controlling an information processing apparatus reproducing AV stream data from a recording medium, execute a reproducing step of reproducing one of a first table describing the relation of correspondence between presentation time stamp and an address in said AV stream data of a corresponding access unit and a second table describing the relation of correspondence between arrival time stamp derived from the arrival time point of a transport packet and an address in said AV stream data of a corresponding transport packet, from said recording medium, having said first table or said second table recorded thereon depending on a recording method; and a controlling step of controlling the outputting of said AV stream data based on the reproduced table.
15. A recording medium having recorded thereon one of a first table describing the relation of correspondence between presentation time stamp and an address in said AV stream data of a corresponding access unit and a second table describing the relation of correspondence between arrival time stamp derived from the arrival time point of a transport packet and an address in said AV stream data of a corresponding transport packet, depending on a recording method.
16. An information processing apparatus for recording AV stream data on a recording medium, comprising: generating means for generating the reproduction specifying information made up of the first information indicating a main reproducing path and the second information indicating a subsidiary reproducing path; and recording means for recording said AV stream data and the reproduction specifying information on said recording medium.
17. The information processing apparatus according to claim 16 wherein said subsidiary reproducing path is a path for the post-recording of audio data.
18. The information processing apparatus according to claim 16 wherein said first information is the main path and wherein said second information is the Sub-path.
19. The information processing apparatus according to claim 16 wherein said second information includes the type information indicating the type of said subsidiary reproducing path; a filename of the AV stream referenced by said subsidiary reproducing path; in- and out-points of the AV stream of said subsidiary reproducing path; and the time on said main path at which the in-point of said subsidiary reproducing path starts in synchronization on the time axis of said main path.
20. An information processing method for an information processing apparatus recording AV stream data on a recording medium, comprising: a generating step of generating the reproduction specifying information made up of the first information indicating a main reproducing path and the second information specifying a subsidiary reproducing path reproduced in synchronism with said main reproducing path; and a recording step of recording said AV stream data and said reproduction specifying information on said recording medium.
21. A recording medium having recorded thereon a computer-readable program for an information processing apparatus recording AV stream data on a recording medium, said program including a generating step of generating the reproduction specifying information made up of the first information indicating a main reproducing path and the second information specifying a subsidiary reproducing path reproduced in synchronism with said main reproducing path; and a recording step of recording said AV stream data and said reproduction specifying information on said recording medium.
22. A program for having a computer, controlling an information processing apparatus recording AV stream data on a recording medium, execute a generating step of generating the reproduction specifying information made up of the first information indicating a main reproducing path and the second information specifying a subsidiary reproducing path reproduced in synchronism with said main reproducing path; and a recording step of recording said AV stream data and said reproduction specifying information on said recording medium.
23. An information processing apparatus for reproducing AV stream data from a recording medium, comprising: reproducing means for reproducing, from said recording medium, reproduction specifying information made up of the first information indicating a main reproducing path and the second information specifying a subsidiary reproducing path reproduced in synchronism with said main reproducing path; and control means for controlling the outputting of said AV stream data based on the reproduced reproduction specifying information.
24. An information processing method for reproducing AV stream data from a recording medium, comprising: a reproducing step of reproducing, from said recording medium, reproduction specifying information made up of the first information indicating a main reproducing path and the second information specifying a subsidiary reproducing path reproduced in synchronism with said main reproducing path; and a controlling step of controlling the outputting of said AV stream data based on the reproduced reproduction specifying information.
25. A recording medium having recorded thereon a computer-readable program for an information processing apparatus reproducing AV stream data from a recording medium, said program including a reproducing step of reproducing, from said recording medium, reproduction specifying information made up of the first information indicating a main reproducing path and the second information specifying a subsidiary reproducing path reproduced in synchronism with said main reproducing path; and a controlling step of controlling the outputting of said AV stream data based on the reproduced reproduction specifying information.
26. A program for having a computer, controlling an information processing apparatus reproducing AV stream data from a recording medium, execute a reproducing step of reproducing, from said recording medium, the reproduction specifying information made up of the first information indicating a main reproducing path and the second information specifying a subsidiary reproducing path reproduced in synchronism with said main reproducing path; and a controlling step of controlling the outputting of said AV stream data based on the reproduced reproduction specifying information.
27. A recording medium having AV stream data recorded thereon, wherein the recording medium has recorded thereon the reproduction specifying information made up of the first information indicating a main reproducing path and the second information specifying a subsidiary reproducing path reproduced in synchronism with said main reproducing path.
AU54403/01A 2000-04-21 2001-04-20 Information processing apparatus and method, program, and recorded medium Expired AU779673B2 (en)

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
JP2000183771 2000-04-21
JP2000-183771 2000-04-21
JP2000-271552 2000-09-07
JP2000271552 2000-09-07
PCT/JP2001/003415 WO2001082606A1 (en) 2000-04-21 2001-04-20 Information processing apparatus and method, program, and recorded medium

Publications (2)

Publication Number Publication Date
AU5440301A true AU5440301A (en) 2001-11-07
AU779673B2 AU779673B2 (en) 2005-02-03

Family

ID=26594227

Family Applications (1)

Application Number Title Priority Date Filing Date
AU54403/01A Expired AU779673B2 (en) 2000-04-21 2001-04-20 Information processing apparatus and method, program, and recorded medium

Country Status (22)

Country Link
US (2) US7738776B2 (en)
EP (6) EP1280347A4 (en)
JP (14) JP4599740B2 (en)
KR (2) KR100948439B1 (en)
CN (2) CN100348033C (en)
AU (1) AU779673B2 (en)
BR (3) BRPI0117209B1 (en)
CA (1) CA2377690C (en)
CZ (1) CZ20014489A3 (en)
DK (1) DK1569449T3 (en)
ES (1) ES2399331T3 (en)
HK (1) HK1050975A1 (en)
HU (1) HU229461B1 (en)
IL (1) IL147155A (en)
MX (1) MXPA01013122A (en)
NO (1) NO20016292L (en)
PL (1) PL351918A1 (en)
PT (1) PT1569449E (en)
RO (1) RO122068B1 (en)
RU (1) RU2314653C2 (en)
SK (1) SK18982001A3 (en)
WO (1) WO2001082606A1 (en)

Families Citing this family (195)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6808709B1 (en) * 1994-12-30 2004-10-26 The Regents Of The University Of California Immunoglobulins containing protection proteins and their use
IL132859A (en) * 1999-11-10 2008-07-08 Nds Ltd System for data stream processing
WO2001082611A1 (en) * 2000-04-21 2001-11-01 Sony Corporation Information processing apparatus and method, recorded medium, and program
GB0117926D0 (en) * 2001-07-23 2001-09-12 Nds Ltd Method for random access to encrypted content
KR100871850B1 (en) * 2001-09-27 2008-12-03 Samsung Electronics Co., Ltd. Method and apparatus for recording video data, and information storage medium thereby
JP3656248B2 * 2001-10-09 2005-06-08 Sony Corporation Video signal recording apparatus and method, video signal reproducing apparatus and method, recording medium, program, and data structure
JP3716920B2 * 2001-10-16 2005-11-16 Sony Corporation Recording medium reproducing apparatus and method, recording medium, and program
GB0127234D0 (en) 2001-11-13 2002-01-02 British Sky Broadcasting Ltd Improvements in receivers for television signals
KR100563668B1 2001-12-22 2006-03-28 LG Electronics Inc. Method for recording a dubbing audio on rewritable medium
KR100563667B1 2001-12-24 2006-03-28 LG Electronics Inc. Method for recording a still picture on a rewritable medium
KR20030062737A (en) * 2002-01-18 2003-07-28 LG Electronics Inc. Method for recording a thumbnail picture on high density rewritable medium
KR100563670B1 2002-01-28 2006-03-28 LG Electronics Inc. Method for recording a still picture on high density rewritable medium
EP1494472B1 (en) * 2002-04-10 2014-08-06 Sony Corporation Data recording device and method, program storage medium, and program
KR100880627B1 (en) * 2002-04-25 2009-01-30 LG Electronics Inc. Method for managing a multi-dubbed audio stream record and reproduce
KR20030087193A (en) 2002-05-07 2003-11-14 LG Electronics Inc. Method for managing a multi-channel broadcast stream record
US8027562B2 (en) * 2002-06-11 2011-09-27 Sanyo Electric Co., Ltd. Method and apparatus for recording images, method and apparatus for recording and reproducing images, and television receiver utilizing the same
CA2462070C (en) 2002-06-21 2012-03-20 Lg Electronics Inc. Recording medium having data structure for managing reproduction of video data recorded thereon
MXPA04002365A (en) * 2002-06-21 2004-11-22 Lg Electronics Inc Recording medium having data structure for managing reproduction of video data recorded thereon.
CN101350215B (en) * 2002-06-24 2012-08-29 Lg电子株式会社 Method and device for recording and reproducing data structure of reproduction for video data
KR20040000290A (en) 2002-06-24 2004-01-03 LG Electronics Inc. Method for managing multi-path data stream of high density optical disc
WO2004001750A1 (en) * 2002-06-24 2003-12-31 Lg Electronics Inc. Recording medium having data structure for managing reproduction of multiple reproduction path video data recorded thereon and recording and reproducing methods and apparatuses
US7889968B2 (en) 2002-06-24 2011-02-15 Lg Electronics Inc. Recording medium having data structure for managing reproduction of multiple reproduction path video data for at least a segment of a title recorded thereon and recording and reproducing methods and apparatuses
CN101067954B (en) * 2002-06-28 2010-06-23 Lg电子株式会社 Recording medium having data structure for managing reproduction of multiple playback path video data recorded thereon and recording and reproducing method and apparatus
RU2334286C2 * 2002-06-28 2008-09-20 LG Electronics Inc. Recording medium with data structure for recording and playback control of data from several channels recorded on it and methods and recording/playback devices
US7974516B2 (en) 2002-09-05 2011-07-05 Lg Electronics Inc. Recording medium having data structure of playlist marks for managing reproduction of still images recorded thereon and recording and reproducing methods and apparatuses
CN100495558C (en) * 2002-09-06 2009-06-03 Lg电子株式会社 Methdo and device for recording and reproducing data structure for managing still images
KR100605188B1 (en) 2002-09-07 2006-07-31 엘지전자 주식회사 Recording medium having data structure for managing reproduction of still images from a clip file recorded thereon and recording and reproducing methods and apparatuses
JP2004128769A (en) * 2002-10-01 2004-04-22 Pioneer Electronic Corp Information recording medium, information recording and/or reproducing apparatus and method, computer program for recording or reproduction control, and data structure including control signal
JP3975147B2 * 2002-10-01 2007-09-12 Pioneer Corporation Information recording medium, information recording apparatus and method, information reproducing apparatus and method, information recording / reproducing apparatus and method, computer program for recording or reproduction control, and data structure including control signal
EP1408505A1 (en) 2002-10-11 2004-04-14 Deutsche Thomson-Brandt Gmbh Method and apparatus for synchronizing data streams containing audio, video and/or other data
EP1552519A4 (en) 2002-10-14 2006-08-16 Lg Electronics Inc Recording medium having data structure for managing reproduction of multiple audio streams recorded thereon and recording and reproducing methods and apparatuses
RU2334287C2 2002-10-15 2008-09-20 LG Electronics Inc. Recording medium with data structure for managing playback of several graphic streams written on it and methods and devices for playback and recording
CN1685420B (en) 2002-11-08 2010-07-07 Lg电子有限公司 Method and apparatus for recording and reproducing a multi-component data stream on a high-density recording medium
US7720356B2 (en) 2002-11-12 2010-05-18 Lg Electronics Inc Recording medium having data structure for managing reproduction of multiple reproduction path video data recorded thereon and recording and reproducing methods and apparatuses
CA2474229C (en) 2002-11-20 2014-11-04 Lg Electronics Inc. Recording medium having data structure for managing reproduction of still images recorded thereon and recording and reproducing methods and apparatuses
US7783160B2 (en) * 2002-11-20 2010-08-24 Lg Electronics Inc. Recording medium having data structure for managing reproduction of interleaved multiple reproduction path video data recorded thereon and recording and reproducing methods and apparatuses
JP4964414B2 * 2002-11-22 2012-06-27 LG Electronics Inc. Recording medium having data structure for managing reproduction of recorded multiple reproduction path video data, and recording and reproduction method and apparatus therefor
US20060093314A1 (en) * 2002-12-05 2006-05-04 Koninklijke Philips Electronics N.V. Editing of data frames
JP3815458B2 2002-12-18 2006-08-30 Sony Corporation Information processing apparatus, information processing method, and program
CN100448285C (en) * 2002-12-18 2008-12-31 索尼株式会社 Information processing device, information processing method and program, and recording medium
CN100354941C (en) * 2003-01-20 2007-12-12 Lg电子株式会社 Recording medium having data structure for managing reproduction of still pictures recorded thereon and recording and reproducing methods and apparatuses
EP1593121A4 (en) 2003-01-20 2009-09-30 Lg Electronics Inc Recording medium having data structure for managing reproduction of still pictures recorded thereon and recording and reproducing methods and apparatuses
TWI271721B (en) 2003-01-20 2007-01-21 Lg Electronics Inc Recording medium having data structure for managing reproduction of still pictures recorded thereon and recording and reproducing methods and apparatuses
US8145033B2 (en) 2003-02-05 2012-03-27 Lg Electronics Inc. Recording medium having data structure for managing reproducton duration of still pictures recorded thereon and recording and reproducing methods and apparatuses
US7734154B2 (en) 2003-02-14 2010-06-08 Lg Electronics Inc. Recording medium having data structure for managing reproduction duration of still pictures recorded thereon and recording and reproducing methods and apparatuses
US8055117B2 (en) 2003-02-15 2011-11-08 Lg Electronics Inc. Recording medium having data structure for managing reproduction duration of still pictures recorded thereon and recording and reproducing methods and apparatuses
US7606463B2 (en) * 2003-02-24 2009-10-20 Lg Electronics, Inc. Recording medium having data structure for managing playback control and recording and reproducing methods and apparatuses
US8041179B2 (en) * 2003-02-24 2011-10-18 Lg Electronics Inc. Methods and apparatuses for reproducing and recording still picture and audio data and recording medium having data structure for managing reproduction of still picture and audio data
US7693394B2 (en) * 2003-02-26 2010-04-06 Lg Electronics Inc. Recording medium having data structure for managing reproduction of data streams recorded thereon and recording and reproducing methods and apparatuses
US7809775B2 (en) * 2003-02-27 2010-10-05 Lg Electronics, Inc. Recording medium having data structure for managing playback control recorded thereon and recording and reproducing methods and apparatuses
CN100397882C (en) 2003-02-28 2008-06-25 Lg电子株式会社 Recording medium having data structure for managing random/shuffle reproduction of video data recorded thereon and recording and reproducing methods and apparatuses
US7620301B2 (en) 2003-04-04 2009-11-17 Lg Electronics Inc. System and method for resuming playback
EP1614108B1 (en) * 2003-04-09 2011-07-20 LG Electronics, Inc. Recording medium having a data structure for managing reproduction of text subtitle data and methods and apparatuses of recording and reproducing
KR100977918B1 * 2003-04-23 2010-08-24 Panasonic Corporation Recording medium, reproducing apparatus, recording method, and reproducing method
JP4228767B2 2003-04-25 2009-02-25 Sony Corporation Reproduction device, reproduction method, reproduction program, and recording medium
JP4902935B2 2003-05-08 2012-03-21 Sony Corporation Information processing apparatus, information processing method, program, and recording medium
KR101036475B1 * 2003-05-27 2011-05-24 LG Electronics Inc. Recording medium having data structure for managing main data and additional content data thereof and recording and reproducing methods and apparatuses
JP2005004866A (en) * 2003-06-11 2005-01-06 Sony Corp Device and method for processing information, recording medium, and program
JP3931843B2 2003-06-13 2007-06-20 Hitachi, Ltd. Recording medium and reproducing method
KR20050001171A * 2003-06-27 2005-01-06 LG Electronics Inc. Method for managing and playing addition contents data for high density optical disc
KR20050012328A * 2003-07-25 2005-02-02 LG Electronics Inc. Method for managing and reproducing a presentation graphic data of high density optical disc, and high density optical disc thereof
BRPI0412839A (en) * 2003-07-24 2006-09-26 Lg Electronics Inc recording media that has a data structure for managing reproduction of text subtitle data recorded on it and recording and reproduction methods and apparatus
EP1654871A4 (en) * 2003-08-12 2007-06-27 Digital Networks North America Method and apparatus for navigating content in a personal video recorder
KR100804380B1 2003-10-10 2008-02-15 Sharp Corporation Reproducing apparatus, video data reproducing method, content recording medium, and computer-readable recording medium
JP5133313B2 * 2003-10-10 2013-01-30 Sharp Corporation Reproduction device, video data reproduction method, control program, and content recording medium
KR20050035678A * 2003-10-14 2005-04-19 LG Electronics Inc. Method and apparatus for reproducing additional data of optical disc device, and optical disc
KR20050036277A * 2003-10-15 2005-04-20 LG Electronics Inc. Method for managing navigation information of high density optical disc
KR20050047710A * 2003-11-18 2005-05-23 LG Electronics Inc. Method for managing and reproducing a composite playlist file of high density optical disc
KR20050048848A * 2003-11-20 2005-05-25 LG Electronics Inc. Method for managing and reproducing a playlist file of high density optical disc
EP1542231A1 (en) * 2003-12-08 2005-06-15 Canon Kabushiki Kaisha Recording apparatus and recording method capable of recording series of content data on different recording media
US7519274B2 (en) 2003-12-08 2009-04-14 Divx, Inc. File format for multiple track digital data
US8472792B2 (en) 2003-12-08 2013-06-25 Divx, Llc Multimedia distribution system
KR101053575B1 * 2003-12-09 2011-08-03 LG Electronics Inc. File structure of high density optical disc and high density optical disc
JP4140518B2 * 2003-12-15 2008-08-27 Sony Corporation Information processing apparatus and method, and program
KR20050064150A * 2003-12-23 2005-06-29 LG Electronics Inc. Method for managing and reproducing a menu information of high density optical disc
KR20050066265A * 2003-12-26 2005-06-30 LG Electronics Inc. Method for managing and reproducing a menu information of high density optical disc
KR20050066264A 2003-12-26 2005-06-30 LG Electronics Inc. Method for managing and reproducing a menu information of high density optical disc
BRPI0506712A (en) * 2004-01-06 2007-05-02 Lg Electronics Inc physical recording medium, method and apparatus for reproducing and recording text subtitle streams
KR20050072255A * 2004-01-06 2005-07-11 LG Electronics Inc. Method for managing and reproducing a subtitle of high density optical disc
KR100937421B1 * 2004-01-13 2010-01-18 LG Electronics Inc. Method for managing and reproducing a file information of high density optical disc
KR20050078907A * 2004-02-03 2005-08-08 LG Electronics Inc. Method for managing and reproducing a subtitle of high density optical disc
US20080002947A1 (en) * 2004-02-06 2008-01-03 Wataru Ikeda Recording medium, reproduction device, program and reproduction method
US8391672B2 (en) 2004-02-06 2013-03-05 Panasonic Corporation Recording medium, reproduction device, program, and reproduction method
US8879641B2 (en) * 2004-02-10 2014-11-04 Thomson Licensing Storage of advanced video coding (AVC) parameter sets in AVC file format
US20050196146A1 (en) * 2004-02-10 2005-09-08 Yoo Jea Y. Method for reproducing text subtitle and text subtitle decoding system
KR20070028323A * 2004-02-10 2007-03-12 LG Electronics Inc. Recording medium having a data structure for managing data streams associated with different languages and recording and reproducing methods and apparatuses
KR20070007307A * 2004-02-10 2007-01-15 LG Electronics Inc. Recording medium having a data structure for managing various data and recording and reproducing methods and apparatuses
WO2005076601A1 (en) * 2004-02-10 2005-08-18 Lg Electronic Inc. Text subtitle decoder and method for decoding text subtitle streams
WO2005074399A2 (en) * 2004-02-10 2005-08-18 Lg Electronics Inc. Recording medium and method and apparatus for decoding text subtitle streams
EP1714281A2 (en) 2004-02-10 2006-10-25 LG Electronic Inc. Recording medium and method and apparatus for decoding text subtitle streams
WO2005076273A1 (en) * 2004-02-10 2005-08-18 Lg Electronic Inc. Recording medium having a data structure for managing font information for text subtitles and recording and reproducing methods and apparatuses
RU2377669C2 * 2004-02-10 2009-12-27 LG Electronics Inc. Recording medium with data structure for managing different data, and method and device for recording and playing back
AU2011218752A1 (en) * 2004-02-16 2011-09-22 Sony Corporation Reproduction device, reproduction method, program, recording medium, and data structure
KR101104528B1 * 2004-02-16 2012-01-11 Sony Corporation Reproduction device, reproduction method, and recording medium
WO2005081643A2 (en) * 2004-02-26 2005-09-09 Lg Electronics Inc. Recording medium and method and apparatus for reproducing and recording text subtitle streams
US8271752B2 (en) 2004-03-12 2012-09-18 Lg Electronics, Inc. Recording medium, method and apparatus for recording on recordable recording medium, and method for managing backup files of the same
KR20060043573A (en) 2004-03-12 2006-05-15 엘지전자 주식회사 Recording medium, method and apparatus for recording on recordable recording medium, and method for managing backup files of the same
ES2338019T3 2004-03-18 2010-05-03 Lg Electronics Inc. Recording medium and method and apparatus for reproducing a text subtitle stream recorded on the recording medium
CN1934625B (en) * 2004-03-26 2010-04-14 Lg电子株式会社 Method and apparatus for reproducing and recording text subtitle streams
WO2005099257A1 (en) * 2004-04-07 2005-10-20 Matsushita Electric Industrial Co., Ltd. Information recording medium wherein stream convertible at high-speed is recorded, and recording apparatus and recording method therefor
WO2005099260A1 (en) * 2004-04-07 2005-10-20 Matsushita Electric Industrial Co., Ltd. Information recording apparatus and information converting method
US8055122B2 (en) * 2004-04-07 2011-11-08 Panasonic Corporation Information recording medium wherein stream convertible at high-speed is recorded, and recording apparatus and recording method therefor
CN101577803B (en) * 2004-04-07 2011-04-13 松下电器产业株式会社 Information recording apparatus and recording method therefor
JP4105207B2 * 2004-04-07 2008-06-25 Matsushita Electric Industrial Co., Ltd. Information recording medium recording stream capable of high-speed conversion, recording apparatus and recording method therefor
JP4249224B2 * 2004-04-16 2009-04-02 Panasonic Corporation Playback apparatus and recording method
US7720355B2 (en) * 2004-04-16 2010-05-18 Panasonic Corporation Recording medium, reproduction device, program
JP4186871B2 * 2004-05-21 2008-11-26 Funai Electric Co., Ltd. Recording / playback device
JP4764855B2 * 2004-06-02 2011-09-07 Panasonic Corporation Recording method and playback system
JP4658986B2 * 2004-06-02 2011-03-23 Panasonic Corporation System LSI
KR100884150B1 2004-06-02 2009-02-17 Panasonic Corporation Recording medium capable of performing a high-speed random access in a slide show, reproduction device, computer readable medium, recording method, and reproduction method
JP4608953B2 * 2004-06-07 2011-01-12 Sony Corporation Data recording apparatus, method and program, data reproducing apparatus, method and program, and recording medium
JP4542546B2 * 2004-06-10 2010-09-15 Panasonic Corporation Data processing device
KR100667756B1 * 2004-07-01 2007-01-11 Samsung Electronics Co., Ltd. Method and apparatus for storing/searching broadcasting stream
ES2383276T3 2004-08-17 2012-06-19 Panasonic Corporation Image coding device and method
US7613384B2 (en) * 2004-08-17 2009-11-03 Lg Electronics Inc. Method for configuring composite file structure for data reproduction, and method and apparatus for reproducing data using the composite file structure
US7725010B2 (en) * 2004-08-17 2010-05-25 Lg Electronics, Inc. Method and apparatus of reproducing data recorded on recording medium and local storage
US7609939B2 (en) * 2004-08-17 2009-10-27 Lg Electronics Inc. Method and apparatus of reproducing data recorded on recording medium and local storage
US7609945B2 (en) * 2004-08-17 2009-10-27 Lg Electronics Inc. Recording medium, and method and apparatus for reproducing data from the recording medium
JP2006066943A (en) 2004-08-24 2006-03-09 Sony Corp Information processing apparatus and method, and program
WO2006025606A1 (en) * 2004-09-02 2006-03-09 Nec Corporation Data broadcasting recording method and device, and recording medium
US7609947B2 (en) * 2004-09-10 2009-10-27 Panasonic Corporation Method and apparatus for coordinating playback from multiple video sources
US20060056804A1 (en) * 2004-09-13 2006-03-16 Seo Kang S Recording medium, and method and apparatus for reproducing data from the recording medium
US7599611B2 (en) * 2004-09-13 2009-10-06 Lg Electronics Co. Recording medium, and method and apparatus of reproducing data recorded on the same
KR100782810B1 2005-01-07 2007-12-06 Samsung Electronics Co., Ltd. Apparatus and method of reproducing a storage medium having metadata for providing enhanced search
US8842977B2 (en) 2005-01-07 2014-09-23 Samsung Electronics Co., Ltd. Storage medium storing metadata for providing enhanced search function
WO2006080194A1 (en) 2005-01-26 2006-08-03 Sharp Kabushiki Kaisha Information recording/reproduction device and information recording medium
JP4968506B2 2005-03-04 2012-07-04 Sony Corporation Reproduction device, reproduction method, and program
EP1879191A4 (en) * 2005-03-22 2013-08-14 Panasonic Corp Stream data recording device, stream data recording/reproducing device, stream data reproduction device, stream data editing device, stream recording method, and stream reproducing method
US8752198B2 (en) * 2005-05-26 2014-06-10 Hewlett-Packard Development Company, L.P. Virtual write protection system
CN101228584B (en) * 2005-07-27 2010-12-15 松下电器产业株式会社 Information recording apparatus, and recording method
WO2007049189A2 (en) * 2005-10-24 2007-05-03 Koninklijke Philips Electronics N.V. Method and apparatus for editing an optical disc
CN101305425B (en) * 2005-11-07 2012-06-27 皇家飞利浦电子股份有限公司 Method and apparatus editing optical disk program
JP5200204B2 2006-03-14 2013-06-05 DivX, LLC A federated digital rights management mechanism including a trusted system
US8125987B2 (en) * 2006-03-30 2012-02-28 Broadcom Corporation System and method for demultiplexing different stream types in a programmable transport demultiplexer
JP4719053B2 * 2006-03-31 2011-07-06 Toshiba Corporation Reproduction method using entry point and recording / reproduction apparatus using this method
JP4765733B2 2006-04-06 2011-09-07 Sony Corporation Recording apparatus, recording method, and recording program
JP4591405B2 * 2006-05-10 2010-12-01 Sony Corporation Information processing apparatus, information processing method, and computer program
JP4162691B2 * 2006-09-27 2008-10-08 Toshiba Corporation Program structuring apparatus, program structuring method, and program
KR101318081B1 2006-11-21 2013-10-14 LG Display Co., Ltd. LCD and drive method thereof
JP4735524B2 * 2006-12-06 2011-07-27 Hitachi, Ltd. Recording method
JP4735525B2 2006-12-06 2011-07-27 Hitachi, Ltd. Recording method
JP4775241B2 2006-12-06 2011-09-21 Hitachi, Ltd. Recording method
JP2008152871A (en) * 2006-12-19 2008-07-03 Victor Co Of Japan Ltd Information recording and reproducing device and reproducing device
EP2122482B1 (en) 2007-01-05 2018-11-14 Sonic IP, Inc. Video distribution system including progressive playback
JP2008204560A (en) * 2007-02-21 2008-09-04 D & M Holdings Inc Reproducing device, reproducing method, program, and recording medium
JP4852453B2 2007-03-19 2012-01-11 Hitachi, Ltd. Recording apparatus, video reproduction apparatus, and special reproduction method thereof
JP5034608B2 2007-03-30 2012-09-26 Hitachi, Ltd. Recording method
CN101437150B (en) * 2007-11-16 2011-11-09 华为技术有限公司 Apparatus and method for providing association information
CN101861583B (en) 2007-11-16 2014-06-04 索尼克Ip股份有限公司 Hierarchical and reduced index structures for multimedia files
JP4924447B2 * 2008-01-25 2012-04-25 Sony Corporation Scene switching point detector, scene switching point detection method, recording device, event generator, event generation method, and playback device
JP5031608B2 * 2008-02-01 2012-09-19 Canon Inc. Playback apparatus and storage medium
JP4506879B2 * 2008-06-09 2010-07-21 Sony Corporation Recording apparatus, recording method, program, and recording system
WO2010111233A2 (en) * 2009-03-23 2010-09-30 Medicis Technologies Corporation Analysis of real time backscatter data for fault signal generation in a medical hifu device
US8781122B2 (en) 2009-12-04 2014-07-15 Sonic Ip, Inc. Elementary bitstream cryptographic material transport systems and methods
JP5494152B2 2010-04-08 2014-05-14 Sony Corporation Information processing apparatus, information recording medium, information processing method, and program
JP5601006B2 2010-04-08 2014-10-08 Sony Corporation Information processing apparatus, information recording medium, information processing method, and program
JP5577805B2 2010-04-08 2014-08-27 Sony Corporation Information processing apparatus, information recording medium, information processing method, and program
JP2011223248A (en) 2010-04-08 2011-11-04 Sony Corp Information processing device, information recording medium and information processing method, and program
JP2011223247A (en) 2010-04-08 2011-11-04 Sony Corp Information processing device, information recording medium and information processing method, and program
JP2010225266A (en) * 2010-04-23 2010-10-07 Toshiba Corp Reproducing method using entry point, and recording and reproducing device using the method
JP5110135B2 * 2010-08-30 2012-12-26 Sony Corporation Recording media
US8914534B2 (en) 2011-01-05 2014-12-16 Sonic Ip, Inc. Systems and methods for adaptive bitrate streaming of media stored in matroska container files using hypertext transfer protocol
JP2011175729A (en) * 2011-04-08 2011-09-08 Hitachi Ltd Recording medium and reproduction method
US8812662B2 (en) 2011-06-29 2014-08-19 Sonic Ip, Inc. Systems and methods for estimating available bandwidth and performing initial stream selection when streaming content
CN102915316A * 2011-08-05 2013-02-06 Acer Inc. Electronic system and multimedia playing method
KR101928910B1 2011-08-30 2018-12-14 Sonic IP, Inc. Systems and methods for encoding and streaming video encoded using a plurality of maximum bitrate levels
US9467708B2 (en) 2011-08-30 2016-10-11 Sonic Ip, Inc. Selection of resolutions for seamless resolution switching of multimedia content
US8806188B2 (en) 2011-08-31 2014-08-12 Sonic Ip, Inc. Systems and methods for performing adaptive bitrate streaming using automatically generated top level index files
US8799647B2 (en) 2011-08-31 2014-08-05 Sonic Ip, Inc. Systems and methods for application identification
US8964977B2 (en) 2011-09-01 2015-02-24 Sonic Ip, Inc. Systems and methods for saving encoded media streamed using adaptive bitrate streaming
US8909922B2 (en) 2011-09-01 2014-12-09 Sonic Ip, Inc. Systems and methods for playing back alternative streams of protected content protected using common cryptographic information
JP5999405B2 2011-11-28 2016-09-28 Sony Corporation Information processing apparatus, information processing method, and program
JP2013115552A (en) 2011-11-28 2013-06-10 Sony Corp Information processor, information processing method, and program
JP6010900B2 * 2011-11-29 2016-10-19 Sony Corporation Information processing apparatus, information processing method, and program
US8918908B2 (en) 2012-01-06 2014-12-23 Sonic Ip, Inc. Systems and methods for accessing digital content using electronic tickets and ticket tokens
US9936267B2 (en) 2012-08-31 2018-04-03 Divx Cf Holdings Llc System and method for decreasing an initial buffering period of an adaptive streaming system
US9191457B2 (en) 2012-12-31 2015-11-17 Sonic Ip, Inc. Systems, methods, and media for controlling delivery of content
US9313510B2 (en) 2012-12-31 2016-04-12 Sonic Ip, Inc. Use of objective quality measures of streamed content to reduce streaming bandwidth
US9906785B2 (en) 2013-03-15 2018-02-27 Sonic Ip, Inc. Systems, methods, and media for transcoding video data according to encoding parameters indicated by received metadata
US10397292B2 (en) 2013-03-15 2019-08-27 Divx, Llc Systems, methods, and media for delivery of content
US9094737B2 (en) 2013-05-30 2015-07-28 Sonic Ip, Inc. Network video streaming with trick play based on separate trick play files
US9380099B2 (en) 2013-05-31 2016-06-28 Sonic Ip, Inc. Synchronizing multiple over the top streaming clients
US9100687B2 (en) 2013-05-31 2015-08-04 Sonic Ip, Inc. Playback synchronization across playback devices
US9386067B2 (en) 2013-12-30 2016-07-05 Sonic Ip, Inc. Systems and methods for playing adaptive bitrate streaming content by multicast
US9866878B2 (en) 2014-04-05 2018-01-09 Sonic Ip, Inc. Systems and methods for encoding and playing back video at different frame rates using enhancement layers
CA2952847A1 (en) 2014-08-07 2016-02-11 Sonic Ip, Inc. Systems and methods for protecting elementary bitstreams incorporating independently encoded tiles
EP3243130B1 (en) 2015-01-06 2019-08-14 Sonic IP, Inc. Systems and methods for encoding and sharing content between devices
ES2768979T3 (en) 2015-02-27 2020-06-24 Divx Llc System and method for frame duplication and frame magnification in streaming and encoding of live video
US10075292B2 (en) 2016-03-30 2018-09-11 Divx, Llc Systems and methods for quick start-up of playback
US10231001B2 (en) 2016-05-24 2019-03-12 Divx, Llc Systems and methods for providing audio content during trick-play playback
US10129574B2 (en) 2016-05-24 2018-11-13 Divx, Llc Systems and methods for providing variable speeds in a trick-play mode
US10148989B2 (en) 2016-06-15 2018-12-04 Divx, Llc Systems and methods for encoding video content
US10498795B2 (en) 2017-02-17 2019-12-03 Divx, Llc Systems and methods for adaptive switching between multiple content delivery networks during adaptive bitrate streaming
ES2974683T3 (en) 2019-03-21 2024-07-01 Divx Llc Systems and methods for multimedia swarms
JP7540233B2 * 2020-08-06 2024-08-27 Omron Corporation Control device

Family Cites Families (33)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2807456B2 1995-02-03 1998-10-08 Toshiba Corporation Image information recording medium and method of manufacturing the same
JP4095681B2 * 1995-02-24 2008-06-04 Hitachi, Ltd. Data recording method and apparatus, and data recording / reproducing apparatus
US6002834A (en) 1995-02-24 1999-12-14 Hitachi, Ltd. Optical disk having table relating sector address and time and optical disk reproducing apparatus
JP3309656B2 1995-08-18 2002-07-29 Sony Corporation Image processing apparatus and image processing method
JPH09238303A (en) 1996-02-29 1997-09-09 Ricoh Co Ltd Digital still video camera
US5838876A (en) * 1996-09-24 1998-11-17 Sony Corporation Frame-accurate edit and playback in digital stream recording
JPH10145735A (en) * 1996-11-05 1998-05-29 Toshiba Corp Decoding device and method for reproducing picture and sound
US5938876A (en) * 1997-01-31 1999-08-17 Morrison International, Inc. Method of making eyeglass lenses with add on segments
JP4363671B2 1997-03-20 2009-11-11 Sony Corporation Data reproducing apparatus and data reproducing method
JP3791114B2 * 1997-04-30 2006-06-28 Sony Corporation Signal reproducing apparatus and method
EP1300850A3 (en) * 1997-09-17 2004-06-09 Matsushita Electric Industrial Co., Ltd. Video data editing apparatus, optical disc for use a recording medium of a video data editing apparatus, and computer-readable recoding medium storing an editing program
EP0910087B1 (en) * 1997-10-17 2011-11-30 Sony Corporation Recording apparatus and method, reproducing apparatus and method, recording/reproducing apparatus and method, recording medium and distribution medium
JPH11149717A (en) 1997-11-19 1999-06-02 Toshiba Corp Decoding processing method and device
JP3324480B2 (en) * 1997-12-05 2002-09-17 松下電器産業株式会社 Recording / reproducing device and reproducing device
TW385436B (en) 1997-12-12 2000-03-21 Toshiba Corp Digital recording system using variable recording rate
JPH11259992A (en) 1998-03-10 1999-09-24 Toshiba Corp Information record medium, information recorder, information editing device and digital broadcast recorder
JP2002511975A (en) * 1998-03-19 2002-04-16 コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ Recording and / or playback of real-time information on a disc such as a record carrier
TW439054B (en) * 1998-04-08 2001-06-07 Matsushita Electric Ind Co Ltd Optical disc, optical disc recording method and apparatus, and optical disc reproducing method and apparatus
JPH11298845A (en) * 1998-04-08 1999-10-29 Matsushita Electric Ind Co Ltd Optical disk, optical disk recorder and optical disk player
JP3997367B2 (en) 1998-04-30 2007-10-24 ソニー株式会社 Recording / reproducing apparatus and method, and recording medium
JP3356991B2 (en) * 1998-06-17 2002-12-16 株式会社日立製作所 Optical disc, recording method, recording device, reproducing method, and reproducing device
GB9813831D0 (en) 1998-06-27 1998-08-26 Philips Electronics Nv Frame-accurate editing of encoded A/V sequences
JP3677176B2 (en) 1998-07-07 2005-07-27 株式会社東芝 Information recording method, medium and reproducing apparatus for object division and erasure prohibition flag processing
JP3383587B2 (en) * 1998-07-07 2003-03-04 株式会社東芝 Still image continuous information recording method, optical disc, optical disc information reproducing apparatus and information reproducing method
EP0986062A1 (en) * 1998-09-07 2000-03-15 Deutsche Thomson-Brandt Gmbh Method for addressing a bit stream recording
EP0991072A1 (en) * 1998-09-07 2000-04-05 Deutsche Thomson-Brandt Gmbh Method for addressing a bit stream recording
EP1122951B1 (en) * 1998-09-08 2003-11-19 Sharp Kabushiki Kaisha Time-varying image editing method and time-varying image editing device
CA2289958C (en) * 1998-11-19 2003-01-21 Tomoyuki Okada Information recording medium, apparatus and method for recording or reproducing data thereof
EP1021048A3 (en) 1999-01-14 2002-10-02 Kabushiki Kaisha Toshiba Digital video recording system and its recording medium
JP4763892B2 (en) * 1999-03-01 2011-08-31 コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ Method for storing a real-time stream of information signals on a disc-shaped record carrier
JP2001076473A (en) * 1999-09-07 2001-03-23 Fujitsu Ltd Recording and reproducing device
JP4389365B2 (en) * 1999-09-29 2009-12-24 ソニー株式会社 Transport stream recording apparatus and method, transport stream playback apparatus and method, and program recording medium
JP4328989B2 (en) * 1999-11-24 2009-09-09 ソニー株式会社 Reproduction device, reproduction method, and recording medium

Also Published As

Publication number Publication date
KR20080091525A (en) 2008-10-13
DK1569449T3 (en) 2013-01-02
CN1381137A (en) 2002-11-20
EP1280347A4 (en) 2004-10-06
US20050025461A1 (en) 2005-02-03
HK1050975A1 (en) 2003-07-11
JP5063808B2 (en) 2012-10-31
JP2012065363A (en) 2012-03-29
JP2011166801A (en) 2011-08-25
CN1607825A (en) 2005-04-20
KR100875782B1 (en) 2008-12-24
BRPI0117209B1 (en) 2019-01-15
ES2399331T3 (en) 2013-03-27
JP2012130005A (en) 2012-07-05
EP2256739A3 (en) 2012-01-18
JP4919129B2 (en) 2012-04-18
JP2012065359A (en) 2012-03-29
EP2256738A3 (en) 2012-01-18
JP4999972B2 (en) 2012-08-15
KR100948439B1 (en) 2010-03-17
JP2011004413A (en) 2011-01-06
HU229461B1 (en) 2013-12-30
RU2314653C2 (en) 2008-01-10
PL351918A1 (en) 2003-06-30
JP4915484B2 (en) 2012-04-11
MXPA01013122A (en) 2002-06-04
JP2012130019A (en) 2012-07-05
JP4919127B2 (en) 2012-04-18
JP5051804B2 (en) 2012-10-17
JP4919128B2 (en) 2012-04-18
BR0106082A (en) 2002-05-21
WO2001082606A1 (en) 2001-11-01
EP1569449A2 (en) 2005-08-31
US8670646B2 (en) 2014-03-11
JP2011166803A (en) 2011-08-25
PT1569449E (en) 2013-01-24
JP2012065361A (en) 2012-03-29
JP2010148140A (en) 2010-07-01
KR20020020918A (en) 2002-03-16
JP5051805B2 (en) 2012-10-17
JP5051807B2 (en) 2012-10-17
IL147155A (en) 2008-07-08
BRPI0106082B1 (en) 2019-01-15
JP2011166800A (en) 2011-08-25
EP2256737A3 (en) 2012-01-11
EP1280347A1 (en) 2003-01-29
JP5051802B2 (en) 2012-10-17
CA2377690C (en) 2013-09-10
EP1569449B1 (en) 2012-11-21
US7738776B2 (en) 2010-06-15
JP2011166802A (en) 2011-08-25
CN100348033C (en) 2007-11-07
EP2256736A3 (en) 2012-01-11
US20020135607A1 (en) 2002-09-26
JP4947159B2 (en) 2012-06-06
CN100394791C (en) 2008-06-11
EP2256738A2 (en) 2010-12-01
HUP0202198A2 (en) 2002-10-28
RO122068B1 (en) 2008-11-28
JP2002158972A (en) 2002-05-31
RU2005117968A (en) 2006-12-20
NO20016292D0 (en) 2001-12-20
IL147155A0 (en) 2002-08-14
JP5051803B2 (en) 2012-10-17
CA2377690A1 (en) 2001-11-01
EP1569449A3 (en) 2005-11-09
EP2256736A2 (en) 2010-12-01
JP4599740B2 (en) 2010-12-15
NO20016292L (en) 2002-02-20
EP2256739A2 (en) 2010-12-01
EP2256737A2 (en) 2010-12-01
CZ20014489A3 (en) 2002-10-16
JP2012130006A (en) 2012-07-05
JP2011135616A (en) 2011-07-07
JP4919130B2 (en) 2012-04-18
AU779673B2 (en) 2005-02-03
SK18982001A3 (en) 2003-01-09

Similar Documents

Publication Publication Date Title
CA2377690C (en) Information processing method and apparatus, program and recording medium
US7580613B2 (en) Information processing apparatus and method, recorded medium, and program
US7941033B2 (en) Information processing method and apparatus, program and recording medium
US7236687B2 (en) Information processing apparatus and method, program, and recording medium
JP4517267B2 (en) Recording apparatus and method, reproducing apparatus and method, program, and recording medium
EP2546833A2 (en) Information processing apparatus, method and computer program
ZA200110323B (en) Information processing apparatus and method, program, and recorded medium.

Legal Events

Date Code Title Description
MK14 Patent ceased section 143(a) (annual fees not paid) or expired