GB2361130A - Identifying video and/or audio material

Identifying video and/or audio material

Info

Publication number
GB2361130A
Authority
GB
United Kingdom
Prior art keywords
medium
identifiers
video
tape
audio
Prior art date
Legal status
Withdrawn
Application number
GB0008436A
Other versions
GB0008436D0 (en)
Inventor
Morgan William Amos David
James Hedley Wilkinson
Current Assignee
Sony Europe Ltd
Original Assignee
Sony United Kingdom Ltd
Priority date
Filing date
Publication date
Application filed by Sony United Kingdom Ltd filed Critical Sony United Kingdom Ltd
Priority to GB0008436A (GB2361130A)
Publication of GB0008436D0
Priority to PCT/GB2001/001458 (WO2001075885A2)
Priority to AU42651/01A (AU4265101A)
Priority to EP08005086.7A (EP1939878A3)
Priority to JP2001573478A (JP2003529877A)
Priority to EP01915566A (EP1208566A2)
Publication of GB2361130A
Priority to US10/016,828 (US7778516B2)
Priority to US11/946,099 (US20080069515A1)

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00 Details of television systems
    • H04N5/76 Television signal recording
    • H04N5/91 Television signal processing therefor
    • H04N5/92 Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback
    • H04N5/9201 Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback involving the multiplexing of an additional signal and the video signal
    • G PHYSICS
    • G11 INFORMATION STORAGE
    • G11B INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B27/00 Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B27/02 Editing, e.g. varying the order of information signals recorded on, or reproduced from, record carriers
    • G11B27/031 Electronic editing of digitised analogue information signals, e.g. audio or video signals
    • G PHYSICS
    • G11 INFORMATION STORAGE
    • G11B INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B27/00 Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B27/10 Indexing; Addressing; Timing or synchronising; Measuring tape travel
    • G11B27/102 Programmed access in sequence to addressed parts of tracks of operating record carriers
    • G11B27/107 Programmed access in sequence to addressed parts of tracks of operating record carriers of operating tapes
    • G PHYSICS
    • G11 INFORMATION STORAGE
    • G11B INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B27/00 Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B27/10 Indexing; Addressing; Timing or synchronising; Measuring tape travel
    • G11B27/11 Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information not detectable on the record carrier
    • G PHYSICS
    • G11 INFORMATION STORAGE
    • G11B INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B27/00 Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B27/10 Indexing; Addressing; Timing or synchronising; Measuring tape travel
    • G11B27/19 Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier
    • G11B27/28 Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier by using information signals recorded by the same method as the main recording
    • G PHYSICS
    • G11 INFORMATION STORAGE
    • G11B INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B27/00 Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B27/10 Indexing; Addressing; Timing or synchronising; Measuring tape travel
    • G11B27/19 Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier
    • G11B27/28 Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier by using information signals recorded by the same method as the main recording
    • G11B27/30 Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier by using information signals recorded by the same method as the main recording on the same track as the main recording
    • G11B27/3027 Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier by using information signals recorded by the same method as the main recording on the same track as the main recording used signal is digitally coded
    • G11B27/3036 Time code signal
    • G11B27/3054 Vertical Interval Time code [VITC]
    • G PHYSICS
    • G11 INFORMATION STORAGE
    • G11B INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B27/00 Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B27/10 Indexing; Addressing; Timing or synchronising; Measuring tape travel
    • G11B27/19 Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier
    • G11B27/28 Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier by using information signals recorded by the same method as the main recording
    • G11B27/32 Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier by using information signals recorded by the same method as the main recording on separate auxiliary tracks of the same or an auxiliary record carrier
    • G11B27/322 Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier by using information signals recorded by the same method as the main recording on separate auxiliary tracks of the same or an auxiliary record carrier used signal is digitally coded
    • G11B27/323 Time code signal, e.g. on a cue track as SMPTE- or EBU-time code
    • G PHYSICS
    • G11 INFORMATION STORAGE
    • G11B INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B27/00 Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B27/10 Indexing; Addressing; Timing or synchronising; Measuring tape travel
    • G11B27/34 Indicating arrangements
    • G PHYSICS
    • G11 INFORMATION STORAGE
    • G11B INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B2220/00 Record carriers by type
    • G11B2220/90 Tape-like record carriers
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00 Details of television systems
    • H04N5/76 Television signal recording
    • H04N5/765 Interface circuits between an apparatus for recording and another apparatus
    • H04N5/77 Interface circuits between an apparatus for recording and another apparatus between a recording apparatus and a television camera
    • H04N5/772 Interface circuits between an apparatus for recording and another apparatus between a recording apparatus and a television camera the recording apparatus and the television camera being placed in the same enclosure
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00 Details of television systems
    • H04N5/76 Television signal recording
    • H04N5/78 Television signal recording using magnetic recording
    • H04N5/782 Television signal recording using magnetic recording on tape
    • H04N5/7824 Television signal recording using magnetic recording on tape with rotating magnetic heads
    • H04N5/7826 Television signal recording using magnetic recording on tape with rotating magnetic heads involving helical scanning of the magnetic tape
    • H04N5/78263 Television signal recording using magnetic recording on tape with rotating magnetic heads involving helical scanning of the magnetic tape for recording on tracks inclined relative to the direction of movement of the tape
    • H04N5/78266 Television signal recording using magnetic recording on tape with rotating magnetic heads involving helical scanning of the magnetic tape for recording on tracks inclined relative to the direction of movement of the tape using more than one track for the recording of one television field or frame, i.e. segmented recording

Abstract

A video and/or audio signal processing system comprises a recorder for recording video and/or audio material on a recording medium, the recorder including a first generator for generating first material identifiers for identifying respective pieces of material on the medium such that each piece is differentiated from other pieces on the medium, and a second generator for generating second, universally unique, identifiers for pieces of material, the second generator generating the second identifiers in accordance with the first identifiers.

Description

Identifying Video and/or Audio Material

The present invention relates to identifying video and/or audio and/or data.
It has been proposed to identify video and/or audio material and/or data using UMIDs, which are identifiers that universally uniquely identify material. UMIDs in principle could identify video material to the accuracy of one frame. There is a basic UMID and an extended UMID. A basic UMID has 32 bytes each of 8 bits and an extended UMID has 64 bytes. It may be possible in some circumstances to use a reduced data structure UMID where, for instance, some data is common to a plurality of UMIDs.
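By way of illustration, the byte budget of the basic and extended UMID can be written down as follows. This is a minimal sketch for orientation only; the field names are taken from the component descriptions later in this document, not from any reference implementation.

```python
# Sketch of the UMID byte budget: a basic UMID is 32 bytes and an
# extended UMID appends a further 32 bytes of signature metadata.

BASIC_LAYOUT = [
    ("universal_label", 12),   # registered SMPTE label
    ("length", 1),             # length of the remaining part
    ("instance_number", 3),    # distinguishes instances of the same material
    ("material_number", 16),   # identifies the clip itself
]

SIGNATURE_LAYOUT = [
    ("time_date", 8),          # time/date of content unit creation
    ("spatial_coordinates", 12),
    ("country", 4),
    ("organisation", 4),
    ("user", 4),
]

assert sum(n for _, n in BASIC_LAYOUT) == 32                      # basic UMID
assert sum(n for _, n in BASIC_LAYOUT + SIGNATURE_LAYOUT) == 64   # extended UMID
```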
It is clearly desirable to associate the identifiers as closely as possible with the material which they identify, and most preferably to include the identifiers in the material or, in the case of material recorded on a recording medium, to record the identifiers on the medium. However, there is little or no spare data capacity in some media, especially tape. In addition it is desirable to record other data, such as Good Shot Markers (GSMs), on the medium with the material. Thus other desirable data competes for space on the media.
According to the present invention, there is provided a video and/or audio and/or data signal processing system comprising a recorder for recording video and/or audio and/or data material on a recording medium, the recorder including a first generator for generating first material identifiers for identifying respective pieces of material on the medium such that each piece is differentiated from other pieces on the medium, and a second generator for generating second, universally unique, identifiers for pieces of material, the second identifiers being generated in respect of one or more of the first identifiers.
A second identifier may be generated for each of the first identifiers. A second identifier may be generated in respect of a group of two or more first identifiers.
The first identifiers, which need to distinguish the pieces of material on the medium but need not be universally unique, can thus be smaller than universally unique identifiers. For example the first identifiers may comprise only two bytes. That is sufficient to allow the second generator to generate the second identifiers.
Also, it minimises the amount of data which needs to be stored on the medium to identify material thereon, allowing other data, e.g. Good Shot Markers, to be recorded.
In preferred embodiments, a medium identifier is provided which identifies the medium. It is for example a serial number. The second generator generates the second identifiers in dependence on the medium identifier and the first identifiers of the material on the medium.
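A minimal sketch of this dependence is given below. The hashing approach is purely illustrative (the embodiments described later synthesise UMIDs field by field, see Figure 5); the function name and the 16-byte output size are assumptions made for the example.

```python
import hashlib

def second_identifier(medium_id: bytes, first_id: int) -> bytes:
    """Derive a globally unique second identifier from the medium
    identifier (e.g. a serial number) and a short, medium-local first
    identifier. Hashing is one illustrative way to make the pair
    (medium, piece) into a fixed-size unique value."""
    return hashlib.sha1(medium_id + first_id.to_bytes(2, "big")).digest()[:16]

# Two pieces on the same medium get different identifiers:
assert second_identifier(b"TAPE-0001", 1) != second_identifier(b"TAPE-0001", 2)
```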
In another embodiment, the medium is housed in a housing supporting a data store. The data store preferably stores the medium identifier and may also store at least one of the first identifiers. Most preferably the first identifiers are recorded on the medium and the store stores only the last produced of the first identifiers to enable the first generator to produce the first identifiers in a regulated manner.
The use of the first identifiers, or of the first identifiers plus medium identifiers, which may be placed on the medium and/or in the data store, allows existing record medium formats, especially tape formats, to use the identifiers and to be incorporated in a production and distribution system which uses universally unique identifiers such as UMIDs and metadata bases. Existing tape formats can accommodate the first identifiers, and tape cassettes having data stores are in current use (at the application date of this application).
The embodiments of the invention address the problem of labelling tapes and other recording media by providing the medium identifier. The use of short first identifiers allows GSMs to be recorded. The data store is not essential but can be used if available.
These and other aspects and advantages of the invention are set out in the following description and in the claims.
For a better understanding of the present invention, reference will now be made, by way of example, to the accompanying drawings in which: Figures 1 to 11 illustrate illustrative versions of the present invention; Figures 12 and 13 illustrate UMIDs; Figure 14 illustrates a data structure for metadata in a database; Figures 15 to 21 illustrate an A-Box; Figures 22 to 27 illustrate a B-Box; and Figures 28 to 30 illustrate Tape IDs in linear time code.
Overview - Figures 1 to 11

The following description refers to: UMIDs, which are described with reference to Figures 12 and 13; Metadata, which is described with reference to Figure 14; the A-Box, which is described with reference to Figures 15 to 21; the B-Box, which is described with reference to Figures 22 to 27; and Tape IDs, which are described with reference to Figures 28 to 30.
Referring to Figure 1, a camcorder 500 is equipped with an A-box 152. The camcorder 500 records video and audio material on a recording medium which may be a tape 126 or a disc, for example. The following description refers to tape for convenience and because tape is currently the most common recording medium for camcorders. The tape is housed in a cassette which supports a data store 502 additional to the tape. The store 502 may be a Telefile (Trade Mark). Recorded material is transferred on the tape 126 to a VTR 204 which is connected to a B-Box 178. Both the A-Box and the B-Box are linked by communication links 174 and 180 to a database processor 176 storing a metadata base. Examples of metadata stored in the database are given in the section Metadata below. The metadata relates to the material recorded on the tape 126.
In the present embodiment of the invention, the metadata is linked to the material by UMIDs, which are described in the section UMIDs below, and, in accordance with the invention, by at least MURNs - Material Reference Numbers. UMIDs, which are universally unique identifiers, have 23, 32 or 64 bytes as described below. MURNs have 2 bytes in currently preferred embodiments of the invention and are intended to uniquely identify each piece of material on the tape but not to be universally unique. Together with the time code bits, the MURNs identify each clip on the tape to frame accuracy.
In the embodiment of Figure 1, a tape ID is recorded in the datastore 502. Also the tape ID may be applied to the tape. The tape ID may be recorded on the tape during a prestriping process.
MURNs are generated as the material is recorded on the tape. The MURNs are preferably recorded in the user bits of tape time codes, in a similar way to the recording of tape IDs in time code as described below. Preferably, at least the last recorded one of the MURNs is also recorded in the data store 502. All the MURNs may be recorded in the data store 502 but that is not essential in this embodiment.
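Time code carries 32 user bits per frame, in eight 4-bit groups, which is ample for a 2-byte MURN. The following sketch shows one illustrative packing; the placement of the MURN in the first four groups is an assumption for the example, not a layout mandated by the embodiments.

```python
def pack_user_bits(murn: int) -> list:
    """Place a 2-byte MURN into four of the eight 4-bit user-bit groups
    of a time code frame (32 user bits in total), leaving the remaining
    groups free for other data such as a tape ID."""
    assert 0 <= murn <= 0xFFFF
    return [(murn >> shift) & 0xF for shift in (12, 8, 4, 0)] + [0, 0, 0, 0]

def unpack_user_bits(groups: list) -> int:
    murn = 0
    for nibble in groups[:4]:
        murn = (murn << 4) | nibble
    return murn

assert unpack_user_bits(pack_user_bits(5)) == 5
```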
The camcorder 500 has a machine ID which is for example the serial number of the camcorder. Preferably in this embodiment the machine ID is recorded in the data store 502. It may be recorded on the tape if there is sufficient space.
The tape 126, after material is recorded on it, is transferred to a VTR 204 which is coupled to a B-Box 178. The VTR 204 and B-Box 178 together read the MURNs, the Tape ID, and the machine ID, and the B-Box associates a UMID with each MURN. In this embodiment, the UMID is retrieved from the database 176. Each UMID is associated with a combination of the Tape ID and a MURN and the associated identifiers (UMID, Tape ID and MURN) are stored in the database 176. For that purpose the B-Box is coupled to the database processor via a communications link which may be a network link, e.g. an internet link.
In the embodiment of Figure 1, the A-Box 152 generates, for each MURN, a UMID and transfers to the database 176, via a communications link 174, the UMIDs, MURNs, tape ID and any metadata which is generated at the A-Box. Thus the UMID is available in the database to be retrieved by the B-Box 178.
Good Shot Markers (GSMs) and/or other metadata may be stored on the tape.
Referring to Figure 2, the embodiment of Figure 2 differs from that of Figure 1 in that the A-Box 152 is omitted, MURNs, Tape ID and Machine ID are generated by the camcorder without the A-Box, and UMIDs are generated by the B-Box instead of being retrieved from the database 176. Preferably, the Tape ID and machine ID are recorded in the datastore 502 and the MURNs are recorded in the user bits of time codes on the tape. However they may be recorded in the other ways described with reference to the embodiment of Figure 1. The UMIDs are synthesised in the B-Box using the machine ID, Tape ID and MURNs.
Good Shot Markers (GSMs) and/or other metadata may be stored on the tape.
Referring to Figure 3, the embodiment of Figure 3 differs from that of Figure 1 in that: the A-Box 152 is omitted; MURNs are generated by the camcorder without the A-Box; Tape ID is manually entered at the B-Box; machine ID identifying the camcorder is not used; the tape has no datastore; and UMIDs are generated by the B-Box instead of being retrieved from the database 176. The MURNs are recorded in the user bits of time codes on the tape. The UMIDs are synthesised in the B-Box using the MURNs, a machine ID identifying the VTR 204 and the manually entered Tape ID. Preferably the Tape ID is manually written on a physical label on the cassette. The UMIDs and associated MURNs and Tape ID are transferred to the database 176 via the link 180. It is possible that MURNs are duplicated on a tape in the absence of the datastore: that may be detected by the database 176.
Good Shot Markers (GSMs) and/or other metadata may be stored on the tape.
Referring to Figure 4, the embodiment of Figure 4 differs from that of Figure 1 in that: the tape has no datastore 502; the Tape ID is manually entered at the B-Box; the machine ID identifying the camcorder 500 is not used; and UMIDs are generated by the B-Box instead of being retrieved from the database 176. The MURNs are recorded in the user bits of time codes on the tape. The UMIDs are synthesised in the B-Box using the MURNs and the manually entered Tape ID. The Tape ID is manually written on a physical label on the cassette. The UMIDs and associated MURNs and Tape ID are transferred to the database 176 via the link 180. It is possible that MURNs are duplicated on a tape in the absence of the datastore: that may be detected by the database 176.
In the embodiment of Figure 4 the A-Box 152 associates with each MURN any metadata which is generated at the A-Box and the tape ID, which is entered at the A-Box and also written on a label on the tape 126. Thus the database can associate the metadata, MURNs and tape ID transferred to it from the A-Box with the UMIDs and associated MURNs and tape ID from the B-Box.
The B-Box may also use the machine ID of the camcorder 500 in association with the UMIDs, and the A-Box may transfer the machine ID of the camcorder if that ID is recorded on the tape or written on the label of the tape.
The Tape ID may be recorded on the tape and thus it may be detected by the VTR 204 and B-Box avoiding the need to manually enter it at the B-Box.
Good Shot Markers (GSMs) and/or other metadata may be stored on the tape.
Referring to Figure 5, there is shown a UMID which is further described in the section UMIDs below. The UMID may be generated based on the data in the datastore 502 or recorded on the tape, such data including at least the MURNs, preferably also the Tape ID and most preferably also the machine ID. Figure 5 assumes that data is stored in the datastore (Telefile) 502. When generating the UMID: the UMID type value, byte 11 of the universal label, is set, the default being 04h, i.e. group; the creation type, byte 12 of the universal label, is set locally at ingestion; and the material number is set, comprising 8 bytes of time snap set at ingest, 2 bytes of random number (Rnd) and 6 bytes of machine node which is made up of data values passed through the datastore 502.
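The following sketch assembles a 32-byte basic UMID along those lines. The fixed label bytes 1 to 10 are left as placeholders, and the millisecond time snap and the function name are assumptions made for the example.

```python
import os
import time

def synthesise_umid(machine_node: bytes, now=None) -> bytes:
    """Assemble a 32-byte basic UMID along the lines of Figure 5: byte 11
    of the label is the UMID type (04h = group by default), byte 12 the
    creation type set at ingest, and the 16-byte material number is an
    8-byte time snap, 2 random bytes and a 6-byte machine node built
    from values passed through the datastore 502."""
    LABEL_PREFIX = bytes(10)               # placeholder for fixed bytes 1-10
    umid_type, creation_type = 0x04, 0x00  # group; creation type set locally
    label = LABEL_PREFIX + bytes([umid_type, creation_type])
    length = bytes([0x13])                 # '13h' for a basic UMID
    instance = bytes(3)                    # zero instance: source material
    seconds = now if now is not None else time.time()
    time_snap = int(seconds * 1000).to_bytes(8, "big")
    material = time_snap + os.urandom(2) + machine_node[:6]
    return label + length + instance + material

assert len(synthesise_umid(b"\x00\x01\x02\x03\x04\x05")) == 32
```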
Referring to Figures 6 to 8, an example of the operation of the system of Figure 1 will be given. In Figures 6 to 8 "tape label" denotes the datastore 502, e.g. a Telefile. It is assumed that the Tape ID is stored in at least the datastore 502.
Referring to Figure 6A, a blank tape is inserted into the camcorder 500. The datastore stores the Tape ID and a number 0 indicating the number of tape erasures.
Referring to Figure 6B, assume 4 clips have been recorded on the tape by the same camcorder 500, camcorder A. The clips have respective MURNs 1 to 4 allocated to them and recorded in the user bits of time codes on the tape. As shown in Figure 6C, assume one more clip with MURN 5 is recorded on the tape and then the tape is ejected from the camcorder 500. The MURN 5 is stored in the datastore 502 and the machine ID (or a shortened proxy version thereof) is stored in the datastore 502. The MURN 5 is recorded in the data store to allow the next correct MURN number to be generated when the tape is next used.
Referring to Figure 7A, assume the same tape is inserted into another camcorder B. The datastore 502 indicates that the last recorded MURN is 5. As shown in Figure 7B, more clips are recorded on the tape, and the clips are given MURNs 6, 7.... As shown in Figure 7C, when the tape is ejected from camcorder B, the Machine ID B is recorded in the data store together with the last recorded MURN (7). The machine ID of camcorder A is retained and the last MURN (5) recorded with machine A is retained.
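The behaviour of the datastore across Figures 6 and 7 can be sketched as follows; the class and method names are invented for illustration.

```python
class CassetteStore:
    """Behaviour of the cassette datastore (e.g. a Telefile) in Figures
    6 and 7: it holds the tape ID, an erasure count and, per machine,
    the last MURN written, so that the next machine continues the
    sequence instead of re-using numbers."""

    def __init__(self, tape_id):
        self.tape_id = tape_id
        self.erasures = 0
        self.last_murn_by_machine = {}   # machine ID -> last MURN recorded

    def next_murn(self):
        return max(self.last_murn_by_machine.values(), default=0) + 1

    def on_eject(self, machine_id, last_murn):
        self.last_murn_by_machine[machine_id] = last_murn

store = CassetteStore("TAPE-0001")
store.on_eject("A", 5)          # camcorder A records clips 1..5, tape ejected
assert store.next_murn() == 6   # camcorder B resumes at MURN 6
```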
Referring to Figure 8A, assume the same tape is inserted into another machine C which is, for example, a VTR. The ID, C, of the machine is recorded in the datastore 502. The tape is partially erased, in this example over clips 2 to 6. The erased zone is denoted by MURN 8 which is recorded in the data store. An erasure number of 1 is stored in the data store 502.
Figures 8B and 8C show alternatives for full erasure. In Figure 8B, the fully erased tape is given MURN 8 which is recorded in the datastore and the erasure number is stored as 1 in the datastore. In Figure 8C, the fully erased tape is given MURN 0 and the datastore is cleared except for the Tape ID.
Those skilled in the art will be able to relate Figures 6 to 8 to Figures 2 to 4.
Figures 9 to 11 illustrate editing rules. Every tape edit event generates a new MURN. A MURN treats all tape content as a group, e.g. video plus audio 1 plus audio 2 plus ... audio n. Referring to Figure 10, a tape has 2 audio channels A1 and A2 and 1 video channel V. After editing, the video channel has a first clip (UMID 123) up to time t1, a second clip (UMID 124) from time t2 onwards and a mix of the two clips between times t1 and t2. Audio A1 has a first audio section (UMID 123) up to time t3 and a second section from time t3 onwards. Audio A2 has a first section up to time t4 and a second section from t4 onwards. Thus there are the different groups of audio and video and thus the different MURNs indicated in the database 176 in Figure 10. The UMIDs are generated from the MURNs and Tape ID.
Figure 11 shows an example of inserting a section of audio, e.g. a voice-over (VO), into an audio channel.
UMIDs - Figures 12 and 13

A UMID is described in reference [2]. Referring to Figure 12, an extended UMID is shown. It comprises a first set of 32 bytes of basic UMID and a second set of 32 bytes of signature metadata.
The first set of 32 bytes is the basic UMID. The components are:
A 12-byte Universal Label to identify this as a SMPTE UMID. It defines the type of material which the UMID identifies and also defines the methods by which the globally unique Material and locally unique Instance numbers are created.
A 1-byte length value to define the length of the remaining part of the UMID.
A 3-byte Instance number which is used to distinguish between different 'instances' of material with the same Material number.
A 16-byte Material number which is used to identify each clip. Each Material number is the same for related instances of the same material.
The second set of 32 bytes is the signature metadata, a set of packed metadata items used to create an extended UMID. The extended UMID comprises the basic UMID followed immediately by the signature metadata, which comprises:
An 8-byte time/date code identifying the time and date of the Content Unit creation.

A 12-byte value which defines the spatial co-ordinates at the time of Content Unit creation.

3 groups of 4-byte codes which register the country, organisation and user codes.

Each component of the basic and extended UMIDs will now be defined in turn.
The 12-byte Universal Label

The first 12 bytes of the UMID provide identification of the UMID by the registered string value defined in Table 1.
Byte No.  Description                                 Value (hex)
1         Object Identifier                           06h
2         Label size                                  0Ch
3         Designation: ISO                            2Bh
4         Designation: SMPTE                          34h
5         Registry: Dictionaries                      01h
6         Registry: Metadata Dictionaries             01h
7         Standard: Dictionary Number                 01h
8         Version number                              01h
9         Class: Identification and location          01h
10        Sub-class: Globally Unique Identifiers      01h
11        Type: UMID (Picture, Audio, Data, Group)    01, 02, 03, 04h
12        Type: Number creation method                XXh

Table 1: Specification of the UMID Universal Label
The hex values in Table 1 may be changed: the values given are examples. Also the bytes 1-12 may have designations other than those shown by way of example in the table. Referring to Table 1, in the example shown byte 4 indicates that bytes 5-12 relate to a data format agreed by SMPTE. Byte 5 indicates that bytes 6 to 10 relate to "dictionary" data. Byte 6 indicates that such data is "metadata" defined by bytes 7 to 10. Byte 7 indicates the part of the dictionary containing metadata defined by bytes 9 and 10. Byte 8 indicates the version of the dictionary. Byte 9 indicates the class of data and byte 10 indicates a particular item in the class.
In the present embodiment bytes 1 to 10 have fixed preassigned values. Byte 11 is variable. Thus, referring to Figure 13 and to Table 1 above, it will be noted that bytes 1 to 10 of the label of the UMID are fixed. Therefore they may be replaced by a 1-byte 'Type' code T representing the bytes 1 to 10. The type code T is followed by a length code L. That is followed by 2 bytes, one of which is byte 11 of Table 1 and the other of which is byte 12 of Table 1, an instance number (3 bytes) and a material number (16 bytes). Optionally the material number may be followed by the signature metadata of the extended UMID and/or other metadata.
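A sketch of this short form is given below; the indexing simply follows the byte positions of Table 1, and the example type code is hypothetical.

```python
def shorten(basic_umid: bytes, type_code: int = 0x01) -> bytes:
    """Replace the fixed bytes 1-10 of the label with a 1-byte type code
    T and a length code L, keeping bytes 11 and 12, the 3-byte instance
    number and the 16-byte material number: 23 bytes in total."""
    assert len(basic_umid) == 32
    body = basic_umid[10:12] + basic_umid[13:32]  # bytes 11-12, then skip the length byte
    return bytes([type_code, len(body)]) + body

assert len(shorten(bytes(32))) == 23
```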
The UMID type (byte 11) has 4 separate values to identify each of 4 different data types as follows:

'01h' = UMID for Picture material
'02h' = UMID for Audio material
'03h' = UMID for Data material
'04h' = UMID for Group material (i.e. a combination of related essence).
The last (12th) byte of the 12-byte label identifies the methods by which the material and instance numbers are created. This byte is divided into top and bottom nibbles where the top nibble defines the method of Material number creation and the bottom nibble defines the method of Instance number creation.
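As a small illustration (the method codes themselves are hypothetical):

```python
def creation_method_byte(material_method: int, instance_method: int) -> int:
    """Pack byte 12 of the label: material-number creation method in the
    top nibble, instance-number creation method in the bottom nibble."""
    assert 0 <= material_method <= 0xF and 0 <= instance_method <= 0xF
    return (material_method << 4) | instance_method

byte12 = creation_method_byte(0x2, 0x1)   # hypothetical method codes
assert byte12 >> 4 == 0x2 and byte12 & 0xF == 0x1
```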
Length

The Length is a 1-byte number with the value '13h' for basic UMIDs and '33h' for extended UMIDs.
Instance Number

The Instance number is a unique 3-byte number which is created by one of several means defined by the standard. It provides the link between a particular 'instance' of a clip and externally associated metadata. Without this instance number, all material could be linked to any instance of the material and its associated metadata.
The creation of a new clip requires the creation of a new Material number together with a zero Instance number. Therefore, a non-zero Instance number indicates that the associated clip is not the source material. An Instance number is primarily used to identify associated metadata related to any particular instance of a clip.
Material Number

The 16-byte Material number is a non-zero number created by one of several means identified in the standard. The number is dependent on a 6-byte registered port ID number, time and a random number generator.
Signature Metadata

Any component from the signature metadata may be null-filled where no meaningful value can be entered. Any null-filled component is wholly null-filled to clearly indicate to a downstream decoder that the component is not valid.
The Time-Date Format

The date-time format is 8 bytes where the first 4 bytes are a UTC (Universal Time Code) based time component. The time is defined either by an AES3 32-bit audio sample clock or SMPTE 12M depending on the essence type.
The second 4 bytes define the date based on the Modified Julian Date (MJD) as defined in SMPTE 309M. This counts up to 999,999 days after midnight on the 17th November 1858 and allows dates to the year 4597.
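The day count is straightforward to reproduce; the sketch below uses Python's date arithmetic and checks it against the well-known value MJD 51544 for 1 January 2000.

```python
from datetime import date, timedelta

MJD_EPOCH = date(1858, 11, 17)   # midnight, 17 November 1858 (SMPTE 309M)

def modified_julian_day(d: date) -> int:
    """The 4-byte date component counts whole days from the MJD epoch."""
    return (d - MJD_EPOCH).days

assert modified_julian_day(MJD_EPOCH) == 0
assert modified_julian_day(date(2000, 1, 1)) == 51544   # well-known MJD value
print(MJD_EPOCH + timedelta(days=999_999))              # last representable day
```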
The Spatial Co-ordinate Format

The spatial co-ordinate value consists of three components defined as follows:
Altitude: 8 decimal numbers specifying up to 99,999,999 metres.
Longitude: 8 decimal numbers specifying East/West 180.00000 degrees (5 decimal places active).
Latitude: 8 decimal numbers specifying North/South 90.00000 degrees (5 decimal places active).
The Altitude value is expressed as a value in metres from the centre of the earth, thus allowing altitudes below sea level.
It should be noted that although spatial co-ordinates are static for most clips, this is not true for all cases. Material captured from a moving source such as a camera mounted on a vehicle may show changing spatial co-ordinate values.
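One illustrative encoding of the three components as decimal digit strings is sketched below. The sign convention and the final packing of the digits (the 12-byte value presumably packs two digits per byte, which is not shown here) are assumptions made for the example.

```python
def encode_spatial(altitude_m: int, longitude: float, latitude: float) -> str:
    """Encode the three components as 8-digit decimal strings: altitude
    in metres from the centre of the earth, longitude and latitude with
    five active decimal places. East/West and North/South signs are
    dropped here for simplicity."""
    assert 0 <= altitude_m <= 99_999_999
    assert -180.0 <= longitude <= 180.0 and -90.0 <= latitude <= 90.0
    lon = f"{abs(longitude) * 100_000:08.0f}"   # 180.00000 -> '18000000'
    lat = f"{abs(latitude) * 100_000:08.0f}"    # 90.00000  -> '09000000'
    return f"{altitude_m:08d}{lon}{lat}"

# Roughly sea level at Greenwich (earth radius ~6,378,000 m):
print(encode_spatial(6_378_000, 0.0, 51.47783))
```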
Country Code

The Country code is an abbreviated 4-byte alpha-numeric string according to the set defined in ISO 3166. Countries which are not registered can obtain a registered alpha-numeric string from the SMPTE Registration Authority.
Organisation Code

The Organisation code is an abbreviated 4-byte alpha-numeric string registered with SMPTE. Organisation codes have meaning only in relation to their registered Country code, so that Organisation codes can have the same value in different countries.
User Code

The User code is a 4-byte alpha-numeric string assigned locally by each organisation and is not globally registered. User codes are defined in relation to their registered Organisation and Country codes, so that User codes may have the same value in different organisations and countries.
Freelance Operators

Freelance operators may use their country of domicile for the Country code and use the Organisation and User codes concatenated to form e.g. an 8-byte code which can be registered with SMPTE. These freelance codes may start with the '~' symbol (ISO 8859 character number 7Eh) followed by a registered 7-digit alphanumeric string.
Metadata - Figure 14

The following is provided, by way of example, to illustrate the possible types of metadata generated during the production of a programme, and one possible organisational approach to structuring that metadata.
Figure 14 illustrates an example structure for organising metadata. A number of tables each comprising a number of fields containing metadata are provided. The tables may be associated with each other by way of common fields within the respective tables, thereby providing a relational structure. Also, the structure may comprise a number of instances of the same table to represent multiple instances of the object that the table may represent. The fields may be formatted in a predetermined manner. The size of the fields may also be predetermined. Example sizes include "Int" which represents 2 bytes, "Long Int" which represents 4 bytes and "Double" which represents 8 bytes. Alternatively, the size of the fields may be defined with reference to the number of characters to be held within the field such as, for example,
8, 10, 16, 32, 128, and 255 characters.
Turning to the structure in more detail, there is provided a Programme Table.
The Programme Table comprises a number of fields including Programme ID (PID),
Title, Working Title, Genre ID, Synopsis, Aspect Ratio, Director ID and Picturestamp.
Associated with the Programme Table is a Genre Table, a Keywords Table, a Script Table, a People Table, a Schedule Table and a plurality of Media Object Tables.
The Genre Table comprises a number of fields including Genre ID, which is associated with the Genre ID field of the Programme Table, and Genre Description.
The Keywords Table comprises a number of fields including Programme ID, which is associated with the Programme ID field of the Programme Table, Keyword ID and Keyword.
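The relational linkage just described, in which common fields such as Programme ID tie the tables together, can be sketched with in-memory tables; all field values here are invented for illustration.

```python
programme_table = [
    {"PID": 1, "Title": "A/V Production", "GenreID": 10},
]
genre_table = [
    {"GenreID": 10, "GenreDescription": "News"},
]
keywords_table = [
    {"PID": 1, "KeywordID": 100, "Keyword": "BMW"},
    {"PID": 1, "KeywordID": 101, "Keyword": "Rover"},
]

def keywords_for(pid):
    """Follow the common Programme ID field from the Programme Table
    into the Keywords Table."""
    return [row["Keyword"] for row in keywords_table if row["PID"] == pid]

assert keywords_for(1) == ["BMW", "Rover"]
```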
The Script Table comprises a number of fields including Script ID, Script
Name, Script Type, Document Format, Path, Creation Date, Original Author, Version, Last Modified, Modified By, PID associated with Programme ID and Notes. The People Table comprises a number of fields including Image.
The People Table is associated with a number of Individual Tables and a number of Group Tables. Each Individual Table comprises a number of fields including Image. Each Group Table comprises a number of fields including Image. Each Individual Table is associated with either a Production Staff Table or a Cast Table.
The Production Staff Table comprises a number of fields including Production Staff ID, Surname, Firstname, Contract ID, Agent, Agency ID, Email, Address, Phone Number, Role ID, Notes, Allergies, DOB, National Insurance Number and Bank ID and Picture Stamp.
The Cast Table comprises a number of fields including Cast ID, Surname, Firstname, Character Name, Contract ID, Agent, Agency ID, Equity Number, E-mail, Address, Phone Number, DOB and Bank ID and Picture Stamp. Associated with the Production Staff Table and Cast Table are a Bank Details Table and an Agency Table.
The Bank Details Table comprises a number of fields including Bank ID, which is associated with the Bank ID field of the Production Staff Table and the Bank ID field of the Cast Table, Sort Code, Account Number and Account Name.
The Agency Table comprises a number of fields including Agency ID, which is associated with the Agency ID field of the Production Staff Table and the Agency ID field of the Cast Table, Name, Address, Phone Number, Web Site and E-mail and a Picture Stamp. Also associated with the Production Staff Table is a Role Table.
The Role Table comprises a number of fields including Role ID, which is associated with the Role ID field of the Production Staff Table, Function and Notes and a Picture Stamp. Each Group Table is associated with an Organisation Table.
The Organisation Table comprises a number of fields including Organisation ID, Name, Type, Address, Contract ID, Contact Name, Contact Phone Number and Web Site and a Picture Stamp.
Each Media Object Table comprises a number of fields including Media Object ID, Name, Description, Picturestamp, PID, Format, schedule ID, script ID and Master ID. Associated with each Media Object Table is the People Table, a Master Table, a Schedule Table, a Storyboard Table, a script table and a number of Shot Tables.
The Master Table comprises a number of fields including Master ID, which is associated with the Master ID field of the Media Object Table, Title, Basic UMID, EDL ID, Tape ID and Duration and a Picture Stamp.
The Schedule Table comprises a number of fields including Schedule ID, Schedule Name, Document Format, Path, Creation Date, Original Author, Start Date, End Date, Version, Last Modified, Modified By and Notes and PID which is associated with the Programme ID.
The contract table contains: a contract ID which is associated with the contract ID of the Production staff, cast, and organisation tables; commencement date, rate, job title, expiry date and details. The Storyboard Table comprises a number of fields including Storyboard ID,
which is associated with the Storyboard ID of the Shot Table, Description, Author, Path and Media ID.
Each Shot Table comprises a number of fields including Shot ID, PID, Media
ID, Title, Location ID, Notes, Picturestamp, script ID, schedule ID, and description.
Associated with each Shot Table is the People Table, the Schedule Table, script table, a Location Table and a number of Take Tables.
The Location Table comprises a number of fields including Location ID, which is associated with the Location ID field of the Shot Table, GPS, Address, Description,
Name, Cost Per Hour, Directions, Contact Name, Contact Address and Contact Phone Number and a Picture Stamp.
Each Take Table comprises a number of fields including Basic UMID, Take
Number, Shot ID, Media ID, Timecode IN, Timecode OUT, Sign Metadata, Tape ID, Camera ID, Head Hours, Videographer, IN Stamp, OUT Stamp, Lens ID, AUTOID ingest ID and Notes. Associated with each Take Table is a Tape Table, a Task Table, a Camera Table, a lens table, an ingest table and a number of Take Annotation Tables.
The Ingest table contains an Ingest ID which is associated with the Ingest Id in the take table and a description.
The Tape Table comprises a number of fields including Tape ID, which is associated with the Tape ID field of the Take Table, PID, Format, Max Duration, First
Usage, Max Erasures, Current Erasure, ETA (estimated time of arrival) and Last Erasure Date and a Picture Stamp.
The Task Table comprises a number of fields including Task ID, PID, Media
ID, Shot ID, which are associated with the Media ID and Shot ID fields respectively of the Take Table, Title, Task Notes, Distribution List and CC List. Associated with the Task Table is a Planned Shot Table.
The Planned Shot Table comprises a number of fields including Planned Shot
ID, PID, Media ID, Shot ID, which are associated with the PID, Media ID and Shot ID respectively of the Task Table, Director, Shot Title, Location, Notes, Description, Videographer, Due Date, Programme Title, Media Title, Aspect Ratio and Format.
The Camera Table comprises a number of fields including Camera ID, which is associated with the Camera ID field of the Take Table, Manufacturer, Model, Format, Serial Number, Head Hours, Lens ID, Notes, Contact Name, Contact Address and Contact Phone Number and a Picture Stamp.
The Lens Table comprises a number of fields including Lens ID, which is associated with the Lens ID field of the Take Table, Manufacturer, Model, Serial Number, Contact Name, Contact Address and Contact Phone Number and a Picture
Stamp.
Each Take Annotation Table comprises a number of fields including Take Annotation ID, Basic UMID, Timecode, Shutter Speed, Iris, Zoom, Gamma, Shot Marker ID, Filter Wheel, Detail and Gain. Associated with each Take Annotation Table is a Shot Marker Table.
The Shot Marker Table comprises a number of fields including Shot Marker ID, which is associated with the Shot Marker ID of the Take Annotation Table, and Description.
A-Box - Figures 15 to 21

Acquisition Unit

Embodiments of the present invention relate to audio and/or video generation apparatus which may be, for example, television cameras, video cameras or camcorders.
An embodiment of the present invention will now be described with reference to figure 15, which provides a schematic block diagram of a video camera which is arranged to communicate with a personal digital assistant (PDA). A PDA is an example of a data processor which may be arranged in operation to generate metadata in accordance with a user's requirements. The term personal digital assistant is known to those acquainted with the technical field of consumer electronics as a portable or hand-held personal organiser or data processor which includes an alphanumeric key pad and a hand-writing interface.
In figure 15 a video camera 101 is shown to comprise a camera body 102 which is arranged to receive light from an image source falling within a field of view of an imaging arrangement 104 which may include one or more imaging lenses (not shown).
The camera also includes a view finder 106 and an operating control unit 108 from which a user can control the recording of signals representative of the images formed within the field of view of the camera. The camera 101 also includes a microphone
110 which may be a plurality of microphones arranged to record sound in stereo. Also shown in figure 15 is a hand-held PDA 112 which has a screen 114 and an alphanumeric key pad 116 which also includes a portion to allow the user to write characters recognised by the PDA. The PDA 112 is arranged to be connected to the video camera 101 via an interface 118. The interface 118 is arranged in accordance with a predetermined standard format such as, for example, an RS232 or the like. The interface 118 may also be effected using infra-red signals, whereby the interface 118 is a wireless communications link. The interface 118 provides a facility for communicating information with the video camera 101. The function and purpose of the PDA 112 will be explained in more detail shortly. However, in general the PDA 112 provides a facility for sending and receiving metadata generated using the PDA 112 and which can be recorded with the audio and video signals detected and captured by the video camera 101. A better understanding of the operation of the video camera 101 in combination with the PDA 112 may be gathered from figure 16, which shows a more detailed representation of the body 102 of the video camera which is shown in figure 15 and in which common parts have the same numerical designations.
In figure 16 the camera body 102 is shown to comprise a tape drive 122 having read/write heads 124 operatively associated with a magnetic recording tape 126. Also shown in figure 16, the camera body includes a metadata generation processor 128 coupled to the tape drive 122 via a connecting channel 130. Also connected to the metadata generation processor 128 are a data store 132, a clock 136 and three sensors 138, 140, 142. The interface unit 118 sends and receives data, as also shown in figure 16, via a wireless channel 119. Correspondingly, two connecting channels 148 and 150, for receiving and transmitting data respectively, connect the interface unit 118 to the metadata generation processor 128. The metadata generation processor is also shown to receive via a connecting channel 151 the audio/video signals generated by the camera. The audio/video signals are also fed to the tape drive 122 to be recorded on to the tape 126.
The video camera 101 shown in figure 15 operates to record visual information falling within the field of view of the lens arrangement 104 onto a recording medium.
The visual information is converted by the camera into video signals. In combination, the visual images are recorded as video signals with accompanying sound which is detected by the microphone 110 and arranged to be recorded as audio signals on the recording medium with the video signals. As shown in figure 16, the recording medium is a magnetic tape 126, the audio and video signals being recorded onto the recording tape 126 by the read/write heads 124. The arrangement by which the video signals and the audio signals are recorded by the read/write heads 124 onto the magnetic tape 126 is not shown in figure 16 and will not be further described as this does not provide any greater illustration of the example embodiment of the present invention. However, once a user has captured visual images and recorded these images on the magnetic tape 126 with the accompanying audio signals, metadata describing the content of the audio/video signals may be input using the PDA 112. As will be explained shortly, this metadata can be information that identifies the audio/video signals in association with a pre-planned event, such as a 'take'. As shown in figure 16, the interface unit 118 provides a facility whereby the metadata added by the user using the PDA 112 may be received within the camera body 102.
Data signals may be received via the wireless channel 119 at the interface unit 118.
The interface unit 118 serves to convert these signals into a form in which they can be processed by the acquisition processor 128 which receives these data signals via the connecting channels 148, 150.
Metadata is generated automatically by the metadata generation processor 128 in association with the audio/video signals which are received via the connecting channel 151. In the example embodiment illustrated in figure 16, the metadata generation processor 128 operates to generate time codes with reference to the clock 136, and to write these time codes on to the tape 126 in a linear recording track provided for this purpose. The time codes are formed by the metadata generation processor 128 from the clock 136. Furthermore, the metadata generation processor 128 forms other metadata automatically, such as a UMID, which identifies uniquely the audio/video signals. The metadata generation processor may operate in combination with the tape drive 124, to write the UMID on to the tape with the audio/video signals.
In an alternative embodiment, the UMID, as well as other metadata may be stored in the data store 132 and communicated separately from the tape 126. In this case, a tape ID is generated by the metadata generation processor 128 and written on to the tape 126, to identify the tape 126 from other tapes.
In order to generate the UMID, and other metadata identifying the contents of the audio/video signals, the metadata generation processor 128 is arranged in operation to receive signals from the sensors 138, 140, 142, as well as the clock 136. The metadata generation processor therefore operates to co-ordinate these signals, which provide metadata such as the aperture setting of the camera lens 104, the shutter speed and a signal received via the control unit 108 to indicate that the visual images captured are a "good shot". These signals and data are generated by the sensors 138, 140, 142 and received at the metadata generation processor 128. The metadata generation processor in the example embodiment is arranged to produce syntactic metadata which provides operating parameters which are used by the camera in generating the video signals. Furthermore, the metadata generation processor 128 monitors the status of the camcorder 101, and in particular whether audio/video signals are being recorded by the tape drive 124. When RECORD START is detected, the IN POINT time code is captured and a UMID is generated in correspondence with the IN POINT time code. Furthermore, in some embodiments an extended UMID is generated, in which case the metadata generation processor is arranged to receive spatial co-ordinates which are representative of the location at which the audio/video signals are acquired. The spatial co-ordinates may be generated by a receiver which operates in accordance with the Global Positioning System (GPS). The receiver may be external to the camera, or may be embodied within the camera body 102.
When RECORD STOP is detected, the OUT POINT time code is captured by the metadata generation processor 128. As explained above, it is possible to generate a "good shot" marker. The "good shot" marker is generated during the recording process, and detected by the metadata generation processor. The "good shot" marker is then either stored on the tape, or within the data store 132, with the corresponding IN POINT and OUT POINT time codes.
As already indicated above, the PDA 112 is used to facilitate identification of the audio/video material generated by the camera. To this end, the PDA is arranged to associate this audio/video material with pre-planned events such as scenes, shots or takes. The camera and PDA shown in figures 15 and 16 form part of an integrated system for planning, acquiring and editing an audio/video production. During a planning phase, the scenes which are required in order to produce an audio/video production are identified. Furthermore, for each scene a number of shots are identified which are required in order to establish the scene. Within each shot, a number of takes may be generated and from these takes a selected number may be used to form the shot for the final edit. The planning information in this form is therefore identified at a planning stage. Data representing or identifying each of the planned scenes and shots is therefore loaded into the PDA 112 along with notes which will assist the director when the audio/video material is captured. An example of such data is shown in the table below.
A/V Production         News story: BMW disposes of Rover
Scene ID: 900015689    Outside Longbridge
Shot 5000000199        Longbridge BMW Sign
Shot 5000000200        Workers Leaving shift
Shot 5000000201        Workers in car park
Scene ID: 900015690    BMW HQ Munich
Shot 5000000202        Press conference
Shot 5000000203        Outside BMW building
Scene ID: 900015691    Interview with minister
Shot 5000000204        Interview

In the first column of the table above, the event which will be captured by the camera and for which audio/video material will be generated is shown. Each of the events, which are defined in a hierarchy, is provided with an identification number. Correspondingly, in the second column notes are provided in order to direct or remind the director of the content of the planned shot or scene. For example, in the first row the audio/video production is identified as being a news story, reporting the disposal of Rover by BMW. In the extract of the planning information shown in the table above, there are three scenes, each of which is provided with a unique identification number. These scenes are "Outside Longbridge", "BMW HQ Munich" and "Interview with minister". Correspondingly, for each scene a number of shots are identified and these are shown below each of the scenes with a unique shot identification number. Notes corresponding to the content of each of these shots are also entered in the second column. So, for example, for the first scene "Outside Longbridge", three shots are identified which are "Longbridge BMW Sign", "Workers Leaving shift" and "Workers in car park". With this information loaded onto the PDA, the director or indeed a single camera man may take the PDA out to the place where the news story is to be shot, so that the planned audio/video material can be gathered. An illustration of the form of the PDA with the graphical user interface displaying this information is shown in figure 17.
As indicated in figure 15, the PDA 112 is arranged to communicate data to the camera 101. To this end the metadata generation processor 128 is arranged to communicate data with the PDA 112 via the interface 118. The interface 118 may be, for example, an infra-red link 119 providing wireless communications in accordance with a known standard. The PDA and the parts of the camera associated with generating metadata which are shown in figure 16 are shown in more detail in figure 18.
In figure 18 the parts of the camera which are associated with generating metadata and communicating with the PDA 112 are shown in a separate acquisition unit 152.
However, it will be appreciated that the acquisition unit 152 could also be embodied within the camera 102. The acquisition unit 152 comprises the metadata generation processor 128 and the data store 132. The acquisition unit 152 also includes the clock 136 and the sensors 138, 140, 142, although for clarity these are not shown in figure 18. Alternatively, some or all of these features which are shown in figure 16 will be embodied within the camera 102 and the signals which are required to define the metadata, such as the time codes and the audio/video signals themselves, may be communicated via a communications link 153 which is coupled to an interface port 154. The metadata generation processor 128 is therefore provided with access to the time codes and the audio/video material as well as other parameters used in generating the audio/video material. Signals representing the time codes and parameters as well as the audio/video signals are received from the interface port 154 via the interface channel 156. The acquisition unit 152 is also provided with a screen (not shown) which is driven by a screen driver 158. Also shown in figure 18, the acquisition unit is provided with a communications processor 160 which is coupled to the metadata generation processor 128 via a connecting channel 162. Communications are effected by the communications processor 160 via a radio frequency communications channel using the antennae 164. A pictorial representation of the acquisition unit 152 is shown in figure 19.
The PDA 112 is also shown in figure 18. The PDA 112 is correspondingly provided with an infra-red communications port 165 for communicating data to and from the acquisition unit 152 via an infra-red link 119. A data processor 166 within the PDA 112 is arranged to communicate data to and from the infra-red port 165 via a connecting channel 166. The PDA 112 is also provided with a data store 167 and a screen driver 168 which are connected to the data processor 166.
The pictorial representation of the PDA 112 shown in figure 17 and the acquisition unit shown in figure 19 provide an illustration of an example embodiment of the present invention. A schematic diagram illustrating the arrangement and connection of the PDA 112 and the acquisition unit 152 is shown in figure 20. In the example shown in figure 20 the acquisition unit 152 is mounted on the back of a camera 101 and coupled to the camera via a six pin remote connector and to a connecting channel conveying the external signal representative of the time code recorded onto the recording tape. Thus, the six pin remote connector and the time code indicated as arrow lines form the communications channel 153 shown in figure 18. The interface port 154 is shown in figure 20 to comprise an RM-P9/LTC to RS422 converter 154'. RM-P9 is a camera remote control protocol, whereas LTC is Linear Time Code in the form of an analogue signal. This is arranged to communicate with an RS422 to RS232 converter 154'' via a connecting channel which forms part of the interface port 154. The converter 154'' then communicates with the metadata generation processor 128 via the connecting channel 156 which operates in accordance with the RS232 standard.
Returning to figure 18, the PDA 112 which has been loaded with the pre-planned production information is arranged to communicate the current scene and shot for which audio/video material is to be generated by communicating the next shot ID number via the infra-red link 119. The pre-planned information may also have been communicated to the acquisition unit 152 and stored in the data store 132 via a separate link or via the infra-red communication link 119. However in effect the acquisition unit 152 is directed to generate metadata in association with the scene or shot ID number which is currently being taken. After receiving the information of the current shot the camera 102 is then operated to make a "take" of the shot. The audio/video material of the take is recorded onto the recording tape 126 with corresponding time codes. These time codes are received along with the audio/video material via the interface port 154 at the metadata generation processor 128. The metadata generation processor 128, having been informed of the current pre-planned shot now being taken, logs the time codes for each take of the shot. The metadata generation processor therefore logs the IN and OUT time codes of each take and stores these in the data store 132.
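By way of illustration only, this logging behaviour may be sketched as follows. The class and method names (TakeLogger, notify_shot, log_take) are hypothetical and do not form part of the described apparatus; the sketch merely shows a shot-directed log of IN and OUT time codes per take.

```python
# Illustrative sketch (not the described apparatus): a minimal logger that
# records IN/OUT time codes for each take of the currently notified shot.
from dataclasses import dataclass, field
from typing import Dict, List, Tuple


@dataclass
class TakeLogger:
    """Models the metadata generation processor's take log (hypothetical names)."""
    current_shot_id: str = ""
    # shot ID -> list of (IN time code, OUT time code) pairs, one per take
    takes: Dict[str, List[Tuple[str, str]]] = field(default_factory=dict)

    def notify_shot(self, shot_id: str) -> None:
        # Corresponds to the PDA communicating the next shot ID number
        self.current_shot_id = shot_id
        self.takes.setdefault(shot_id, [])

    def log_take(self, in_tc: str, out_tc: str) -> None:
        # Corresponds to logging the IN and OUT time codes of one take
        self.takes[self.current_shot_id].append((in_tc, out_tc))


logger = TakeLogger()
logger.notify_shot("5000000199")
logger.log_take("00:03:45:29", "00:04:21:05")
print(logger.takes)
```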
The information generated and logged by the metadata generation processor 128 is shown in the table below. In the first column the scene and shot are identified with the corresponding ID numbers, and for each shot several takes are made by the camera operator which are indicated in a hierarchical fashion. Thus, having received information from the PDA 112 of the current shot, each take made by the camera operator is logged by the metadata generation processor 128 and the IN and OUT points for this take are shown in the second and third columns and stored in the data store 132. This information may also be displayed on the screen of the acquisition unit 152 as shown in figure 19. Furthermore, the metadata generation processor 128 as already explained generates the UMID for each take for the audio/video material generated during the take. The UMID for each take forms the fourth column of the table. Additionally, in some embodiments, to provide a unique identification of the tape onto which the material is recorded, a tape identification is generated and associated with the metadata. The tape identification may be written on to the tape, or stored on a random access memory chip which is embodied within the video tape cassette body. This random access memory chip is known as a TELEFILE (RTM) system which provides a facility for reading the tape ID number remotely. The tape ID is written onto the magnetic tape 126 to uniquely identify this tape. In preferred embodiments the TELEFILE (RTM) system is provided with a unique number which is manufactured as part of the memory and so can be used as the tape ID number. In other embodiments the TELEFILE (RTM) system provides automatically the IN/OUT time codes of the recorded audio/video material items.
In one embodiment the information shown in the table below is arranged to be recorded onto the magnetic tape in a separate recording channel. However, in other embodiments the metadata shown in the table is communicated separately from the tape 126 using either the communications processor 160 or the infra-red link 119. The metadata may be received by the PDA 112 for analysis and may be further communicated by the PDA.
Scene ID: 900015689    Tape ID: 00001

Shot 5000000199
  Take 1   IN: 00:03:45:29   OUT: 00:04:21:05   UMID: 060C23B340
  Take 2   IN: 00:04:21:20   OUT: 00:04:28:15   UMID: 060C23B340
  Take 3   IN: 00:04:28:20   OUT: 00:05:44:05   UMID: 060C23B340

Shot 5000000200
  Take 1   IN: 00:05:44:10   OUT: 00:08:22:05   UMID: 060C23B340
  Take 2   IN: 00:08:22:10   OUT: 00:08:23:05   UMID: 060C23B340

The communications processor 160 may be arranged in operation to transmit the metadata generated by the metadata generation processor 128 via a wireless communications link. The metadata may be received via the wireless communications link by a remotely located studio which can then acquire the metadata and process this metadata ahead of the audio/video material recorded onto the magnetic tape 126. This provides an advantage in improving the rate at which the audio/video production may be generated during the post production phase in which the material is edited.
A further advantageous feature provided by embodiments of the present invention is an arrangement in which a picture stamp is generated at certain temporal positions within the recorded audio/video signals. A picture stamp is known to those skilled in the art as being a digital representation of an image and in the present example embodiment is generated from the moving video material generated by the camera. The picture stamp may be of lower quality in order to reduce an amount of data required to represent the image from the video signals. Therefore the picture stamp may be compression encoded which may result in a reduction in quality. However a picture stamp provides a visual indication of the content of the audio/video material and therefore is a valuable item of metadata. Thus, the picture stamp may for example be generated at the IN and OUT time codes of a particular take. Thus, the picture stamps may be associated with the metadata generated by the metadata generation processor 128 and stored in the data store 132. The picture stamps are therefore associated with items of metadata such as, for example, the time codes which identify the place on the tape where the image represented by the picture stamp is recorded. The picture stamps may be generated with the "Good Shot" markers. The picture stamps are generated by the metadata generation processor 128 from the audio/video signals received via the communications link 153. The metadata generation processor therefore operates to effect a data sampling and compression encoding process in order to produce the picture stamps. Once the picture stamps have been generated they can be used for several purposes. They may be stored in a data file and communicated separately from the tape 126, or they may be stored on the tape 126 in compressed form in a separate recording channel. Alternatively in preferred embodiments picture stamps may be communicated using the communications processor 160 to the remotely located studio where a producer may analyse the picture stamps. This provides the producer with an indication as to whether the audio/video material generated by the camera operator is in accordance with what is required.
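As an illustration of the sub-sampling and compression encoding step just described, the following sketch produces a small JPEG-encoded stamp from a single frame using the Pillow imaging library; the stamp dimensions and quality figure are illustrative assumptions, not values taken from the embodiment.

```python
# Illustrative sketch: producing a low-quality "picture stamp" from one video
# frame by spatial sub-sampling and lossy compression (here JPEG via Pillow;
# the embodiment does not prescribe a particular encoding scheme).
import io
from PIL import Image


def make_picture_stamp(frame: Image.Image, size=(180, 144), quality=50) -> bytes:
    """Reduce a frame to a small, compressed representation of its content."""
    stamp = frame.resize(size)                       # spatial sub-sampling
    buf = io.BytesIO()
    stamp.save(buf, format="JPEG", quality=quality)  # lossy compression
    return buf.getvalue()


# A synthetic 720x576 frame stands in here for a decoded video frame.
frame = Image.new("RGB", (720, 576), color=(32, 96, 160))
stamp_bytes = make_picture_stamp(frame)
print(len(stamp_bytes), "bytes")  # far smaller than the raw frame
```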
In a yet further embodiment, the picture stamps are communicated to the PDA 112 and displayed on the PDA screen. This may be effected via the infra-red port 119 or the PDA may be provided with a further wireless link which can communicate with the communications processor 160. In this way a director having the hand held PDA 112 is provided with an indication of the current audio/video content generated by the camera. This provides an immediate indication of the artistic and aesthetic quality of the audio/video material currently being generated. As already explained the picture stamps are compression encoded so that they may be rapidly communicated to the PDA.
A further advantage of the acquisition unit 152 shown in figure 19 is that the editing process is made more efficient by providing the editor at a remotely located studio with an indication of the content of the audio/video material in advance of receiving that material. This is because the picture stamps are communicated with the metadata via a wireless link so that the editor is provided with an indication of the content of the audio/video material in advance of receiving the audio/video material itself. In this way the bandwidth of the audio/video material can remain high with a correspondingly high quality whilst the metadata and picture stamps are at a relatively low bandwidth providing relatively low quality information. As a result of the low bandwidth the metadata and picture stamps may be communicated via a wireless link on a considerably lower bandwidth channel. This facilitates rapid communication of the metadata describing content of the audio/video material.
The picture stamps generated by the metadata generation processor 128 can be generated at any point during the recorded audio/video material. In one embodiment the picture stamps are generated at the IN and OUT points of each take. However in other embodiments of the present invention an activity processor 170 is arranged to detect relative activity within the video material. This is effected by performing a process in which a histogram of the colour components of the images represented by the video signal is compiled and the rate of change of the colour components determined, changes in these colour components being used to indicate activity within the image. Alternatively or in addition, motion vectors within the image are used to indicate activity. The activity processor 170 then operates to generate a signal indicative of the relative activity within the video material. The metadata generation processor 128 then operates in response to the activity signal to generate picture stamps such that more picture stamps are generated for greater activity within the images represented by the video signals.
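A minimal sketch of this activity measure, assuming a coarse joint RGB histogram and the sum of absolute bin differences as the rate-of-change measure (the embodiment does not prescribe a particular histogram resolution or distance), might read:

```python
# Illustrative sketch: compile a colour histogram per image and use its rate
# of change between successive images as an activity signal.
from typing import List, Sequence


def colour_histogram(pixels: Sequence[tuple], bins: int = 8) -> List[int]:
    """Coarse joint histogram of (R, G, B) pixels with values in 0..255."""
    hist = [0] * (bins ** 3)
    step = 256 // bins
    for r, g, b in pixels:
        hist[(r // step) * bins * bins + (g // step) * bins + (b // step)] += 1
    return hist


def activity(prev_hist: List[int], cur_hist: List[int]) -> float:
    """Rate of change of the histogram: sum of absolute bin differences."""
    return sum(abs(a - b) for a, b in zip(prev_hist, cur_hist))


frame1 = [(10, 20, 30)] * 100
frame2 = [(200, 20, 30)] * 100            # a large colour change between frames
h1, h2 = colour_histogram(frame1), colour_histogram(frame2)
print(activity(h1, h2))                   # a high value indicates activity
```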
In an alternative embodiment of the present invention the activity processor 170 is arranged to receive the audio signals via the connecting channel 172 and to recognise speech within the audio signals. The activity processor 170 then generates content data representative of the content of this speech as text. The text data is then communicated to the metadata generation processor 128 and may be stored in the data store 132 or communicated with other metadata via the communications processor 160 in a similar way to that already explained for the picture stamps.
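The embodiment does not specify a recognition algorithm, so the following sketch only illustrates the shape of the processing: a simple short-term energy detector finds periods of speech activity, and a caller-supplied recogniser (a stand-in here) turns each segment into text.

```python
# Illustrative sketch only: detect periods of speech activity by short-term
# energy, then hand those segments to a speech recogniser supplied by the
# caller, since no particular recognition algorithm is prescribed.
from typing import Callable, List, Sequence, Tuple


def speech_segments(samples: Sequence[float], frame: int = 1600,
                    threshold: float = 0.01) -> List[Tuple[int, int]]:
    """Return (start, end) sample indices of frames whose energy exceeds threshold."""
    segments = []
    for start in range(0, len(samples) - frame, frame):
        energy = sum(s * s for s in samples[start:start + frame]) / frame
        if energy > threshold:
            segments.append((start, start + frame))
    return segments


def transcribe_segments(samples: Sequence[float],
                        recognise: Callable[[Sequence[float]], str]) -> List[str]:
    """Generate text content data for each detected speech segment."""
    return [recognise(samples[a:b]) for a, b in speech_segments(samples)]


# Example with a stand-in recogniser that merely labels the segment.
audio = [0.0] * 3200 + [0.5, -0.5] * 1600 + [0.0] * 3200
print(transcribe_segments(audio, lambda seg: f"<speech: {len(seg)} samples>"))
```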
Figure 21 provides a schematic representation of a post production process in which the audio/video material is edited to produce an audio/video program. As shown in figure 21 the metadata, which may include picture stamps and/or the speech content information is communicated from the acquisition unit 152 via a separate route represented by a broken line 174, to a metadata database 176. The route 174 may be representative of a wireless communications link formed by for example UMTS, GSM or the like.
The database 176 stores metadata to be associated with the audio/video material. The audio/video material in high quality form is recorded onto the tape 126. Thus the tape 126 is transported back to the editing suite where it is ingested by an ingestion processor 178. The tape identification (tape ID) recorded onto the tape 126 or other metadata providing an indication of the content of the audio/video material is used to associate the metadata stored in the data store 176 with the audio/video material on the tape as indicated by the broken line 180.
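A minimal sketch of this association step, with an in-memory dictionary standing in for the metadata database 176 and illustrative field names, is:

```python
# Illustrative sketch (hypothetical names): associating database metadata with
# ingested material via the tape ID read from the tape.
metadata_database = {
    "00001": {"scene": "900015689", "programme": "Grandstand",
              "takes": [("00:03:45:29", "00:04:21:05")]},
}


def ingest(tape_id_on_tape: str) -> dict:
    """Look up the metadata previously delivered over the separate route 174."""
    record = metadata_database.get(tape_id_on_tape)
    if record is None:
        raise KeyError(f"no metadata registered for tape {tape_id_on_tape}")
    return record


print(ingest("00001")["programme"])
```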
As will be appreciated, although the example embodiment of the present invention uses a video tape as the recording medium for storing the audio/video signals, it will be understood that alternative recording media such as magnetic disks and random access memories may also be used.
B-Box - Figures 22 to 27

Figure 22 provides a schematic representation of a post production process in which the audio/video material is edited to produce an audio/video program. As shown in figure 22 the metadata, which may include picture stamps and/or the speech content information, is communicated from the acquisition unit 152 via a separate route represented by a broken line 174, to a metadata database 176. The route 174 may be representative of a wireless communications link formed by, for example, UMTS, GSM or the like.
The database 176 stores metadata to be associated with the audio/video material. The audio/video material in high quality form is recorded onto the tape 126.
Thus the tape 126 is transported back to the editing suite where it is ingested by an ingestion processor 178. The tape identification (tape ID) recorded onto the tape 126 or other metadata providing an indication of the content of the audio/video material is used to associate the metadata stored in the data store 176 with the audio/video material on the tape as indicated by the broken line 180.
The ingestion processor 178 is also shown in figure 22 to be connected to a network formed from a communications channel represented by a connecting line 182. The connecting line 182 represents a communications channel for communicating data to items of equipment which form an inter-connected network. To this end, these items of equipment are provided with a network card which may operate in accordance with a known access technique such as Ethernet, RS422 and the like. Furthermore, as will be explained shortly, the communications network 182 may also provide data communications in accordance with the Serial Digital Interface (SDI) or the Serial Digital Transport Interface (SDTI).
Also shown connected to the communications network 182 is the metadata database 176, and an audio/video server 190, into which the audio/video material is ingested. Furthermore, editing terminals 184, 186 are also connected to the communications channel 182 along with a digital multi-effects processor 188.
The communications network 182 provides access to the audio/video material present on tapes, discs or other recording media which are loaded into the ingestion processor 178.
The metadata database 176 is arranged to receive metadata via the route 174 describing the content of the audio/video material recorded on to the recording media loaded into the ingestion processor 178.
As will be appreciated, although in the example embodiment a video tape has been used as the recording medium for storing the audio/video signals, it will be understood that alternative recording media such as magnetic disks and random access memories may also be used, and that video tape is provided as an illustrative example only.
The editing terminals 184, 186 and the digital multi-effects processor 188 are provided with access to the audio/video material recorded on to the tapes loaded into the ingestion processor 178 and the metadata describing this audio/video material stored in the metadata database 176 via the communications network 182. The operation of the ingestion processor 178 in combination with the metadata database 176 will now be described in more detail.
Figure 23 provides an example representation of the ingestion processor 178.
In Figure 23 the ingestion processor 178 is shown to have a jog shuttle control 200 for navigating through the audio/video material recorded on the tapes loaded into video tape recorders/reproducers forming part of the ingestion processor 178. The ingestion processor 178 also includes a display screen 202 which is arranged to display picture stamps which describe selected parts of the audio/video material. The display screen 202 also acts as a touch screen providing a user with the facility for selecting the audio/video material by touch. The ingestion processor 178 is also arranged to display all types of metadata on the screen 202 which includes script, camera type, lens types and UMIDs.
As shown in Figure 24, the ingestion processor 178 may include a plurality of video tape recorders/reproducers into which the video tapes onto which the audio/video material is recorded may be loaded in parallel. In the example shown in figure 24, the video tape recorders 204 are connected to the ingestion processor 178 via an RS422 link and an SDI IN/OUT link. The ingestion processor 178 therefore represents a data processor which can access any of the video tape recorders 204 in order to reproduce the audio/video material from the video tapes loaded into the video tape recorders. Furthermore, the ingestion processor 178 is provided with a network card in order to access the communications network 182. As will be appreciated from Figure 24, however, the communications channel 182 is comprised of a relatively low bandwidth data communications channel 182' and a high bandwidth SDI channel 182'' for use in streaming video data. Correspondingly, therefore, the ingestion processor 178 is connected to the video tape recorders 204 via an RS422 link in order to communicate requests for corresponding items of audio/video material. Having requested these items of audio/video material, the audio/video material is communicated back to the ingestion processor 178 via an SDI communication link 206 for distribution via the SDI network. The requests may for example include the UMID which uniquely identifies the audio/video material item(s).
The operation of the ingestion processor in association with the metadata database 176 will now be explained with reference to figure 25. In figure 25 the metadata database 176 is shown to include a number of items of metadata 210 associated with a particular tape ID 212. As shown by the broken line headed arrow 214, the tape ID 212 identifies a particular video tape 216, on which the audio/video material corresponding to the metadata 210 is recorded. In the example embodiment shown in Figure 25 the tape ID 212 is written onto the video tape 216 in the linear time code area 220. However it will be appreciated that in other embodiments the tape ID could be written in other places such as the vertical blanking portion. The video tape 216 is loaded into one of the video tape recorders 204 forming part of the ingestion processor 178.
In operation, one of the editing terminals 184 is arranged to access the metadata database 176 via the low bandwidth communications channel 182'. The editing terminal 184 is therefore provided with access to the metadata 210 describing the content of the audio/video material recorded onto the tape 216. The metadata 210 may include items such as the copyright owner "BSkyB", the resolution of the picture and the format in which the video material is encoded, the name of the program, which is in this case "Grandstand", and information such as the date, time and audience. Metadata may further include a note of the content of the audio/video material.
Each of the items of audio/video material is associated with a UMID, which identifies the audio/video material. As such, the editing terminal 184 can be used to identify and select from the metadata 210 the items of audio/video material which are required in order to produce a program. This material may be identified by the UMID associated with the material. In order to access the audio/video material to produce the program, the editing terminal 184 communicates a request for this material via the low bandwidth communications network 182. The request includes the UMID or the UMIDs identifying the audio/video material item(s). In response to the request for audio/video material received from the editing terminal 184, the ingestion processor 178 is arranged to reproduce selectively these audio/video material items identified by the UMID or UMIDs from the video tape recorder into which the video cassette 216 is loaded. This audio/video material is then streamed via the SDI network 182'' back to the editing terminal 184 to be incorporated into the audio/video production being edited. The streamed audio/video material is ingested into the audio/video server 190 from where the audio/video can be stored and reproduced.
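The request and streaming split described above may be sketched as follows; all names are illustrative assumptions, and no actual RS422 control or SDI streaming is modelled, only the resolution of UMIDs to tape locations:

```python
# Illustrative sketch: requests carry UMIDs on the low-bandwidth channel;
# material returns on the high-bandwidth channel. Hypothetical names only.
class IngestionProcessor:
    def __init__(self, tape_index):
        # UMID -> (tape ID, IN time code, OUT time code)
        self.tape_index = tape_index

    def request(self, umids):
        """Resolve each UMID to a tape location and 'stream' it back."""
        for umid in umids:
            tape_id, tc_in, tc_out = self.tape_index[umid]
            # A real system would cue the VTR over RS422 and stream over SDI.
            yield f"streaming {umid} from tape {tape_id} [{tc_in}..{tc_out}]"


proc = IngestionProcessor({"060C23B340": ("00001", "00:03:45:29", "00:04:21:05")})
for chunk in proc.request(["060C23B340"]):
    print(chunk)
```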
Figure 26 provides an alternative arrangement in which the metadata 210 is recorded onto a suitable recording medium with the audio/video material. For example the metadata 210 could be recorded in one of the audio tracks of the video tape 218'. Alternatively, the recording medium may be an optical disc or magnetic disc allowing random access and providing a greater capacity for storing data. In this case the metadata 210 may be stored with the audio/video material.
In a yet further arrangement, some or all of the metadata may be recorded onto the tape 216. This may be recorded, for example, into the linear recording track of the tape 218. Some metadata related to the metadata recorded onto the tape may be conveyed separately and stored in the database 176. A further step is required in order to ingest the metadata and to this end the ingestion processor 178 is arranged to read the metadata from the recording medium 218' and convey the metadata via the communications network 182' to the metadata database 176. Therefore, it will be appreciated that the metadata associated with the audio/video material to be ingested by the ingestion processor 178 may be ingested into the database 176 via a separate medium or via the recording medium on which the audio/video material is also recorded.
The metadata associated with the audio/video material may also include picture stamps which represent low quality representations of the images at various points throughout the video material. These may be presented at the touch screen 202 on the ingestion processor 178. Furthermore these picture stamps may be conveyed via the network 182' to the editing terminals 184, 186 or the effects processor 188 to provide an indication of the content of the audio/video material. The editor is therefore provided with a pictorial representation for the audio/video material and from this a selection of the audio/video material items may be made. Furthermore, the picture stamp may be stored in the database 176 as part of the metadata 210. The editor may therefore retrieve a selected item for the corresponding picture stamp using the UMID which is associated with the picture stamp.
In other embodiments of the invention, the recording medium may not have sufficient capacity to include picture stamps recorded with the audio/video material.
This is likely to be so if the recording medium is a video tape 216. It is particularly appropriate in this case, although not exclusively so, to generate picture stamps before or during ingestion of the audio/video material.
Returning to figure 22, in other embodiments, the ingestion processor 178 may include a pre-processing unit. The pre-processing unit embodied within the ingestion processor 178 is arranged to receive the audio/video material recorded onto the recording medium which, in the present example, is a video tape 126. To this end, the pre-processing unit may be provided with a separate video recorder/reproducer or may be combined with the video tape recorder/reproducer which forms part of the ingestion processor 178. The pre-processing unit generates picture stamps associated with the audio/video material. As explained above, the picture stamps are used to provide a pictorial representation of the content of the audio/video material items. However in accordance with a further embodiment of the present invention the pre-processing unit operates to process the audio/video material and generate an activity indicator representative of relative activity within the content of the audio/video material. This may be achieved for example using a processor which operates to generate an activity signal in accordance with a histogram of colour components within the images represented by the video signal and to generate the activity signals in accordance with a rate of change of the colour histogram components. The pre-processing unit then operates to generate a picture stamp at points throughout the video material where there are periods of activity indicated by the activity signal. This is represented in Figure 27. In Figure 27A picture stamps 224 are shown to be generated along a line 226 which represents time within the video signal. As shown in figure 27A the picture stamps 224 are generated at times along the time line 226 where the activity signal, represented as arrows 228, indicates events of activity. This might be for example someone walking into and out of the field of view of the camera where there is a great deal of motion represented by the video signal. To this end, the activity signal may also be generated using motion vectors which may be, for example, the motion vectors generated in accordance with the MPEG standard.
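Assuming an activity signal has already been computed (for example by the colour-histogram sketch given earlier), the placement of picture stamps at events of activity may be sketched as:

```python
# Illustrative sketch: place picture stamps where the activity signal exceeds
# a threshold, so that more stamps are generated during periods of greater
# activity (compare Figure 27A). Threshold and names are assumptions.
from typing import List, Sequence


def stamp_points(activity_signal: Sequence[float], threshold: float) -> List[int]:
    """Return frame indices at which a picture stamp should be generated."""
    return [i for i, a in enumerate(activity_signal) if a >= threshold]


# e.g. someone walking into and out of shot produces two bursts of activity
activity_signal = [0.0, 0.1, 0.9, 0.8, 0.1, 0.0, 0.7, 0.2]
print(stamp_points(activity_signal, threshold=0.5))   # -> [2, 3, 6]
```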
In other embodiments of the invention, the pre-processor may generate textual information corresponding to speech present within the audio signal forming part of the audio/video material items stored on the tape 126. The textual information may be generated instead of the picture stamps or in addition to the picture stamps. In this case, text may be generated for example for the first words of sentences and/or the first activity of a speaker. This is detected from the audio signals present on the tape recording or forming part of the audio/video material. The start points where text is to be generated are represented along the time line 226 as arrows 230. Alternatively the text could be generated at the end of sentences or indeed at other points of interest within the speech.
At the detected start of the speech, a speech processor operates to generate a textual representation of the content of the speech. To this end, the time line 226 shown in Figure 27B is shown to include the text 232 corresponding to the content of the speech at the start of activity periods of speech.
The picture stamps and textual representation of the speech activity generated by the pre-processor are communicated via the communications channel 182 to the metadata database 176 and stored. The picture stamps and text are stored in association with the UMID identifying the corresponding items of audio/video material from which the picture stamps 224 and the textual information 232 were generated. This therefore provides a facility to an editor operating one of the editing terminals 184, 186 to analyse the content of the audio/video material before it is ingested using the ingestion processor 178. As such the video tape 126 is loaded into the ingestion processor 178 and thereafter the audio/video material can be accessed via the network communications channel 182. The editor is therefore provided with an indication, very rapidly, of the content of the audio/video material and so may ingest only those parts of the material which are relevant to the particular material items required by the editor. This has a particular advantage in improving the efficiency with which the editor may produce an audio/video production.
In an alternative embodiment, the pre-processor may be a separate unit and may be provided with a screen on which the picture stamps and/or text information are displayed, and a means such as, for example, a touch screen, to provide a facility for selecting the audio/video material items to be ingested.
In a further embodiment of the invention, the ingestion processor 178 generates metadata items such as UMIDs whilst the audio/video material is being ingested. This may be required because the acquisition unit 152 in the camera is not arranged to generate UMIDs, but does generate a Unique Material Reference Number (MURN). The MURN is generated for each material item, such as a take. The MURN is arranged to be considerably shorter than a UMID and can therefore be accommodated within the linear time code of a video tape, which is more difficult for UMIDs because these are larger. Alternatively the MURN may be written into a TELEFILE (RTM) label of the tape. The MURN provides a unique identification of the audio/video material items present on the tape. The MURNs may be communicated separately to the database 176 as indicated by the line 174.
At the ingestion processor 178, the MURNs for the material items are recovered from the tape or the TELEFILE label. For each MURN, the ingestion processor 178 operates to generate a UMID corresponding to the MURN. The UMIDs are then communicated with the MURNs to the database 176, and are ingested into the database in association with the MURNs, which may be already present within the database 176.
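A sketch of this MURN-to-UMID step is given below; the UMID generator is shown as an opaque stand-in (here a UUID), since the actual SMPTE UMID layout is defined in the section UMIDs and is not reproduced in this sketch:

```python
# Illustrative sketch: for each MURN recovered from the tape or TELEFILE
# label, generate a corresponding UMID and associate the pair for the
# database. The UMID generator is an assumption standing in for a real one.
import uuid


def generate_umid() -> str:
    # Stand-in for a real SMPTE UMID generator (assumption for illustration).
    return uuid.uuid4().hex.upper()


def ingest_murns(murns):
    """Map each MURN recovered from the tape to a freshly generated UMID."""
    return {murn: generate_umid() for murn in murns}


database = ingest_murns(["MURN-0001", "MURN-0002"])
print(database)
```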
Tape IDs in time code - Figures 28 to 30

Tape IDs.
Referring to Figure 28, a tape format is shown schematically. Video and audio information is recorded in helical tracks of which a set of, e.g. 10 or 12, tracks records one field of video. The helical tracks include vertical interval time codes (VITC). The time codes may be duplicated in a linear time code track LTC, but the contents of the VITC and LTC may be different. The tape may comprise at least one other linear track (not shown). In this illustrative description it is assumed that all video, audio and other information is recorded digitally. However, the video and audio may be recorded as analogue information. The video and audio information may be compressed according to the MPEG 2 standard for example.
The time codes are recorded once per video field. As schematically shown in Figure 29, a known time code has 80 bits of which 16 are reserved for synchronisation information, 32 for time code bits and 32 for user defined bits, herein referred to as "user bits". The user bits are interleaved with the other bits in a typical time code; however the invention is not limited to that.
Metadata
Metadata is described in the section Metadata.
Tape IDs and UMIDs
UMIDs are described in the section UMIDs. They are material identifiers which are universally unique. In embodiments of the present invention they are used to bind material, i.e. video and/or audio recorded on the tape, to metadata which is stored in, for example, a database 464 as shown in figure 30.
Embodiments of the present invention record on the tape Tape Identifiers (Tape IDs) having most preferably 64 bits. Tape IDs may have other numbers of bits, for example in the range 32 to 64 bits. Unlike a UMID which is universally unique, a Tape ID may not be universally unique but is unique to at least an organisation such as a production company. The Tape ID is recorded in the user bits of the linear time code. If it has 64 bits it occupies two time codes. It thus refers to one frame of two video fields. In preferred embodiments the same tape ID is repeated every frame.
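Since a 64-bit Tape ID occupies the 32 user bits of two successive time codes, i.e. one frame of two video fields, the packing can be sketched as two 32-bit words per frame:

```python
# Illustrative sketch: a 64-bit Tape ID split across the 32 user bits of two
# consecutive time codes, and recovered again on playback.
def split_tape_id(tape_id: int):
    """Return the two 32-bit user-bit words carrying a 64-bit tape ID."""
    assert 0 <= tape_id < 2 ** 64
    return (tape_id >> 32) & 0xFFFFFFFF, tape_id & 0xFFFFFFFF


def join_tape_id(high: int, low: int) -> int:
    """Recover the tape ID from the user bits of the two time codes."""
    return (high << 32) | low


high, low = split_tape_id(0x0123456789ABCDEF)
assert join_tape_id(high, low) == 0x0123456789ABCDEF
print(hex(high), hex(low))
```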
Preferably, the tape is "prestriped" before use to record linear time codes for the fields.
The format of an illustrative Tape ID is any 4 byte hex number as set by the user-bit set-up controls on the VTR or camcorder.
Linking to a UMID
The Tape ID may not be unique. In embodiments of the present invention, a Tape ID is linked to a UMID which uniquely identifies the material recorded on the tape. The UMID is used to link the material on the tape to other metadata relating to the material. If only one piece of material is recorded on a tape, then only the Tape ID needs to be linked to the UMID which uniquely identifies that one piece of material.
However, in practice two or more pieces of material would be recorded. For example, the tape may contain two or more takes of the same shot: each take is one piece of material and has its own UMID. Thus to link each UMID to each piece of material, the Tape ID plus the IN (start) and OUT (end) time codes of the piece of material are used.
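A sketch of this linkage, using the combination (Tape ID, IN, OUT) as the key that resolves to a UMID (names and values are illustrative only):

```python
# Illustrative sketch: a Tape ID alone cannot distinguish several pieces of
# material on one tape, so the link key is (Tape ID, IN time code, OUT time
# code), each key resolving to the UMID of one piece of material.
links = {}


def link(tape_id: str, tc_in: str, tc_out: str, umid: str) -> None:
    links[(tape_id, tc_in, tc_out)] = umid


link("00001", "00:03:45:29", "00:04:21:05", "UMID-TAKE-1")
link("00001", "00:04:21:20", "00:04:28:15", "UMID-TAKE-2")  # same tape, new take
print(links[("00001", "00:03:45:29", "00:04:21:05")])
```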
Linking to a database
It is desirable to provide more detailed metadata relating to the material recorded on the tape. Examples of such metadata are described in the section Camera Metadata. Thus metadata is stored in a database, the UMID linking the metadata to the material.
Illustrative System
Referring to Figure 30, a digital video source, e.g. a camcorder 460, has a multiplexer 463 which in known manner inserts the Tape ID and the IN and OUT time codes onto a tape. The IN and OUT time codes are generated each time a record start and stop button 471 is operated. The tape ID is generated as follows:
The camcorder records a contiguous set of time codes for all fields; the tape ID is fixed, recorded in the time code user bits and is preset by the user bit controls. The camera also outputs audio A, video V on respective outputs.
The camera has a signal processor termed herein the A-BOX which stores time code snapshots at the beginning and end of a recording, i.e. the IN and OUT points. The user bits form part of the time code and thus the tape ID is monitored by monitoring the user bits, whereby the tape IDs are stored with the IN and OUT points. The A-Box derives the user bits of the time codes from the tape and transfers them to a data processor which in this example is a PDA (Personal Digital Assistant) 469. The A-Box is described in more detail in the section A-BOX. It may derive other metadata from the camera and/or material recorded on the tape and transfer it to the PDA 469.
The PDA 469 links the Tape ID and the IN and OUT time codes of the pieces of material recorded on the tape to one or more UMIDs. The PDA has a data entry device, for example a keyboard, to enter data and may have, or be connected to, a GPS device 470 for producing the spatial co-ordinate data of an extended UMID. The PDA generates the UMID and associates it with the Tape ID and the IN and OUT codes. The PDA 469 transfers the UMIDs, Tape IDs, IN and OUT points, and any other metadata generated at the camera and/or PDA, to a database 464.
The database 464 in this example comprises a database program run on a standard personal computer (PC) or a lap-top computer having a keyboard 467 for data entry, a display 465 and a systems unit 466.
The database 464 stores more extensive and detailed metadata, including the UMID(s), the tape IDs, the IN and OUT points and other metadata generated at the Camera 460, the PDA 469 and/or the data entry device 467. The Tape IDs and the IN and OUT points on the tape and the UMID(s) in the database allow clear and unique linking of the material on the tape, and of the tape on which the material is recorded, to the data in the database.
Metadata, which is additional to the UMID, may be entered into the PDA 469 by the operator using the keyboard 468. A computer 461 in the PDA generates the UMID (whether basic or extended or having the data-reduced structure as shown in Figure 13 of the section UMIDs) and formats the other metadata into a suitable data structure for transfer to the database 464.

Interconnecting the Camera, PDA and Database.
Data transfer between the A-Box and PDA may be by corded or wireless link. For example the PDA may have an infra-red port for the transfer of data linking with a corresponding infra-red port on the A-Box. Likewise the PDA may be linked to the database by a corded or wireless link. The link from the PDA to the database may be via a telephone link, or by direct radio link. The PDA may be linked to the database via the internet.
Modifications.
The A-BOX and the PDA 469 are shown as items separate from the camera 460. The A-Box may be replaced by a processor, e.g. a computer built into the camera. Alternatively both the A-Box and the PDA may be replaced by a processor built into the camera.
Whilst the invention has been described by way of example with reference to tape, the invention may be applied to other recording media. For example tapes may be replaced by discs such as optical or magneto-optical discs or by computer hard discs.
References
[1] Introduction to the 4:2:2 Digital Video Tape Recorder - Stephen Gregory, Pentech Press 1998, ISBN 0-7273-0903-X
[2] EBU/SMPTE Task Force for Harmonised Standards for the Exchange of Programme Material as Bit Streams, Final Report: Analyses and Results, Sept 1998.
[3] EBU/IFTA Forum "Paving the Way to Future TV Archives", Vienna, 18 June 1999, M.F. Vetter, TASC Inc.
[4] MPEG2 - ISO/IEC 13818-2

Modifications.
Although the foregoing description describes the embodiments of the invention in relation to video material, the invention may be applied to audio material and/or to data.
Although the foregoing description describes the embodiments of the invention in relation to material recorded on a recording medium, and the MURNs are applied to recorded material, the MURNs may be embedded in material from a live source and transmitted to a processor or to a transmission and distribution network as streamed and unrecorded material.

Claims (1)

1. A video and/or audio signal processing system comprising a recorder for recording video and/or audio material on a recording medium, the recorder including a first generator for generating first material identifiers for identifying respective pieces of material on the medium such that each piece is differentiated from other pieces on the medium, and a second generator for generating second, universally unique, identifiers for pieces of material, second identifiers being generated in respect of one or more of the first identifiers.

2. A system according to claim 1, wherein the recording medium has an identifier which identifies the medium additionally to the first identifiers which identify material recorded thereon, and the second generator associates the second identifiers with the medium identifier and the first identifiers in combination.

3. A system according to claim 1 or 2, wherein a third identifier identifying the machine which initially produces the video and/or audio material is produced and the second generator associates the second identifiers with the medium identifier and the first identifiers and the third identifiers in combination.

4. A system according to claim 1, 2 or 3, wherein the second identifiers are UMIDs.

5. A system according to claim 1, 2, 3 or 4, wherein the first identifiers are recorded on the medium.

6. A system according to any preceding claim, wherein the first identifiers comprise material reference numbers.
7. A system according to claim 6, wherein the first identifiers are recorded in user bits of time codes.

8. A system according to any one of claims 1 to 7, wherein the medium identifier is recorded on the medium.

9. A system according to any one of claims 1 to 8, wherein the medium is contained in a housing.
10. A system according to claim 9, having a data store supported by the housing and additional to the medium, and wherein the data store stores at least the medium identifier.

11. A system according to claim 10, wherein at least one first identifier is stored in the said data store.

12. A system according to claim 10 or 11 when dependent on claim 3, wherein the third identifier is recorded in the said data store.

13. A system according to any one of claims 9 to 12, wherein the housing has a label on which data may be written.

14. A system according to any one of claims 2 to 13, wherein the medium identifier is written on the housing.

15. A system according to any preceding claim, further comprising a database processor arranged to associate the second identifiers with at least the first identifiers or with the first identifiers and one or more of the medium identifiers and the third identifiers.
16. A recorder for recording video and/or audio material on a recording medium and including a first generator for generating first material identifiers for identifying respective pieces of material on the medium such that each piece is differentiated from other pieces on the medium, and a second generator for generating second, universally unique, identifiers for pieces of material, the second generator associating the second identifiers with the first identifiers.

17. A recorder according to claim 16, wherein a medium identifier is recorded on the medium.

18. A recorder according to claim 16 or 17, for recording material on a medium contained in a housing which supports a data store additional to the medium, and including a data recording device for recording at least a medium identifier in the data store.

19. A recorder according to claim 17 or 18, wherein the data recording device is arranged to record at least one of the first identifiers in the data store.

20. A recorder according to claim 19, wherein at least the most recently generated of the first identifiers is recorded in the data store.
21. A recorder according to any one of claims 17 to 20, wherein the recorder is arranged to produce a machine identifier identifying the recorder and to record the machine identifier on the medium and/or in the data store.

22. A recorder according to claim 21 when dependent on claim 18, wherein the recorder is arranged to record the machine identifier in the data store.
23. A device for reproducing video and/or audio material recorded on a recording medium, the medium having at least first, material, identifiers associated therewith and identifying the or each piece of material recorded thereon, the reproducing device having a generator for generating second, universally unique, identifiers for pieces of material, the second generator associating the second identifiers with the first identifiers.

24. A device according to claim 23, wherein the second generator generates a third identifier identifying the device.

25. A device according to claim 23 or 24, wherein the device reproduces a medium ID identifying the recording medium from the medium and/or from a data store associated with the medium.

26. A device according to claim 23, 24 or 25, wherein the device reproduces the material identifier from the medium and/or from a data store associated with the medium.

27. A device according to claim 23, arranged to reproduce material recorded on a medium which is contained in a housing supporting a data store additional to the medium, and to read data from the said data store, the second identifiers being generated in dependence on data in the store.

28. A device according to claim 23 or 24, wherein the second identifier generator is arranged to derive UMIDs from one or more of tape ID, machine ID, and MURN.
29. A recording medium on which audio and/or video material is recorded, the medium having recorded thereon material identifiers identifying the recorded material, the material identifiers being in user bits of time code recorded on the medium.

30. A medium according to claim 29, further comprising a data store supported by a housing which houses the medium, the data store storing at least the last recorded of the first identifiers.
31. A video and/or audio signal processing system comprising a recorder for recording video and/or audio material on a recording medium, the recording medium having an identifier which identifies the medium, the recorder including a first generator for generating first material identifiers for identifying respective pieces of material on the medium such that each piece is differentiated from other pieces on the medium.

32. A video and/or audio signal processing system according to claim 31, and comprising a second generator for generating second, universally unique, identifiers for pieces of material, second identifiers being generated in respect of one or more of the first identifiers.

33. A recording medium substantially as hereinbefore described with reference to the accompanying drawings.

34. A reproducing device substantially as hereinbefore described with reference to the accompanying drawings.

35. A recorder substantially as hereinbefore described with reference to the accompanying drawings.

36. A video and/or audio signal processing system substantially as hereinbefore described with reference to the accompanying drawings.
GB0008436A 2000-04-05 2000-04-05 Identifying video and/or audio material Withdrawn GB2361130A (en)

Priority Applications (8)

Application Number Priority Date Filing Date Title
GB0008436A GB2361130A (en) 2000-04-05 2000-04-05 Identifying video and/or audio material
PCT/GB2001/001458 WO2001075885A2 (en) 2000-04-05 2001-03-30 Identifying, recording and reproducing information
AU42651/01A AU4265101A (en) 2000-04-05 2001-03-30 Identifying, recording and reproducing information
EP08005086.7A EP1939878A3 (en) 2000-04-05 2001-03-30 Identifying, recording and reproducing information
JP2001573478A JP2003529877A (en) 2000-04-05 2001-03-30 Identification, recording and playback information system
EP01915566A EP1208566A2 (en) 2000-04-05 2001-03-30 Identifying, recording and reproducing information
US10/016,828 US7778516B2 (en) 2000-04-05 2001-12-04 Identifying, recording and reproducing information
US11/946,099 US20080069515A1 (en) 2000-04-05 2007-11-28 Identifying, recording and reproducing information

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB0008436A GB2361130A (en) 2000-04-05 2000-04-05 Identifying video and/or audio material

Publications (2)

Publication Number Publication Date
GB0008436D0 GB0008436D0 (en) 2000-05-24
GB2361130A true GB2361130A (en) 2001-10-10

Family

ID=9889315

Family Applications (1)

Application Number Title Priority Date Filing Date
GB0008436A Withdrawn GB2361130A (en) 2000-04-05 2000-04-05 Identifying video and/or audio material

Country Status (1)

Country Link
GB (1) GB2361130A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7834916B2 (en) * 2002-04-05 2010-11-16 Sony Corporation Video content editing support system and method

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2055503A (en) * 1979-07-30 1981-03-04 Atlantic Richfield Co Information carrier
GB2131996A (en) * 1982-11-30 1984-06-27 George Saint Data storage devices
US4466029A (en) * 1980-08-08 1984-08-14 Sony Corporation Method and apparatus for detecting an edit point on a record medium
JPS62233989A (en) * 1986-04-03 1987-10-14 Canon Inc Recording and reproducing device
EP0279885A1 (en) * 1987-02-27 1988-08-31 Sony Corporation Video signal reproducing apparatus


Also Published As

Publication number Publication date
GB0008436D0 (en) 2000-05-24

Similar Documents

Publication Publication Date Title
US10200767B2 (en) Audio and/or video generation apparatus and method of generating audio and/or video signals
EP1947648B1 (en) Video processing apparatus and method
US7778516B2 (en) Identifying, recording and reproducing information
EP1188164B1 (en) Identifying and processing of audio and/or video material
US9311962B2 (en) Audio and/or video generation apparatus and method of generating audio and/or video signals
GB2356080A (en) Generation system for audio, video or a combination thereof where metadata is generated and stored or recorded with the audio/video signal
GB2361128A (en) Video and/or audio processing apparatus
GB2361127A (en) Audio/video reproduction via a communications network
GB2361096A (en) Metadata generation in audio or video apparatus
GB2361130A (en) Identifying video and/or audio material
GB2361098A (en) Editing system and method using metadata
GB2361090A (en) Generating sample images to assist video editing
GB2362254A (en) Identifying material

Legal Events

Date Code Title Description
WAP Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1)