GB2361131A - Identifying the type of source of video/audio data - Google Patents

Identifying the type of source of video/audio data

Info

Publication number
GB2361131A
GB2361131A (application GB0008440A)
Authority
GB
United Kingdom
Prior art keywords
identifier
source
data field
type
video
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
GB0008440A
Other versions
GB0008440D0 (en)
Inventor
Morgan William Amos David
James Hedley Wilkinson
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sony Europe Ltd
Original Assignee
Sony United Kingdom Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony United Kingdom Ltd filed Critical Sony United Kingdom Ltd
Priority to GB0008440A priority Critical patent/GB2361131A/en
Publication of GB0008440D0 publication Critical patent/GB0008440D0/en
Priority to CNB018015697A priority patent/CN100385559C/en
Priority to CA002375688A priority patent/CA2375688A1/en
Priority to BR0105577-1A priority patent/BR0105577A/en
Priority to JP2001573479A priority patent/JP4711379B2/en
Priority to PCT/GB2001/001461 priority patent/WO2001075886A1/en
Priority to EP01915568A priority patent/EP1188164B1/en
Priority to AU42652/01A priority patent/AU4265201A/en
Priority to KR1020017015673A priority patent/KR100788628B1/en
Publication of GB2361131A publication Critical patent/GB2361131A/en
Priority to US10/008,072 priority patent/US7526178B2/en
Withdrawn legal-status Critical Current


Classifications

    • G PHYSICS > G11 INFORMATION STORAGE > G11B INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER > G11B27/00 Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B27/031 Electronic editing of digitised analogue information signals, e.g. audio or video signals
    • G11B27/11 Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information not detectable on the record carrier
    • G11B27/28 Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier by using information signals recorded by the same method as the main recording
    • G11B27/323 Time code signal, e.g. on a cue track as SMPTE- or EBU-time code
    • G11B27/328 Table of contents on a tape [TTOC]
    • G11B27/34 Indicating arrangements
    • G11B2220/20 Record carriers by type: Disc-shaped record carriers
    • G11B2220/90 Record carriers by type: Tape-like record carriers

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Television Signal Processing For Recording (AREA)

Abstract

A method of identifying video and/or audio material comprising providing a material identifier, the identifier having a data field containing data indicating the type of material source. The identifier may be part of a signal or associated with a signal and preferably indicates whether the source is a live source or a reproducer of recorded material. The identifier may be a UMID.

Description

Identifying video and/or audio material

The present invention
relates to identifying video and/or audio material.
It has been proposed to identify video and audio material using a material identifier. An example of such an identifier is a UMID. Some material may not have an identifier at source and one needs to be associated with the material. Some processes performed on material generate new material from old material and thus involve generating a new identifier.
Some recorded material is reproduced from the record e.g. tape or disc and recorded again, without changing the content. UMIDs have instance numbers which are used to denote different instances of the same material. For example the first record of material has instance number zero, the next recording of the same material has instance number one. The identifier is otherwise unchanged.
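The instance-number convention above can be sketched as follows. The names and types here are illustrative only, not taken from the SMPTE standard: the point is that re-recording unchanged content advances the instance number while the material number stays fixed.

```python
# Sketch of UMID instance numbering: the first recording of material has
# instance number 0; each subsequent recording of the same content gets
# the next instance number, the material number being unchanged.
from dataclasses import dataclass, replace

@dataclass(frozen=True)
class Umid:
    material_number: bytes   # 16-byte globally unique material number
    instance_number: int     # 3-byte instance counter

def re_record(umid: Umid) -> Umid:
    """Re-record without changing content: same material, next instance."""
    return replace(umid, instance_number=umid.instance_number + 1)

original = Umid(material_number=b"\x00" * 16, instance_number=0)
copy1 = re_record(original)
assert copy1.material_number == original.material_number
assert copy1.instance_number == 1
```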
Material from a live source such as a camera or microphone also undergoes change of form without being recorded e.g. analogue to digital, digital to compressed digital, without changing the content of the material.
According to the present invention, there is provided a video and/or audio signal processing system in which video and/or audio material represented by a video and/or audio signal and emanating from a source is associated with a material identifier having a data field identifying the type of source.
Thus the type of source is indicated, e.g. whether the source is a live source such as a camera or microphone, or a source of recorded material. Material which is streamed from a live source but not recorded has, for example, instance numbers allocated to it which are denoted by a type indicator as being instances of streamed, unrecorded data. Material from recorded sources has instance numbers allocated to it with a different type indicator from streamed instance numbers.
For streamed material the identifier, e.g. the UMID, is embedded in the data stream; for example it is carried in the vertical blanking interval. Some processes in the processing chain, such as encoding and decoding, pass the vertical blanking interval unchanged or do not provide for processing it, giving a potentially false instance number. Thus, in accordance with an embodiment of the invention, such an identifier is denoted by the type indicator as applying only to recorded instances.
For a better understanding of the present invention, reference will now be made by way of example to the accompanying drawings in which:
Figure 1 is a schematic diagram of the data structure of a UMID modified in accordance with the present invention;
Figure 2 is a schematic block diagram of a material processing system which uses the structure of Figure 1; and
Figure 3 is a schematic diagram of a metadata structure in a database.
UMIDs UMIDs have been proposed as a standard issued by SMPTE and are known from, for example, the SMPTE Journal, March 2000. Referring to Figure 1A, an extended UMID is shown. It comprises a first set of 32 bytes of basic UMID and a second set of 32 bytes of signature metadata.
The first set of 32 bytes is the basic UMID. The components are:
A 12-byte Universal Label to identify this as a SMPTE UMID. It defines the type of material which the UMID identifies and also defines the methods by which the globally unique Material and locally unique Instance numbers are created.
A 1-byte length value to define the length of the remaining part of the UMID.
A 3-byte Instance number which is used to distinguish between different 'instances' of material with the same Material number.
A 16-byte Material number which is used to identify each clip. Each Material number is the same for related instances of the same material.
The second set of 32 bytes is the signature metadata, a set of packed metadata items used to create an extended UMID. The extended UMID comprises the basic UMID followed immediately by the signature metadata, which comprises:
An 8-byte time/date code identifying the time and date of the Content Unit creation.
A 12-byte value which defines the spatial co-ordinates at the time of Content Unit creation.
3 groups of 4-byte codes which register the country, organisation and user.
Each component of the basic and extended UMIDs will now be defined in turn.
The 12-byte Universal Label The first 12 bytes of the UMID provide identification of the UMID by the registered string value defined in table 1.
Byte No.  Description                                 Value (hex)
1         Object Identifier                           06h
2         Label size                                  0Ch
3         Designation: ISO                            2Bh
4         Designation: SMPTE                          34h
5         Registry: Dictionaries                      01h
6         Registry: Metadata Dictionaries             01h
7         Standard: Dictionary Number                 01h
8         Version number                              01h
9         Class: Identification and location          01h
10        Sub-class: Globally Unique Identifiers      01h
11        Type: UMID (Picture, Audio, Data, Group)    01h, 02h, 03h, 04h
12        Type: Number creation method                XXh

Table 1: Specification of the UMID Universal Label
The hex values in Table 1 may be changed: the values given are examples. Also, bytes 1-12 may have designations other than those shown by way of example in the table. Referring to Table 1, in the example shown byte 4 indicates that bytes 5-12 relate to a data format agreed by SMPTE. Byte 5 indicates that bytes 6 to 10 relate to "dictionary" data. Byte 6 indicates that such data is "metadata" defined by bytes 7 to 10. Byte 7 indicates the part of the dictionary containing metadata defined by bytes 9 and 10. Byte 8 indicates the version of the dictionary. Byte 9 indicates the class of data and byte 10 indicates a particular item in the class.
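The byte layout described above can be sketched as follows. This is an illustrative construction only, using the example hex values from Table 1 (which the text notes may change); the function name is ours, not from the standard.

```python
# Assemble the 12-byte SMPTE Universal Label from the example values in
# Table 1. Byte 11 selects the material type and byte 12 the number
# creation method.
PICTURE, AUDIO, DATA, GROUP = 0x01, 0x02, 0x03, 0x04

def universal_label(material_type: int, creation_method: int) -> bytes:
    return bytes([
        0x06,  # byte 1:  object identifier
        0x0C,  # byte 2:  label size
        0x2B,  # byte 3:  designation: ISO
        0x34,  # byte 4:  designation: SMPTE
        0x01,  # byte 5:  registry: dictionaries
        0x01,  # byte 6:  registry: metadata dictionaries
        0x01,  # byte 7:  standard: dictionary number
        0x01,  # byte 8:  version number
        0x01,  # byte 9:  class: identification and location
        0x01,  # byte 10: sub-class: globally unique identifiers
        material_type,    # byte 11: UMID type (picture/audio/data/group)
        creation_method,  # byte 12: material/instance number creation method
    ])

label = universal_label(PICTURE, 0x00)
assert len(label) == 12 and label[10] == PICTURE
```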
The UMID type (byte 11) has 4 separate values to identify each of 4 different data types as follows:
'01h' = UMID for Picture material
'02h' = UMID for Audio material
'03h' = UMID for Data material
'04h' = UMID for Group material (i.e. a combination of related essence).
The last (12th) byte of the 12 byte label identifies the methods by which the material and instance numbers are created. This byte is divided into top and bottom nibbles where the top nibble defines the method of Material number creation and the bottom nibble defines the method of Instance number creation.
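The nibble split of byte 12 described above can be sketched directly; the function names here are illustrative:

```python
# Byte 12 packs two creation methods into one byte: the top nibble gives
# the Material number creation method, the bottom nibble the Instance
# number creation method.
def pack_methods(material_method: int, instance_method: int) -> int:
    assert 0 <= material_method <= 15 and 0 <= instance_method <= 15
    return (material_method << 4) | instance_method

def unpack_methods(byte12: int) -> tuple:
    """Return (material_method, instance_method) from byte 12."""
    return byte12 >> 4, byte12 & 0x0F

b = pack_methods(1, 15)
assert unpack_methods(b) == (1, 15)
```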
Length The Length is a 1-byte number with the value '13h' for basic UMIDs and '33h' for extended UMIDs.
Instance Number The Instance number is a unique 3-byte number which is created by one of several means defined by the standard. It provides the link between a particular 'instance' of a clip and externally associated metadata. Without this instance number, all material could be linked to any instance of the material and its associated metadata.
The creation of a new clip requires the creation of a new Material number together with a zero Instance number. Therefore, a non-zero Instance number indicates that the associated clip is not the source material. An Instance number is primarily used to identify associated metadata related to any particular instance of a clip.
Material Number The 16-byte Material number is a non-zero number created by one of several means identified in the standard. The number is dependent on a 6-byte registered port ID number, time and a random number generator.
Signature Metadata Any component from the signature metadata may be null-filled where no meaningful value can be entered. Any null-filled component is wholly null-filled to clearly indicate to a downstream decoder that the component is not valid.
The Time-Date Format The date-time format is 8 bytes where the first 4 bytes are a UTC (Universal Time Code) based time component. The time is defined either by an AES3 32-bit audio sample clock or SMPTE 12M depending on the essence type.
The second 4 bytes define the date based on the Modified Julian Date (MJD) as defined in SMPTE 309M. This counts up to 999,999 days after midnight on the 17th November 1858 and allows dates to the year 4597.
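The MJD day count described above converts to and from calendar dates straightforwardly; a minimal sketch, using the 17 November 1858 epoch stated in the text:

```python
# Convert between a Modified Julian Date day count and a calendar date.
# Day 0 is midnight on 17 November 1858, per SMPTE 309M as cited above.
from datetime import date, timedelta

MJD_EPOCH = date(1858, 11, 17)

def mjd_to_date(mjd_days: int) -> date:
    return MJD_EPOCH + timedelta(days=mjd_days)

def date_to_mjd(d: date) -> int:
    return (d - MJD_EPOCH).days

assert mjd_to_date(0) == MJD_EPOCH
assert date_to_mjd(date(2000, 4, 5)) == 51639  # this patent's filing date
```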
The Spatial Co-ordinate Format The spatial co-ordinate value consists of three components defined as follows:
Altitude: 8 decimal numbers specifying up to 99,999,999 metres.
Longitude: 8 decimal numbers specifying East/West 180.00000 degrees (5 decimal places active).
Latitude: 8 decimal numbers specifying North/South 90.00000 degrees (5 decimal places active).
The Altitude value is expressed as a value in metres from the centre of the earth, thus allowing altitudes below sea level.
It should be noted that although spatial co-ordinates are static for most clips, this is not true for all cases. Material captured from a moving source such as a camera mounted on a vehicle may show changing spatial coordinate values.
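The three-component co-ordinate format above can be sketched as follows. The text specifies 8 decimal digits per component but not the on-the-wire packing, so the digit-string representation here is an assumption, as is the reference radius used for sea level:

```python
# Each spatial component is held as 8 decimal digits, e.g. a longitude of
# 180.00000 degrees (5 decimal places active) becomes "18000000".
EARTH_RADIUS_M = 6_378_137  # assumed reference radius for sea level

def encode_component(value: int) -> str:
    """Encode one component as 8 decimal digits."""
    assert 0 <= value <= 99_999_999, "component exceeds 8 decimal digits"
    return f"{value:08d}"

# Altitude is measured in metres from the centre of the earth, so sea
# level itself is a large positive value and sub-sea altitudes stay positive:
assert encode_component(EARTH_RADIUS_M) == "06378137"
# 51.50000 degrees expressed in hundred-thousandths of a degree:
assert encode_component(int(51.5 * 100_000)) == "05150000"
```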
Country Code The Country code is an abbreviated 4-byte alpha-numeric string according to the set defined in ISO 3166. Countries which are not registered can obtain a registered alpha-numeric string from the SMPTE Registration Authority.
Organisation Code The Organisation code is an abbreviated 4-byte alpha-numeric string registered with SMPTE. Organisation codes have meaning only in relation to their registered Country code so that Organisation codes can have the same value in different countries.
User Code The User code is a 4-byte alpha-numeric string assigned locally by each organisation and is not globally registered. User codes are defined in relation to their registered Organisation and Country codes so that User codes may have the same value in different organisations and countries.
Freelance Operators Freelance operators may use their country of domicile for the Country code and use the Organisation and User codes concatenated into, e.g., an 8-byte code which can be registered with SMPTE. These freelance codes may start with the '~' symbol (ISO 8859 character number 7Eh) followed by a registered 7-digit alphanumeric string.
In accordance with an embodiment of the present invention, the UMID of Figure 1A is modified as shown in Figure 1B. The Instance number field contains a number defined by byte 12 of the universal label. That byte has unassigned values 3 to 15.
One of those values is assigned to indicate that the instance number relates to streamed rather than recorded material. In the preferred embodiment of Figure 1B, the value 15 is chosen for streamed material.
Another of those values is assigned to indicate that the instance number relates to recorded rather than streamed material. In the preferred embodiment the value 14 is chosen for recorded material.
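The two assigned values can be read back from byte 12 as sketched below. Placing the indicator in the bottom (instance-number) nibble is an assumption based on the byte-12 description earlier; the function name is ours:

```python
# Values 15 and 14 of the instance-number creation method mark streamed
# (unrecorded) and recorded material respectively, per the embodiment.
STREAMED, RECORDED = 15, 14

def instance_kind(byte12: int) -> str:
    method = byte12 & 0x0F  # bottom nibble: instance number creation method
    if method == STREAMED:
        return "streamed"
    if method == RECORDED:
        return "recorded"
    return "other"

assert instance_kind(0x1F) == "streamed"
assert instance_kind(0x1E) == "recorded"
```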
Referring to Figure 2, a source 2 of original unrecorded video and/or audio material, which may be a camera and/or microphone, produces for example analogue data.
For simplicity assume the source is a camera producing only video. A UMID generator 4 generates a UMID with instance number 0 and type indicator 15, denoting unrecorded material. The UMID is embedded in the vertical blanking interval of the video by a multiplexer 6. The video is output to a processing chain 14 having at least one processor 8. The processor 8 changes the form of the video, e.g. from analogue to digital, in a way that does not change the content. Thus the material number of the UMID does not change but the instance number does. If the material is processed in a way that fails to change the instance number then the instance number may become false.
The processing chain may include a recorder 16. The recorder records the material and allocates a recorded material instance number to the material with type code 14 and the appropriate instance number, the material number being unchanged.
UMIDs generated at the source, the processor 8 and/or the recorder 16 may be communicated to a metadata base 10, where metadata generated, e.g. by data entry means 12, is associated with the UMIDs. Examples of metadata are given in the section Metadata below.
Metadata The following is provided, by way of example, to illustrate the possible types of metadata generated during the production of a programme, and one possible organisational approach to structuring that metadata.
Figure 3 illustrates an example structure for organising metadata. A number of tables each comprising a number of fields containing metadata are provided. The tables may be associated with each other by way of common fields within the respective tables, thereby providing a relational structure. Also, the structure may comprise a number of instances of the same table to represent multiple instances of the object that the table may represent. The fields may be formatted in a predetermined manner. The size of the fields may also be predetermined. Example sizes include "Int" which represents 2 bytes, "Long Int" which represents 4 bytes and "Double" which represents 8 bytes. Alternatively, the size of the fields may be defined with reference to the number of characters to be held within the field such as, for example, 8, 10, 16, 32, 128, and 255 characters.
Turning to the structure in more detail, there is provided a Programme Table. The Programme Table comprises a number of fields including Programme ID (PID), Title, Working Title, Genre ID, Synopsis, Aspect Ratio, Director ID and Picturestamp.
Associated with the Programme Table is a Genre Table, a Keywords Table, a Script Table, a People Table, a Schedule Table and a plurality of Media Object Tables.
The Genre Table comprises a number of fields including Genre ID, which is associated with the Genre ID field of the Programme Table, and Genre Description.
The Keywords Table comprises a number of fields including Programme ID, which is associated with the Programme ID field of the Programme Table, Keyword ID and Keyword.
The Script Table comprises a number of fields including Script ID, Script Name, Script Type, Document Format, Path, Creation Date, Original Author, Version, Last Modified, Modified By, PID associated with Programme ID, and Notes. The People Table comprises a number of fields including Image.
The People Table is associated with a number of Individual Tables and a number of Group Tables. Each Individual Table comprises a number of fields including Image. Each Group Table comprises a number of fields including Image. Each Individual Table is associated with either a Production Staff Table or a Cast Table.
The Production Staff Table comprises a number of fields including Production Staff ID, Surname, Firstname, Contract ID, Agent, Agency ID, Email, Address, Phone Number, Role ID, Notes, Allergies, DOB, National Insurance Number, Bank ID and Picture Stamp.
The Cast Table comprises a number of fields including Cast ID, Surname, Firstname, Character Name, Contract ID, Agent, Agency ID, Equity Number, E-mail, Address, Phone Number, DOB, Bank ID and Picture Stamp. Associated with the Production Staff Table and Cast Table are a Bank Details Table and an Agency Table.
The Bank Details Table comprises a number of fields including Bank ID, which is associated with the Bank ID field of the Production Staff Table and the Bank ID field of the Cast Table, Sort Code, Account Number and Account Name.
The Agency Table comprises a number of fields including Agency ID, which is associated with the Agency ID field of the Production Staff Table and the Agency ID field of the Cast Table, Name, Address, Phone Number, Web Site and E-mail and a Picture Stamp. Also associated with the Production Staff Table is a Role Table.
The Role Table comprises a number of fields including Role ID, which is associated with the Role ID field of the Production Staff Table, Function and Notes and a Picture Stamp. Each Group Table is associated with an Organisation Table.
The Organisation Table comprises a number of fields including Organisation ID, Name, Type, Address, Contract ID, Contact Name, Contact Phone Number and Web Site and a Picture Stamp.
Each Media Object Table comprises a number of fields including Media Object ID, Name, Description, Picturestamp, PID, Format, Schedule ID, Script ID and Master ID. Associated with each Media Object Table are the People Table, a Master Table, a Schedule Table, a Storyboard Table, a Script Table and a number of Shot Tables.
The Master Table comprises a number of fields including Master ID, which is associated with the Master ID field of the Media Object Table, Title, Basic UMID, EDL ID, Tape ID and Duration and a Picture Stamp.
The Schedule Table comprises a number of fields including Schedule ID, Schedule Name, Document Format, Path, Creation Date, Original Author, Start Date, End Date, Version, Last Modified, Modified By and Notes and PID which is associated with the programme ID.
The Contract Table contains: a Contract ID, which is associated with the Contract ID of the Production Staff, Cast and Organisation Tables; Commencement Date, Rate, Job Title, Expiry Date and Details.
The Storyboard Table comprises a number of fields including Storyboard ID, which is associated with the Storyboard ID of the Shot Table, Description, Author, Path and Media ID.
Each Shot Table comprises a number of fields including Shot ID, PID, Media ID, Title, Location ID, Notes, Picturestamp, Script ID, Schedule ID, and Description. Associated with each Shot Table are the People Table, the Schedule Table, the Script Table, a Location Table and a number of Take Tables.
The Location Table comprises a number of fields including Location ID, which is associated with the Location ID field of the Shot Table, GPS, Address, Description, Name, Cost Per Hour, Directions, Contact Name, Contact Address and Contact Phone Number and a Picture Stamp.
Each Take Table comprises a number of fields including Basic UMID, Take Number, Shot ID, Media ID, Timecode IN, Timecode OUT, Sign Metadata, Tape ID, Camera ID, Head Hours, Videographer, IN Stamp, OUT Stamp, Lens ID, AUTOID ingest ID and Notes. Associated with each Take Table are a Tape Table, a Task Table, a Camera Table, a Lens Table, an Ingest Table and a number of Take Annotation Tables.
The Ingest Table contains an Ingest ID, which is associated with the Ingest ID in the Take Table, and a Description.
The Tape Table comprises a number of fields including Tape ID, which is associated with the Tape ID field of the Take Table, PID, Format, Max Duration, First Usage, Max Erasures, Current Erasure, ETA ( estimated time of arrival) and Last Erasure Date and a Picture Stamp.
The Task Table comprises a number of fields including Task ID, PID, Media ID, Shot ID, which are associated with the Media ID and Shot ID fields respectively of the Take Table, Title, Task Notes, Distribution List and CC List. Associated with the Task Table is a Planned Shot Table.
The Planned Shot Table comprises a number of fields including Planned Shot ID, PID, Media ID, Shot ID, which are associated with the PID, Media ID and Shot ID respectively of the Task Table, Director, Shot Title, Location, Notes, Description, Videographer, Due Date, Programme Title, Media Title, Aspect Ratio and Format.
The Camera Table comprises a number of fields including Camera ID, which is associated with the Camera ID field of the Take Table, Manufacturer, Model, Format, Serial Number, Head Hours, Lens ID, Notes, Contact Name, Contact Address and Contact Phone Number and a Picture Stamp.
The Lens Table comprises a number of fields including Lens ID, which is associated with the Lens ID field of the Take Table, Manufacturer, Model, Serial Number, Contact Name, Contact Address and Contact Phone Number and a Picture Stamp.
Each Take Annotation Table comprises a number of fields including Take Annotation ID, Basic UMID, Timecode, Shutter Speed, Iris, Zoom, Gamma, Shot Marker ID, Filter Wheel, Detail and Gain. Associated with each Take Annotation Table is a Shot Marker Table.
The Shot Marker Table comprises a number of fields including Shot Marker ID, which is associated with the Shot Marker ID of the Take Annotation Table, and Description.
Modifications
For recorded material, the UMID may be recorded separately from the video.
Whilst the foregoing description refers to video, the techniques may be applied also to audio material.

Claims (1)

  1. A video and/or audio signal processing system in which video and/or audio material represented by a video and/or audio signal and emanating from a source is associated with a material identifier having a data field identifying the type of source.
    2. A system according to claim 1, wherein the said data field identifies whether the source is a reproducer of recorded material or a live source.
    3. A system according to claim 1 or 2, wherein the identifier forms part of the signal.
    4. A system according to claim 1 or 2, wherein the identifier is separate from the signal and the signal is associated with the identifier.
    5. A system according to claim 1, 2, 3 or 4, wherein the identifier is a UMID.
    6. A system according to claim 5, wherein the data field is the "Type" field of the UMID.
    7. A system according to claim 1, 2, 3, 4, 5 or 6, in combination with a database processor in which the identifier is related to metadata relating to the material.
    8. A method of identifying video and/or audio material comprising providing a material identifier, the identifier having a data field containing data indicating the type of material source.
    9. A method according to claim 8, wherein the said data field distinguishes between recorded signals and signals from live sources.
    10. A method according to claim 8 or 9 wherein the identifier is a UMID.
    11. A method according to claim 10, wherein the data field is the instance number field.
    12. A method according to claim 8, 9, 10 or 11, wherein the identifier is associated with the material.
    13. A method according to claim 12, wherein the identifier is in the material.
    14. A method according to any one of claims 8 to 13, wherein the identifier is generated at the source of the material.
    15. Apparatus for identifying video and/or audio material comprising an identifier generator for providing a material identifier, the identifier having a data field containing data indicating the type of material source.
    16. Apparatus according to claim 15, wherein the said data field distinguishes between recorded signals and signals from live sources.
    17. Apparatus according to claim 15 or 16, wherein the identifier is a UMID.
    18. Apparatus according to claim 17, wherein the data field is the instance number field.
    19. Apparatus according to claim 15, 16, 17 or 18, wherein the identifier is associated with the material.
    20. Apparatus according to claim 19, wherein the identifier is in the material.
    21. A signal representing video and/or audio material, the signal including a material identifier having a data field identifying the type of source of the signal.
    22. A signal according to claim 21, wherein the type identifier indicates whether the source is a live source or not.
    23. A recording medium on which is recorded a signal representing video and/or audio material, and a material identifier having a data field identifying the type of source of the signal.
    24. A recording medium according to claim 23, wherein the type identifier indicates whether the source is a live source or not.
    25. A system substantially as hereinbefore described with reference to the accompanying drawings.
    26. A method substantially as hereinbefore described with reference to the accompanying drawings.
    27. Apparatus substantially as hereinbefore described with reference to the accompanying drawings.
GB0008440A 2000-04-05 2000-04-05 Identifying the type of source of video/audio data Withdrawn GB2361131A (en)

Priority Applications (10)

Application Number Priority Date Filing Date Title
GB0008440A GB2361131A (en) 2000-04-05 2000-04-05 Identifying the type of source of video/audio data
KR1020017015673A KR100788628B1 (en) 2000-04-05 2001-03-30 Identifying and processing of audio and/or video material
JP2001573479A JP4711379B2 (en) 2000-04-05 2001-03-30 Audio and / or video material identification and processing method
CA002375688A CA2375688A1 (en) 2000-04-05 2001-03-30 Identifying and processing of audio and/or video material
BR0105577-1A BR0105577A (en) 2000-04-05 2001-03-30 Processor and method for processing video and / or audio material identifiers and for processing video and / or audio material, computer program product, video and / or audio signal processing system, method and apparatus for identify video and / or audio material signal representing video and / or audio material and recording medium
CNB018015697A CN100385559C (en) 2000-04-05 2001-03-30 Identifying and processing of audio and/or video material
PCT/GB2001/001461 WO2001075886A1 (en) 2000-04-05 2001-03-30 Identifying and processing of audio and/or video material
EP01915568A EP1188164B1 (en) 2000-04-05 2001-03-30 Identifying and processing of audio and/or video material
AU42652/01A AU4265201A (en) 2000-04-05 2001-03-30 Identifying and processing of audio and/or video material
US10/008,072 US7526178B2 (en) 2000-04-05 2001-12-04 Identifying and processing of audio and/or video material

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB0008440A GB2361131A (en) 2000-04-05 2000-04-05 Identifying the type of source of video/audio data

Publications (2)

Publication Number Publication Date
GB0008440D0 GB0008440D0 (en) 2000-05-24
GB2361131A true GB2361131A (en) 2001-10-10

Family

ID=9889319

Family Applications (1)

Application Number Title Priority Date Filing Date
GB0008440A Withdrawn GB2361131A (en) 2000-04-05 2000-04-05 Identifying the type of source of video/audio data

Country Status (1)

Country Link
GB (1) GB2361131A (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0580367A2 (en) * 1992-07-24 1994-01-26 Sony Corporation Video signal transmission, recording and reproduction
EP0726680A2 (en) * 1995-02-09 1996-08-14 Mitsubishi Denki Kabushiki Kaisha Multimedia information processing system
GB2301930A (en) * 1995-06-06 1996-12-18 Sony Corp Data reproducing system with copy prohibition means
EP0757488A2 (en) * 1995-08-03 1997-02-05 Mitsubishi Denki Kabushiki Kaisha A multimedia information processing apparatus
WO1998033325A2 (en) * 1997-01-27 1998-07-30 Koninklijke Philips Electronics N.V. Method and system for transferring content information and supplemental information relating thereto

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2003073426A1 (en) 2002-02-28 2003-09-04 Sony Corporation Data processing apparatus, method, and program
EP1480217A1 (en) * 2002-02-28 2004-11-24 Sony Corporation Data processing apparatus, method, and program
EP1480217A4 (en) * 2002-02-28 2008-05-07 Sony Corp Data processing apparatus, method, and program
US7953718B2 (en) 2002-02-28 2011-05-31 Sony Corporation Data processing apparatus, method, and program

Also Published As

Publication number Publication date
GB0008440D0 (en) 2000-05-24

Similar Documents

Publication Publication Date Title
US10200767B2 (en) Audio and/or video generation apparatus and method of generating audio and/or video signals
US8457467B2 (en) Apparatus and associated methodology of recording and accessing metadata via unique identifying indicia
EP1947648B1 (en) Video processing apparatus and method
KR100788628B1 (en) Identifying and processing of audio and/or video material
EP1986436A2 (en) Audio and/or video generation apparatus and method of generating audio and /or video signals
EP1102276A1 (en) Method of recording and accessing metadata
GB2356080A (en) Generation system for audio, video or a combination thereof where metadata is generated and stored or recorded with the audio/video signal
GB2361128A (en) Video and/or audio processing apparatus
GB2361131A (en) Identifying the type of source of video/audio data
GB2361097A (en) A system for generating audio/video productions
US20020131763A1 (en) Video processing and/or recording
GB2361127A (en) Audio/video reproduction via a communications network
GB2361096A (en) Metadata generation in audio or video apparatus
GB2361098A (en) Editing system and method using metadata
GB2361092A (en) Video recording with metadata within timecode
GB2361090A (en) Generating sample images to assist video editing
GB2361094A (en) Producing audio/visual material and identifying equipment used in its production

Legal Events

Date Code Title Description
WAP Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1)