US20010009446A1 - Editing system with router for connection to HDTV circuitry - Google Patents

Editing system with router for connection to HDTV circuitry

Info

Publication number
US20010009446A1
US20010009446A1 US09/800,867
Authority
US
United States
Prior art keywords
high definition
video data
video
router
data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US09/800,867
Inventor
Morton Tarr
Peter Fasciano
Craig Frink
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual filed Critical Individual
Priority to US09/800,867 priority Critical patent/US20010009446A1/en
Publication of US20010009446A1 publication Critical patent/US20010009446A1/en
Priority to US10/375,599 priority patent/US7046251B2/en
Abandoned legal-status Critical Current

Classifications

    • GPHYSICS
    • G11INFORMATION STORAGE
    • G11BINFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B27/00Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B27/02Editing, e.g. varying the order of information signals recorded on, or reproduced from, record carriers
    • G11B27/031Electronic editing of digitised analogue information signals, e.g. audio or video signals
    • G11B27/034Electronic editing of digitised analogue information signals, e.g. audio or video signals on discs
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/222Studio circuitry; Studio devices; Studio equipment
    • H04N5/262Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects
    • GPHYSICS
    • G11INFORMATION STORAGE
    • G11BINFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B2220/00Record carriers by type
    • G11B2220/90Tape-like record carriers
    • GPHYSICS
    • G11INFORMATION STORAGE
    • G11BINFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B27/00Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B27/02Editing, e.g. varying the order of information signals recorded on, or reproduced from, record carriers
    • G11B27/031Electronic editing of digitised analogue information signals, e.g. audio or video signals
    • G11B27/032Electronic editing of digitised analogue information signals, e.g. audio or video signals on tapes

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Television Signal Processing For Recording (AREA)

Abstract

A non-linear editor is connected to video processing equipment through a serial digital video interface to edit high definition (HD) television video data. The non-linear editor includes a randomly accessible, computer-readable and re-writeable storage medium that stores a plurality of sequences of HD digital images representing a frame or field of HD motion video data. The non-linear editor provides a configuration control signal to identify processing to be performed on the HD video data and defines a video program to be rendered using the stored HD digital images. An input serial digital interface and an output serial digital interface in the non-linear editor provide the HD video data to be edited. A multiformat video router controls the HD video data sent between the non-linear editor and the video processing equipment. The router is video interconnected to the video processing equipment and to the serial digital interfaces of the non-linear editor.

Description

    FIELD OF THE INVENTION
  • The present invention is related to a system for editing high definition television (HDTV) resolution video data. [0001]
  • BACKGROUND OF THE INVENTION
  • Separate editing systems exist for editing standard definition television (SDTV) resolution video data and for editing HDTV-resolution video data. Currently, there is a need for an HDTV editor for manipulating digital high definition (HD) video data that can be configured from a set of editing devices, such as a non-linear editor connected by a video interconnect to video processing equipment. [0002]
  • SUMMARY OF THE INVENTION
  • A non-linear editor is connected to video processing equipment through a serial digital video interface (SDI) to edit high definition television video data. [0003]
  • Accordingly, in one aspect a system edits HDTV-resolution video data. In the system, a non-linear editor includes a random access, computer-readable and re-writeable storage medium that stores a plurality of sequences of high definition (HD) digital still images in media files. Each image may represent a single frame, i.e., two fields, or a single field of HD video data. The non-linear editor provides a configuration control signal to define a video program to be rendered using the stored HD digital still images. The non-linear editor includes an input serial digital interface and an output serial digital interface to provide the HD video data to be edited. In the system, a multiformat video router directs the HD video data between the non-linear editor and video processing equipment. The router is connected by a video interconnect to the input serial interface and the output serial interface of the non-linear editor and the router receives the configuration control signal from the non-linear editor. The router is connected by a video interconnect to provide video data to an input of the video processing equipment, and is connected by a video interconnect to receive a video data output of the video processing equipment. In the system, video processing equipment has an input for receiving HD video data from the multiformat router, and an output for sending HD video data to the multiformat router. [0004]
  • The video processing equipment also has an input for receiving the configuration control signal from the non-linear editor. [0005]
  • Another aspect is a non-linear editor which may include a plurality of HD serial digital interfaces to convert a video stream to bus data. [0006]
  • Another aspect is video processing equipment including at least one HD codec for formatting the HD video data and a video effects generator for editing the HD video data. [0007]
  • Another aspect is a multiformat video router which includes a standard definition router and an HD router. [0008]
  • Another aspect is a method for storing edited HDTV-resolution video data. Edited HD video data is received from a video effects generator. The edited HD video data is compressed into an HD video data file which includes HD video data information. The compressed HD video data file is sent through a serial digital interface to a non-linear storage system. [0009]
  • Another aspect is a method for playing back HD video data. HD video data is retrieved from storage and transferred through a serial digital interface. The retrieved HD video data is decompressed and sent to a video effects generator for processing. The processed HD video data is provided as an output. [0010]
  • Another aspect is buffering the HD video data at an HD serial interface and providing the buffered HD video data to the video effects generator. [0011]
  • Another aspect is capturing the output of the video effects generator at an HD serial digital interface. [0012]
  • Another aspect is a method for editing HDTV-resolution video data. A plurality of sequences of HD digital still images is stored and a video program to be rendered using selected stored HD digital still images is defined. Devices including video processing equipment are configured for processing the selected HD still images. The selected HD still images are transferred over a video interconnection to the video processing equipment and the processed HD still images are rendered. [0013]
  • BRIEF DESCRIPTION OF THE DRAWING
  • In the drawing, [0014]
  • FIG. 1a is a block diagram of a system in one embodiment; [0015]
  • FIG. 1b illustrates a table for tracking equivalency of media data files; [0016]
  • FIG. 2 is a more detailed block diagram of the system in FIG. 1; [0017]
  • FIG. 3 is a flowchart describing how video effects are played back in the system of FIG. 2; and [0018]
  • FIG. 4 is a flowchart describing how video with video effects is stored in the system of FIG. 2. [0019]
  • DETAILED DESCRIPTION
  • FIG. 1a is a block diagram of an example system for editing high definition (HD) video data. HD video data may include any data at a resolution higher than standard definition (SD) video data, for example, data with a resolution greater than 525 scan lines and/or at more than 30 frames/sec. The HD data may be in 8 or 10-bit components. The system includes video processing equipment 110 which processes HD video data, and a router 120 which transfers HD video data to the video processing equipment 110 from the editing system 130. [0020]
  • The video processing equipment 110 may be, for example, one or more coder/decoder processors (codecs), a video effects generator, or a display or capture device. Video processing equipment 110 may capture high definition (HD) video data for processing at input 118. Video to be played back may be output from video processing equipment 110 at output 114. [0021]
  • Video router 120 may be a multiformat router (e.g., a router capable of directing both standard definition (SD) and HD video data) that serves as an interface for the HD video data traveling between the video processing equipment 110 and the editing system 130 through inputs and outputs 114, 118, 124 and 128. Router 120 may also be two separate routers—one for HD video data and one for SD video data. Router 120 may be a cross-point switch such as the HDS-V3232 by Sony Corporation. Router 120 is configurable by the editing system 130 based on the editing to be performed. [0022]
  • A configuration control signal 134 may be sent by editing system 130 to router 120 and video processing equipment 110 to configure those devices according to the type of editing to be performed and the amount of HD video data to be edited. [0023]
  • The editing system 130 is a non-linear editor including a random-access, computer-readable and re-writeable storage medium that stores a sequence of digital still images. Each still image may represent a single frame, i.e., two fields, or a single field of motion video data. The editing system 130 may allow any particular image in the sequence of still images to be randomly accessed for playback. The images may include uncompressed video data; however, since digital data representing motion video may consume large amounts of computer memory, the digital data typically is compressed to reduce storage requirements. [0024]
  • Various types of compression may be used. Some kinds of compression may operate on a stream of data regardless of how the data may be divided to define an image. One kind of compression is called “intraframe” compression which involves compressing the data representing each still image independently of other still images. Commonly-used motion video compression schemes using intraframe compression include “motion-JPEG” and “I-frame only” MPEG. Intraframe compression allows purely non-linear access to any image in the sequence. [0025]
  • More compression can be obtained for motion video sequences by using what is commonly called “interframe” compression. Interframe compression involves predicting one image using another. This kind of compression often is used in combination with intraframe compression. Several standards use interframe compression techniques, such as MPEG-1 (ISO/IEC 11172-1 through 5), MPEG-2 (ISO/IEC 13818-1 through 9) and H.261, an International Telecommunications Union (ITU) standard. MPEG-2, for example, compresses some images using intraframe compression (called I-frames or key frames), and other images using interframe compression techniques, for example by computing predictive errors between images. The predictive errors may be computed for forward prediction (called P-frames) or bidirectional prediction (called B-frames). MPEG-2 is designed to provide broadcast quality full motion video. Interframe compression does not allow purely non-linear access to every image in the sequence, because an image may depend on either previous or following images in the sequence. The invention is not limited to a particular kind of compression and does not require compression. [0026]
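  • The difference in access that intraframe and interframe compression allow can be illustrated with a short sketch; the GOP pattern and function name below are hypothetical and are not taken from the patent. For an intraframe-only stream every image is its own entry point, while an interframe stream forces the decoder back to the nearest key frame.
```python
# Illustrative sketch: nearest decodable entry point for a frame index,
# given a repeating MPEG-2 style group-of-pictures (GOP) pattern.

def nearest_entry_point(frame_index, gop_pattern="IBBPBBPBBPBB"):
    """Return the frame index at which decoding must start.

    With intraframe-only compression (pattern "I"), every frame is an
    entry point, so access is purely non-linear.  With P/B frames the
    decoder must back up to the I-frame that opens the group.
    """
    gop_len = len(gop_pattern)
    offset = frame_index % gop_len
    while gop_pattern[offset] != "I":   # walk back to the key frame
        offset -= 1
    return frame_index - (frame_index % gop_len) + offset

if __name__ == "__main__":
    print(nearest_entry_point(7, "I"))             # motion-JPEG style -> 7
    print(nearest_entry_point(7, "IBBPBBPBBPBB"))  # MPEG-2 style GOP  -> 0
```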
  • There are several kinds of systems that may be used to author, process and display multimedia data. These systems may be used to modify the data, define different combinations of data, create new data and display data to a user. A variety of techniques are known in the art for implementing these kinds of systems. [0027]
  • Multimedia authoring, processing and playback systems typically have a data structure which represents the multimedia composition. The data structure ultimately refers to clips of source material, such as digitized video or audio, using an identifier of the source material, such as a unique identifier or a file name, and possibly a temporal range within the source material defining the clip. The identifier may be of a type that may be used with a list of equivalent data files to identify a file name for the source material. An index may be used to translate the temporal range in the source into a range of bytes within a corresponding file. This range of bytes may be used with the segment table for the file to identify segments of data that are needed and the storage units from which the data is retrieved. [0028]
  • FIG. 1b shows an example list structure that may be used to represent part of a multimedia composition that may be created by editing system 130. In an example shown in FIG. 1b, there are several clips 560, each of which includes a reference to a source identifier, indicated at 562, and a range within the source, as indicated at 564. Generally, there may be such a list for each track of media in a temporal composition. There are a variety of data structures which may be used to represent a composition. In addition to a list structure, a more complex structure is shown in PCT Published Application WO93/21636 published on Oct. 28, 1993. Other example representations of multimedia compositions include those defined by Open Media Framework Interchange Specification from Avid Technology, Inc., Advanced Authoring Format from the Multimedia Task Force, QuickTime from Apple Computer, DirectShow from Microsoft, and Bento also from Apple Computer, and as shown in PCT Publication WO96/26600. [0029]
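  • A minimal sketch of the kind of list structure FIG. 1b describes is given below; the class and field names are hypothetical and are not drawn from the patent or from any of the cited formats. Each clip pairs a source identifier (562) with a temporal range within the source (564), and a separate table of equivalent data files maps each identifier to the file names that hold the media.
```python
# Illustrative sketch (hypothetical names): a per-track clip list as in
# FIG. 1b, plus a list of equivalent media files for each source.
from dataclasses import dataclass

@dataclass
class Clip:
    source_id: str     # unique identifier of the source material (562)
    start_frame: int   # start of the temporal range within the source (564)
    length: int        # length of the range, in frames

# One such list may exist for each track of media in a temporal composition.
video_track = [
    Clip("tape_042", start_frame=120, length=90),
    Clip("tape_017", start_frame=0, length=45),
]

# Equivalent data files keyed by source identifier.
equivalent_files = {
    "tape_042": ["vol1/tape_042_hd.mjpeg", "vol2/tape_042_hd.mjpeg"],
    "tape_017": ["vol1/tape_017_hd.mjpeg"],
}

def resolve(clip):
    """Translate a clip into a file name and a frame range within that file."""
    file_name = equivalent_files[clip.source_id][0]  # any equivalent copy
    return file_name, clip.start_frame, clip.start_frame + clip.length

if __name__ == "__main__":
    for clip in video_track:
        print(resolve(clip))
```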
  • The data structure described above and used to represent multimedia programs may use multiple types of data that are synchronized and displayed. The most common example is a television program or film production which includes motion video (often two or more streams or tracks) with associated audio (often four or more streams or tracks). [0030]
  • Because the video and audio data may be stored in different data files and may be combined arbitrarily, better performance may be obtained if requests for data for these different data files are managed efficiently. For example, an application may identify a stream for which data can be read, and then may determine an amount of data which should be read, if any. A process for performing this kind of management of read operations is shown in U.S. Pat. No. 5,045,940. In general, the application determines which stream has the least amount of data available for display. If there is sufficient memory available for that stream to efficiently read an amount of data, then that data is read from the file. When it is determined that data for a stream should be requested, each segment of the data is requested from a storage unit selected from those on which the segment is stored. In order to identify which files to request from the storage unit, the editing system may convert a data structure representing a composition, such as shown in FIG. 1b, into file names and ranges within those files. [0031]
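  • The read-management policy summarized above (serve the stream with the least data available for display, and only when a read of useful size can be made) might look like the following; the threshold and batch size are hypothetical, and the sketch is not taken from U.S. Pat. No. 5,045,940.
```python
# Illustrative sketch (hypothetical values): pick the next stream to read.

def next_read(buffered, batch=32, low_water=32):
    """buffered maps stream name -> frames already available for display.

    Returns (stream, frames_to_read), or None when every stream already
    has enough data buffered and no efficient read is needed yet.
    """
    stream = min(buffered, key=buffered.get)  # least data available for display
    if buffered[stream] <= low_water:         # worth issuing an efficient read
        return stream, batch
    return None

if __name__ == "__main__":
    # Two video tracks and four audio tracks, as in a typical program.
    state = {"video1": 10, "video2": 40,
             "audio1": 55, "audio2": 55, "audio3": 55, "audio4": 55}
    print(next_read(state))  # -> ('video1', 32)
```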
  • Editing system 130 may use various audio and video media files stored on a storage system to create a composition. Editing system 130 may be capable of handling one or more tracks of audio/video information, and may be capable of performing editing functions such as dissolves, wipes, flips, flops, and other functions known in the art of video production. Media files are typically created by a digitizing system (not shown) that receives one or more audio/video inputs from a media player (not shown). These media files may also be digitized directly by a digital recorder (not shown). Editing system 130 may also use interactive elements in creating a composition. [0032]
  • A commercial editing system 130 may be used, such as the Media Composer video production system or NewsCutter news editing system available from Avid Technology, Inc. (NewsCutter is a registered trademark of Avid Technologies, Inc.). Also, a commercial playback system suitable for implementing the present invention may be used that implements the Media Engine video playback system available from Avid Technology, Inc. that is incorporated in the Avid AirPlay MP playback server system (AirPlay is a registered trademark of Avid Technology, Inc.). A commercial storage system (not shown) suitable for storing composition files includes the MediaShare external storage device (MediaShare is a trademark of Avid Technology, Inc.). Other commercial systems may be used. [0033]
  • FIG. 2 is a more detailed block diagram of a system for editing high definition video data such as the one shown in FIG. 1a. A non-linear editor is shown as computer 210 and non-linear storage system 205. In non-linear systems, the need for multiple copies of video sources to produce arbitrary sequences of segments has been avoided by the random-access nature of the media. Arbitrary sequences of segments from multiple data files are provided by pipelining and buffering non-linear accesses to the motion video data. Storage system 205 stores HD video data in compressed format as media files, although the HD video data may also be in uncompressed format. Another example of an editing system may be found in U.S. Patent Application entitled “HDTV EDITING AND PREVISUALIZATION USING SDTV DEVICES” by Craig R. Frink et al. filed Apr. 3, 1998. [0034]
  • Computer 210 includes a serial digital interface (SDI) and a high definition serial digital interface (HD-SDI). The SDI and HD-SDI interfaces provide video interconnections to router 120. The SDI cards may be, for example, the Dynamo VideoPump card by Viewgraphics, Inc., or an SDI card by Gennum. From the point of view of the non-linear editor, the SDI is a video input and output device. The SDI cards can transfer multiple streams of HD video data concurrently and in real time to the storage system 205. [0035]
  • The HD-SDI cards may be any interface card that can capture an HD video stream at a rate in the range of 54 million components/second (480 progressive) to 148.5 million components/second (1080 interlaced), with 8- or 10-bit components, and convert the HD video stream to peripheral connection interface (PCI) type bus data. A 64-bit/33 MHz PCI bus or a 32-bit/66 MHz PCI bus may be capable of transferring HD data in real time, thereby minimizing the buffer size requirements. Each of the HD-SDI cards has a buffer capable of capturing a number of high definition (HD) video frames which may later be transferred for processing or storage. The cards may be expandable to accommodate additional codec or video effects generator equipment which include more inputs and outputs of HD video data. One of ordinary skill in the art may develop an HD-SDI card based on known products by Sony and Panasonic which include both SDI and HD-SDI interfaces, from known HD products, or from technology used for SDI cards by Viewgraphics and Gennum. The SDI and HD-SDI cards provide a video interconnection between computer 210 and routers 215 and 220. The video interconnection between the SDI cards and router 215 allows compressed HD video data representing an image to be edited to be carried in packet form between the non-linear editor 210 and HD codecs 230 and 240. The video data transferred by the SDI is defined using markers signifying the Start of Active Video (SAV) and End of Active Video (EAV) to delineate a field of HD video data. The video interconnection between the HD-SDI cards and router 220 allows a continuous, uncompressed HD video data stream to be carried between the HD-SDI cards and router 220. [0036]
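  • The bus figures quoted above can be checked with a few lines of arithmetic; the sketch below uses theoretical peak PCI bandwidth, and sustained throughput on a real bus is lower, so it is a plausibility check rather than a guarantee of real-time transfer.
```python
# Illustrative check: does peak PCI bandwidth exceed an HD component stream?

def pci_peak_bits_per_sec(bus_width_bits, clock_hz):
    return bus_width_bits * clock_hz

def hd_stream_bits_per_sec(components_per_sec, bits_per_component):
    return components_per_sec * bits_per_component

if __name__ == "__main__":
    pci_64_33 = pci_peak_bits_per_sec(64, 33e6)     # ~2.11 Gbit/s peak
    pci_32_66 = pci_peak_bits_per_sec(32, 66e6)     # ~2.11 Gbit/s peak

    hd_1080i = hd_stream_bits_per_sec(148.5e6, 10)  # ~1.49 Gbit/s
    hd_480p = hd_stream_bits_per_sec(54e6, 10)      # ~0.54 Gbit/s

    for name, rate in [("1080i, 10-bit", hd_1080i), ("480p, 10-bit", hd_480p)]:
        print(name, "fits within peak 64-bit/33 MHz PCI:", rate < pci_64_33)
```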
  • A computer interconnection between the interface cards and the routers may also be used instead of a video interconnection. The computer interconnection assigns an address to each device in the system and uses control information, identifying the start of a frame of HD video data and the number of lines being sent, to coordinate the transfer of the HD video data between the devices. When using a computer interconnection, the non-linear editor is responsible for identifying each device and its address in the system. However, when the video interconnection is used, the non-linear editor is responsible only for providing an output HD video data stream. Therefore, the devices which receive or send the HD video data stream, as well as other devices in the system, are transparent to the non-linear editor. [0037]
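  • The control information described for the computer interconnection (a per-device address, an indication of the start of a frame, and the number of lines being sent) could be carried in a small transfer header; the field layout below is entirely hypothetical and is shown only to make the description concrete.
```python
# Illustrative sketch (hypothetical layout): control header for moving HD
# video over a computer interconnection that addresses each device.
import struct

# device address (uint16), start-of-frame flag (uint8), line count (uint16)
HEADER_FORMAT = ">HBH"

def pack_header(device_address, start_of_frame, line_count):
    return struct.pack(HEADER_FORMAT, device_address, int(start_of_frame), line_count)

def unpack_header(data):
    device_address, flag, line_count = struct.unpack(HEADER_FORMAT, data)
    return device_address, bool(flag), line_count

if __name__ == "__main__":
    header = pack_header(device_address=0x0230, start_of_frame=True, line_count=1080)
    print(unpack_header(header))  # -> (560, True, 1080)
```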
  • Router 215 transfers compressed high definition video data between computer 210 and video processing equipment which may include video effects generator 245 and high definition codec 230 or high definition codec 240. Router 215 may be connected to an input/output port 225 for receiving and transferring compressed HD video data. Router 215 may also be connected to an external videotape recorder (VTR) 235, such as a D-5 VTR from Panasonic, to store HD video data. Router 215 may be used to transfer compressed HD data and may be, for example, a Society of Motion Picture and Television Engineers (SMPTE) standard SMPTE-259 router, such as a DVS-V1616 by Sony or a VIA 32 series router such as VIA 16×16s by Leitch. The input and the output side of router 215 may be configurable, for example, by increments of four channels. [0038]
  • Router 220 directs uncompressed high definition video data between computer 210 and video effects generator 245. Router 220 may be an SMPTE-292 router for transferring uncompressed HD video data. Router 220 is also connected to HD video effects generator 245 which operates on real-time video streams through input/output port 270 used for transferring digital high definition video signals to or from external equipment. Video effects generator 245 may be, for example, a product developed by Sony, Grass Valley or Abekas. Router 220 is connected to high definition digital to analog (D/A) converter 250 which provides an output to high definition video monitor 275 or to analog high definition output 260. The HD video monitor may be, for example, Sony's HDM2830 or Panasonic's AT-H3215W plus a digital to analog converter such as Panasonic Model AJ-HDA500. Router 220 includes an input 244 from high definition analog to digital (A/D) converter 255 which receives an analog high definition input 265. [0039]
  • Both routers 215 and 220 are configurable by the non-linear editor, which specifies the type of editing to be performed and the amount of HD data to be edited. The routers 215 and 220 transfer the HD video data based on the configuration control signal. The editing process switches between playback and storage of the edited HD video data (the processes are described below in connection with FIGS. 3 and 4), and the routers change their routing configuration based on a time code or a number of frames to be played back or stored, as indicated by the non-linear editor during the initial configuration. [0040]
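  • The initial configuration described above might be expressed as a small message from the non-linear editor; the fields below (a playback-or-store mode, a starting time code and a frame count) are assumptions chosen to mirror the paragraph and do not describe a documented control protocol.
```python
# Illustrative sketch (hypothetical fields): configuration the non-linear
# editor could send to routers 215 and 220 before an edit begins.
from dataclasses import dataclass

@dataclass
class RouterConfig:
    mode: str             # "playback" or "store"
    start_timecode: str   # time code at which the routing change takes effect
    frame_count: int      # number of frames to play back or store

def describe_routing(config):
    # A real router would switch cross-points; here we only describe the path.
    if config.mode == "playback":
        path = "codec -> effects generator -> HD-SDI capture"
    else:
        path = "effects generator -> codec -> SDI -> storage"
    return (f"route {path} for {config.frame_count} frames "
            f"starting at {config.start_timecode}")

if __name__ == "__main__":
    print(describe_routing(RouterConfig("playback", "01:00:10:00", 300)))
```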
  • HD codecs 230 and 240 may be, for example, the Digital HD VTR Processor by Panasonic, part number AJ-HDP500P. The codecs compress and decompress the HD video data. The operation of FIG. 2 will now be described in connection with FIGS. 3 and 4. [0041]
  • FIG. 3 is a flowchart of the process of editing HDTV video data and playing back the edited HD video data. By defining a sequence of clips of video data, high definition video data is retrieved by non-linear editor 210 from a storage system 205, step 305. The HD video data may be retrieved as a single field or frame of HD video data or as a linear or non-linear sequence of video clips. The storage system 205 may be non-linear and may allow random non-linear access of HD video data. [0042]
  • The retrieved HD video data is transferred by the SDI card to router 215 and is sent to codec 230 or codec 240, step 310. In a single stream system only codec 230 is used. Codec 230 receives the retrieved HD video data, which is in the format of a compressed data file representing the HD video image, and decompresses the HD data file to video data format. [0043]
  • In step 315, if the effect to be applied is determined to be an A/B effect (e.g., a wipe, fade, etc.), then the uncompressed video is sent through router 220 to video effects generator 245, step 320. If the effect is not an A/B effect, and is, for example, an A/B/C effect (e.g., a ripple, page curl, etc.), the uncompressed video is sent to an HD-SDI interface card, step 325, where it is buffered while the system waits for another clip of HD video data to be used in creating the effects. After the streams of HD video data for the A/B/C effect are buffered, the HD video data is sent to the video effects generator 245 in step 330 for processing. [0044]
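  • The branch taken in steps 315 through 330 can be summarized in a few lines; the effect names and the reduction of buffering to a simple clip count are hypothetical simplifications of the flowchart, not an implementation of it.
```python
# Illustrative sketch (hypothetical names): routing decision of FIG. 3.
A_B_EFFECTS = {"wipe", "fade"}           # two-source effects (step 320)
A_B_C_EFFECTS = {"ripple", "page curl"}  # effects needing a third stream

def route_for_effect(effect, decompressed_clips):
    if effect in A_B_EFFECTS:
        # Step 320: send the uncompressed video straight through router 220.
        return ("video effects generator", decompressed_clips)
    # Steps 325-330: buffer at the HD-SDI card until every clip has arrived,
    # then send the buffered streams to the effects generator together.
    if len(decompressed_clips) < 3:
        return ("HD-SDI buffer", decompressed_clips)
    return ("video effects generator", decompressed_clips)

if __name__ == "__main__":
    print(route_for_effect("fade", ["clip A", "clip B"]))
    print(route_for_effect("ripple", ["clip A", "clip B"]))            # still buffering
    print(route_for_effect("ripple", ["clip A", "clip B", "clip C"]))  # ready
```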
  • The HD video data output of the video effects generator 245, which includes the added effects, is captured by the HD-SDI card in the non-linear editor 210. The non-linear editor 210 may also be used to edit the HD video data. The HD video data output of the video effects generator 245 may be sent to HD video monitor 275, or it may be transferred as an analog 260 or digital 270 HD output. The process of storing the generated HD video is described below. [0045]
  • FIG. 4 is a flowchart of the process of storing an output of the video effects generator 245 to disk. In step 405, the rendered video from the video effects generator 245 is sent to the HD-SDI card where it is buffered. [0046]
  • If there are more effects to be added in step 415, then additional video is sent to the video effects generator 245 with the rendered video to allow more complex editing, in step 410. After all of the desired effects have been added to the HD video data in step 415, the HD video data is sent through a codec, such as codec 230, in step 420, where the HD video data with the effect is transformed to a compressed format. In step 421, the compressed HD video data is next transferred to an SDI card in the non-linear editor 210. The edited HD video data is transferred to storage system 205 in step 422. A composition which has been edited and stored according to the process in FIG. 4 can be played back using the process in FIG. 3, such that the data file including the edits is played back, rather than separate sources of HD video data. [0047]
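  • Steps 405 through 422 amount to a loop that returns rendered video to the effects generator until no effects remain, then compresses the result and transfers it to storage; the sketch below uses hypothetical stand-ins for the effects generator, codec 230 and the SDI transfer to storage system 205.
```python
# Illustrative sketch (hypothetical stand-ins) of the FIG. 4 storage loop.

def store_rendered_video(rendered, pending_effects, apply_effect, compress, store):
    """rendered: current, buffered output of the effects generator (step 405)."""
    while pending_effects:                         # step 415: more effects to add?
        effect = pending_effects.pop(0)
        rendered = apply_effect(rendered, effect)  # step 410: re-render with effect
    compressed = compress(rendered)                # step 420: codec compresses
    store(compressed)                              # steps 421-422: SDI -> storage

if __name__ == "__main__":
    stored = []
    store_rendered_video(
        rendered="base clip",
        pending_effects=["title", "color correction"],
        apply_effect=lambda video, effect: f"{video}+{effect}",
        compress=lambda video: f"compressed({video})",
        store=stored.append,
    )
    print(stored)  # -> ['compressed(base clip+title+color correction)']
```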
  • Having now described a few embodiments, it should be apparent to those skilled in the art that the foregoing is merely illustrative and not limiting, having been presented by way of example only. Numerous modifications and other embodiments are within the scope of one of ordinary skill in the art and are contemplated as falling within the scope of the invention. [0048]

Claims (17)

What is claimed is:
1. A system for editing high definition television resolution video data comprising:
a non-linear editor including a random-access, computer-readable and re-writeable storage medium that stores a plurality of sequences of digital still images representing high definition video data in media files, wherein the non-linear editor provides a configuration control signal that defines a video program to be edited using the stored high definition digital still images, and wherein the non-linear editor includes an input serial digital interface and an output serial digital interface to provide the high definition video data to be edited;
a multiformat video router for directing the high definition video data between the nonlinear editor and video processing equipment, wherein the router is connected by a video interconnect to the input serial interface and the output serial interface of the non-linear editor, wherein the router receives the configuration control signal from the non-linear editor to initialize the configuration of the router, wherein the router is connected by a video interconnect to provide video data to an input of video processing equipment, and wherein the router is connected by a video interconnect to receive a video data output of the video processing equipment; and
video processing equipment having an input for receiving high definition video data to be edited from the multiformat router and an output for sending edited high definition video data to the multiformat router, and having an input for receiving the configuration control signal from the non-linear editor to determine processing to be performed on the received high definition video data.
2. The system of claim 1, wherein the non-linear editor further includes at least one input high definition serial digital interface and at least one output high definition serial digital interface.
3. The system of claim 1, wherein the image data represents a single frame of motion high definition video data.
4. The system of claim 1, wherein the image data represents a single field of motion high definition video data.
5. The system of claim 1, wherein the image is uncompressed video data.
6. The system of claim 1, wherein the image is compressed video data.
7. The system of claim 1, wherein the video processing equipment includes at least one high definition coder/decoder for formatting the high definition video data and a video effects generator for editing the high definition video data.
8. The system of claim 1, wherein the multiformat router includes a standard definition router and a high definition router.
9. The system of claim 1, wherein the output of the video processing equipment is transferred to a high definition video monitor.
10. A method for storing edited high definition television resolution video data comprising the steps of:
receiving edited high definition video data from a video effects generator;
compressing the edited high definition video data into a high definition video data file which includes high definition video data information; and
sending the compressed high definition video data file through a serial digital interface to a non-linear storage system.
11. The method of claim 10, further including the step of:
playing back the stored high definition video data file which includes the high definition video data information from at least two streams of edited high definition video data.
12. A method for storing edited high definition television resolution video data comprising the steps of:
receiving edited, uncompressed high definition video data from a video effects generator; and
sending the uncompressed high definition video data through a high definition serial digital interface to a random-access, computer-readable and re-writeable storage medium that stores a plurality of sequences of digital still images representing high definition video data in media files.
13. A method for playing back high definition video data comprising the steps of:
retrieving high definition video data from storage;
transferring the retrieved high definition video data through a serial digital interface;
decompressing the retrieved high definition video data;
sending the decompressed high definition video data to a video effects generator for processing; and
providing processed high definition video data as an output of the video effects generator.
14. The method of claim 13, wherein the step of sending further includes:
buffering the high definition video data at a high definition serial digital interface; and
providing the buffered high definition video data to the video effects generator.
15. The method of claim 13, wherein the step of providing an output further includes capturing the output of the video effects generator at a high definition serial digital interface.
16. The method of claim 13, wherein the step of providing an output further includes storing the output of the video effects generator as a data file.
17. A method for editing high definition television resolution video data comprising the steps of:
storing a plurality of sequences of high definition digital still images;
defining a video program to be rendered using selected stored high definition digital still images;
configuring devices for processing the selected high definition still images, wherein the devices include video processing equipment;
transferring the selected high definition still images over a video interconnection to the video processing equipment for processing; and
rendering the processed high definition still images.
US09/800,867 1998-04-03 2001-03-07 Editing system with router for connection to HDTV circuitry Abandoned US20010009446A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US09/800,867 US20010009446A1 (en) 1998-04-03 2001-03-07 Editing system with router for connection to HDTV circuitry
US10/375,599 US7046251B2 (en) 2001-03-07 2003-02-27 Editing system with router for connection to HDTV circuitry

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US09/055,048 US6229576B1 (en) 1998-04-03 1998-04-03 Editing system with router for connection to HDTV circuitry
US09/800,867 US20010009446A1 (en) 1998-04-03 2001-03-07 Editing system with router for connection to HDTV circuitry

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US09/055,048 Continuation US6229576B1 (en) 1998-04-03 1998-04-03 Editing system with router for connection to HDTV circuitry

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US10/375,599 Continuation US7046251B2 (en) 2001-03-07 2003-02-27 Editing system with router for connection to HDTV circuitry

Publications (1)

Publication Number Publication Date
US20010009446A1 true US20010009446A1 (en) 2001-07-26

Family

ID=21995232

Family Applications (2)

Application Number Title Priority Date Filing Date
US09/055,048 Expired - Fee Related US6229576B1 (en) 1998-04-03 1998-04-03 Editing system with router for connection to HDTV circuitry
US09/800,867 Abandoned US20010009446A1 (en) 1998-04-03 2001-03-07 Editing system with router for connection to HDTV circuitry

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US09/055,048 Expired - Fee Related US6229576B1 (en) 1998-04-03 1998-04-03 Editing system with router for connection to HDTV circuitry

Country Status (5)

Country Link
US (2) US6229576B1 (en)
EP (1) EP1068615A1 (en)
AU (1) AU2997199A (en)
CA (1) CA2326841A1 (en)
WO (1) WO1999052112A1 (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050047436A1 (en) * 2003-07-18 2005-03-03 Tadahiro Ohata Media converter
US20070236584A1 (en) * 2006-04-07 2007-10-11 Cinegest, Inc. Portable high capacity digital data storage device
US20090256863A1 (en) * 2008-04-09 2009-10-15 Harris Corporation, Corporation Of The State Of Delaware Video multiviewer system with serial digital interface and related methods

Families Citing this family (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6226038B1 (en) 1998-04-03 2001-05-01 Avid Technology, Inc. HDTV editing and effects previsualization using SDTV devices
JP2000251394A (en) * 1999-02-26 2000-09-14 Matsushita Electric Ind Co Ltd Video/audio data converting device and disk device using it
US6847373B1 (en) 1999-04-16 2005-01-25 Avid Technology, Inc. Natural color matching in a video editing system
US6571255B1 (en) * 1999-04-16 2003-05-27 Robert Gonsalves Modification of media with common attributes on a digital nonlinear editing system
US7046251B2 (en) * 2001-03-07 2006-05-16 Avid Technology, Inc. Editing system with router for connection to HDTV circuitry
US7949777B2 (en) 2002-11-01 2011-05-24 Avid Technology, Inc. Communication protocol for controlling transfer of temporal data over a bus between devices in synchronization with a periodic reference signal
JP4222869B2 (en) 2002-12-10 2009-02-12 株式会社ソニー・コンピュータエンタテインメント Image playback device
US7375768B2 (en) * 2004-08-24 2008-05-20 Magix Ag System and method for automatic creation of device specific high definition material
US8041190B2 (en) 2004-12-15 2011-10-18 Sony Corporation System and method for the creation, synchronization and delivery of alternate content
US7895617B2 (en) 2004-12-15 2011-02-22 Sony Corporation Content substitution editor
US8185921B2 (en) 2006-02-28 2012-05-22 Sony Corporation Parental control of displayed content using closed captioning
DE602006012383D1 (en) * 2006-09-25 2010-04-01 Siemens Ag ROUTING DEVICE FOR A LOWER SEA ELECTRONIC MODULE
DE602006014603D1 (en) * 2006-09-25 2010-07-08 Siemens Ag ROUTING DEVICE FOR A LOWER SEA ELECTRONIC MODULE
US8396018B2 (en) 2006-12-04 2013-03-12 Samsung Electronics Co., Ltd. System and method for wireless communication of uncompressed video having beacon design
US8102835B2 (en) * 2006-12-04 2012-01-24 Samsung Electronics Co., Ltd. System and method for wireless communication of uncompressed video having a beacon length indication
WO2008069803A1 (en) 2006-12-08 2008-06-12 Thomson Licensing Identification of video signals in a video system
US10709315B2 (en) 2017-04-28 2020-07-14 Hoya Corporation Apparatuses and methods for endoscopic connection

Family Cites Families (26)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR2600479A1 (en) 1986-06-23 1987-12-24 Niles David Method of editing and cutting high-definition images
JPH0391384A (en) 1989-09-01 1991-04-16 Sharp Corp Edit device for high precision video signal
US5057911A (en) 1989-10-19 1991-10-15 Matsushita Electric Industrial Co., Ltd. System and method for conversion of digital video signals
US5218672A (en) 1990-01-19 1993-06-08 Sony Corporation Of America Offline editing system with user interface for controlling edit list generation
US5412773A (en) 1991-11-19 1995-05-02 Sony Electronics Inc. Computerized interactive menu-driven video signal processing apparatus and method
WO1993021636A1 (en) 1992-04-10 1993-10-28 Avid Technology, Inc. A method and apparatus for representing and editing multimedia compositions
JP2689823B2 (en) 1992-07-21 1997-12-10 松下電器産業株式会社 Image signal reproducing device and disc device
US5459585A (en) 1992-09-09 1995-10-17 Hitachi, Ltd. Apparatus and method of storing image signals
US5367341A (en) * 1992-10-20 1994-11-22 Canon Information Systems, Inc. Digital video editor having lost video frame protection
CA2115976C (en) 1993-02-23 2002-08-06 Saiprasad V. Naimpally Digital high definition television video recorder with trick-play features
KR960015397B1 (en) 1993-03-17 1996-11-11 엘지전자 주식회사 Hdtv signal converting circuit employing side-cut mode and letter box mode
US5450140A (en) 1993-04-21 1995-09-12 Washino; Kinya Personal-computer-based video production system
JPH0787526A (en) 1993-09-16 1995-03-31 Sony Corp Sampling rate conversion system
US5577042A (en) 1994-01-18 1996-11-19 Mcgraw Broadcast Broadcast and presentation system and method
GB2287845B (en) 1994-03-18 1998-03-25 Sony Uk Ltd Multichannel video data storage
JPH08205082A (en) 1995-01-25 1996-08-09 Sony Corp Data processing unit
EP0811290B1 (en) 1995-02-23 2002-09-18 Avid Technology, Inc. Combined editing system and digital moving picture recording system
EP0777383B1 (en) 1995-06-19 2009-12-09 Sony Corporation Data communication device
GB2307128B (en) 1995-11-09 2000-01-26 Sony Uk Ltd Controlling video down-conversion
JP3493872B2 (en) * 1996-02-29 2004-02-03 ソニー株式会社 Image data processing method and apparatus
US5861864A (en) 1996-04-02 1999-01-19 Hewlett-Packard Company Video interface system and method
GB2312319B (en) 1996-04-15 1998-12-09 Discreet Logic Inc Video storage
JP4462639B2 (en) 1996-04-17 2010-05-12 ソニー株式会社 Matrix switcher and display method of source signal name in matrix switcher
US5872565A (en) 1996-11-26 1999-02-16 Play, Inc. Real-time video processing system
US5963262A (en) 1997-06-30 1999-10-05 Cirrus Logic, Inc. System and method for scaling images and reducing flicker in interlaced television images converted from non-interlaced computer graphics data
US6226038B1 (en) 1998-04-03 2001-05-01 Avid Technology, Inc. HDTV editing and effects previsualization using SDTV devices

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050047436A1 (en) * 2003-07-18 2005-03-03 Tadahiro Ohata Media converter
US20080291939A1 (en) * 2003-07-18 2008-11-27 Tadahiro Ohata Media converter
US7460557B2 (en) * 2003-07-18 2008-12-02 Sony Corporation Media converter
US7864801B2 (en) 2003-07-18 2011-01-04 Sony Corporation Media converter
US20070236584A1 (en) * 2006-04-07 2007-10-11 Cinegest, Inc. Portable high capacity digital data storage device
US8170402B2 (en) * 2006-04-07 2012-05-01 Cinegest, Inc. Portable high capacity digital data storage device
US20090256863A1 (en) * 2008-04-09 2009-10-15 Harris Corporation, Corporation Of The State Of Delaware Video multiviewer system with serial digital interface and related methods
US8773469B2 (en) * 2008-04-09 2014-07-08 Imagine Communications Corp. Video multiviewer system with serial digital interface and related methods

Also Published As

Publication number Publication date
AU2997199A (en) 1999-10-25
EP1068615A1 (en) 2001-01-17
US6229576B1 (en) 2001-05-08
CA2326841A1 (en) 1999-10-14
WO1999052112A1 (en) 1999-10-14

Similar Documents

Publication Publication Date Title
US6229576B1 (en) Editing system with router for connection to HDTV circuitry
EP0972405B1 (en) Computer system and process for capture, editing and playback of motion video compressed using interframe and intraframe techniques
JP3907947B2 (en) HDTV editing and pre-visualization of effects using SDTV devices
JP5112287B2 (en) Method and system for providing distributed editing and storage of digital media over a network
JP3883579B2 (en) Multimedia system with improved data management mechanism
US6057832A (en) Method and apparatus for video-on-demand with fast play capability
US5577191A (en) System and method for digital video editing and publishing, using intraframe-only video data in intermediate steps
US20020122656A1 (en) Method and apparatus for recording broadcast data
JPH0991463A (en) Image edit device
US9020042B2 (en) Audio/video speedup system and method in a server-client streaming architecture
US7046251B2 (en) Editing system with router for connection to HDTV circuitry
US20210195261A1 (en) High-quality, reduced data rate streaming video production and monitoring system
JP3741299B2 (en) Video signal processing apparatus and video signal processing method
US20060233533A1 (en) Information recording/reproducing system, information recording/reproducing apparatus and information recording/reproducing method
EP0796013B1 (en) Video image processing apparatus and the method of the same
JPH08154230A (en) Method for storing moving image coded data on medium
WO2004088984A1 (en) Video data storage and retrieval system and method with resolution conversion
JP2000156840A (en) Method and device for synchronizing data of plural formats
JP3900382B2 (en) Video signal processing apparatus and video signal processing method
JP3900384B2 (en) Video signal processing apparatus and video signal processing method
WO2001019082A1 (en) Converting non-temporal based compressed image data to temporal based compressed image data
JP3045066B2 (en) Compressed video editing device
JP3919033B2 (en) Video / audio processor
EP1534005A2 (en) Method and apparatus for recording broadcast data
Crooks Considerations for moving to a video server-based facility

Legal Events

Date Code Title Description
STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION