EP1188318A1 - System and method for providing an enhanced digital video file - Google Patents
System and method for providing an enhanced digital video file
- Publication number
- EP1188318A1 (application EP00944619A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- file
- digital video
- video
- pixels
- streaming
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/60—Network structure or processes for video distribution between server and client or between remote clients; Control signalling between clients, server and network components; Transmission of management data between server and client, e.g. sending from server to client commands for recording incoming content stream; Communication details between server and client
- H04N21/63—Control signaling related to video distribution between client, server and network components; Network processes for video distribution between server and clients or between remote clients, e.g. transmitting basic layer and enhancement layers over different transmission paths, setting up a peer-to-peer communication via Internet between remote STB's; Communication protocols; Addressing
- H04N21/637—Control signals issued by the client directed to the server or network components
- H04N21/6377—Control signals issued by the client directed to the server or network components directed to server
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N1/00—Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
- H04N1/387—Composing, repositioning or otherwise geometrically modifying originals
- H04N1/3872—Repositioning or masking
- H04N1/3873—Repositioning or masking defined only by a limited number of coordinate points or parameters, e.g. corners, centre; for trimming
- H04N1/3875—Repositioning or masking defined only by a limited number of coordinate points or parameters, e.g. corners, centre; for trimming combined with enlarging or reducing
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/20—Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
- H04N21/23—Processing of content or additional data; Elementary server operations; Server middleware
- H04N21/236—Assembling of a multiplex stream, e.g. transport stream, by combining a video stream with other content or additional data, e.g. inserting a URL [Uniform Resource Locator] into a video stream, multiplexing software data into a video stream; Remultiplexing of multiplex streams; Insertion of stuffing bits into the multiplex stream, e.g. to obtain a constant bit-rate; Assembling of a packetised elementary stream
- H04N21/2365—Multiplexing of several video streams
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/20—Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
- H04N21/23—Processing of content or additional data; Elementary server operations; Server middleware
- H04N21/238—Interfacing the downstream path of the transmission network, e.g. adapting the transmission rate of a video stream to network bandwidth; Processing of multiplex streams
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/60—Network structure or processes for video distribution between server and client or between remote clients; Control signalling between clients, server and network components; Transmission of management data between server and client, e.g. sending from server to client commands for recording incoming content stream; Communication details between server and client
- H04N21/61—Network physical structure; Signal processing
- H04N21/6106—Network physical structure; Signal processing specially adapted to the downstream path of the transmission network
- H04N21/6125—Network physical structure; Signal processing specially adapted to the downstream path of the transmission network involving transmission via Internet
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/60—Network structure or processes for video distribution between server and client or between remote clients; Control signalling between clients, server and network components; Transmission of management data between server and client, e.g. sending from server to client commands for recording incoming content stream; Communication details between server and client
- H04N21/65—Transmission of management data between client and server
- H04N21/658—Transmission by the client directed to the server
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/60—Network structure or processes for video distribution between server and client or between remote clients; Control signalling between clients, server and network components; Transmission of management data between server and client, e.g. sending from server to client commands for recording incoming content stream; Communication details between server and client
- H04N21/65—Transmission of management data between client and server
- H04N21/658—Transmission by the client directed to the server
- H04N21/6587—Control parameters, e.g. trick play commands, viewpoint selection
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/16—Analogue secrecy systems; Analogue subscription systems
- H04N7/173—Analogue secrecy systems; Analogue subscription systems with two-way working, e.g. subscriber sending a programme selection signal
Definitions
- the present invention relates generally to video imaging. More specifically, the present invention relates to a system and method for providing high quality digital video files for streaming across a network.
- Streaming video is a technique by which video is played in real time as it is downloaded over the Internet, as opposed to storing it in a local file first.
- a video player decompresses and plays the data as it is transferred to a user computer over the World-Wide Web.
- Streaming video avoids the delay entailed in downloading an entire file and then playing it with a plug-in application.
- Streaming video requires a communications connection (e.g., a network, Internet, etc.) and a computer powerful enough to execute the decompression algorithm in real time.
- One teaching in the art is to reduce the number of frames per second that are being encoded, from the 25 to 30 fps of standard television to 6 or 7 fps or less for streaming video. While this reduces the amount of data that is being sent, the video appears jittery and corresponding voice appears asynchronous with the jittery video.
- Another teaching in the art is to capture the video at a small frame size of 160 x 120 or less. The small frame size of 160 x 120 is the widely used standard in Internet streaming video. Further teachings are directed to reducing the amount of data provided prior to compression, to reduce the resulting file size.
- a method of providing a streaming video file includes providing digital video data having a capture frame size of at least 69,300 pixels per frame and converting the digital video data to a streaming video file having a converted frame size of at least 69,300 pixels per frame.
- a method of providing a streaming video file includes providing digital video data having a capture frame rate of at least 24 frames per second and converting the digital video data to a streaming video file having a converted frame rate of at least 24 frames per second.
- a method of providing a streaming video file includes obtaining a source video signal having a predetermined source video parameter; capturing the source video signal while maintaining substantially the same source video parameter to provide a captured digital video file; and encoding the captured digital video file while maintaining substantially the same source video parameter to provide a streaming video file.
- a method of generating a streaming video file for streaming over the Internet includes providing digital video data having a capture frame size of at least 320 x 240 pixels; compressing the digital video data; encoding the digital video data into a streaming video file, wherein the streaming video file has a frame size of at least 320 x 240 pixels; and uploading the streaming video file to an Internet server.
- a system for providing a streaming video file includes means for providing digital video data having a capture frame size of at least 320 x 240 pixels and means for converting the digital video data to a streaming video file having a converted frame size of at least 320 x 240 pixels.
- a system for providing a streaming video file includes means for providing digital video data having a capture frame rate of at least 24 frames per second and means for converting the digital video data to a streaming video file having a converted frame rate of at least 24 frames per second.
- FIG. 1 is a block diagram of a system for generating an enhanced digital video file according to an exemplary embodiment
- FIG. 2 is a flowchart of a method for generating an enhanced digital video file according to the exemplary embodiment of FIG. 1;
- FIG. 3 is a block diagram of a system for playing a digital video file across a network.
- System 10 for generating an enhanced digital video file is shown.
- System 10 may be used as shown, or portions of system 10 may be integrated with other video processing systems, such as medical imaging equipment, motion picture production equipment, etc.
- System 10 generates a digital video file expandable to a full screen size and having a real video frame rate (i.e., life-like, smooth, not jerky, comparable with recorded video formats, such as, NTSC (National Television Standards Committee) at 29.97 frames per second (fps), PAL (Phase Alternative Line) at 25 fps, and SECAM (Sequentiel Couleur Avec Memoire) at 25 fps) with a file size that is suitable for streaming over the Internet, for such uses as high definition television, Web television, computers and servers utilized in wireless environments, etc.
- NTSC National Television Standards Committee
- PAL Phase Alternative Line
- SECAM Sequentiel Couleur Avec Memoire
- video is recorded having certain standard recorded video parameters, such as, frame rate, and number of lines scanned.
- a source conforming to the NTSC (National Television Standards Committee) standard operates at 29.97 frames per second (fps)
- a source conforming to the PAL (Phase Alternative Line) standard operates at 25 fps
- a source conforming to the SECAM (Sequentiel Couleur Avec Memoire) standard operates at 25 fps.
- the NTSC standard includes two interleaved frames at 240 lines scanned, while the PAL standard is 270 lines scanned. Note that the number of lines scanned corresponds to the number of vertical pixels in a standard 320 x 240 frame size compatible with standard capture cards, such as, a Dazzle LAV-1000S capture device manufactured by Dazzle, Inc. of Fremont, California.
- System 10 includes one or more sources, including recording devices 12 or playback device 25, a capture device 14, a computer 16, and a network server 18.
- Recording devices 12 include a camcorder 20, a digital video camera 22, and a reel-to-reel camera 24, each of which may be hand-held or mounted on a tripod or stand.
- System 10 may include a playback device 25 (e.g., tape player, VHS (Vertical Helix Scan) player, Beta player, DVD (Digital Versatile Disk) player, etc.).
- Camcorder 20 may be a VHS recorder, Beta recorder, or other camcorder, and is configured to store video on magnetic tape.
- Digital video camera 22 may be any type of digital video camera configured to generate video in a digital format.
- digital video camera 22 stores the digital video data to a tape.
- Digital video camera 22 is configured to provide digital video data in real time or via the tape in a digital format, such as, Beta digital, AVI, MOV, MPEG (Motion Picture Experts Group), or other format compatible with the IEEE 1394 standard, etc., to capture device 14.
- AVI is an audio/video standard designed by Microsoft Corp., Redmond, Washington.
- a digital video camera including 3CCD technology is used to record the video.
- the 3CCD technology (3- chip charge-coupled device) includes a dichroic prism and three CCDs, each CCD being aligned to detect only the red, green, or blue color.
- a 3CCD camera will provide enhanced color resolution.
- Reel-to-reel camera 24 includes recording equipment that uses magnetic tape which must be threaded through the equipment and onto an empty reel.
- a separate audio recording device, such as a microphone, may be utilized in conjunction with recording devices 12, in which embodiment recording devices 12 are used to record only video.
- Other recording devices may be used, such as, devices optimized for live videoconferencing.
- Computer 16 includes a processor, memory, magnetic storage device, input/output devices and circuitry, etc.
- Computer 16 may include multiple computers at multiple sites, with different portions of the process described hereinafter operating on different computers.
- Capture device 14 is coupled to one or more of sources 11.
- Capture device 14 is shown external to computer 16, but may alternatively be an internal capture device coupled within the housing of computer 16 or an internal capture device within the housing of one of recording devices 12 or playback device 25.
- a Dazzle LAV-1000S capture device is utilized, though other capture devices may be used, such as a Pinnacle DC10PLUS or Pinnacle DC30PRO device, both manufactured by Pinnacle Systems, Inc., Mountain View, California, or a MotoDV Mobile capture device, manufactured by Digital Origin, Inc., Mountain View, California.
- Capture software 26, such as Amigo 2.11, manufactured by Dazzle, Inc.
- capture device 14 is configured to receive a video signal from one of recording devices 12 or playback device 25, to digitize the video signal, and to store the video signal as a digital video file.
- the parameters of the video capture will be discussed below with reference to FIG. 2.
- the digital video file is an MPEG-1 file in this exemplary embodiment, but may alternatively be generated in other digital video formats, such as, MPEG-2, AVI, etc.
- Capture device 14 is a combined audio/video capture device, but may alternatively include discrete audio and video capture devices, the audio capture device configured to digitize any audio which corresponds to the video being captured by the video capture device. As a further alternative, the audio capture device may be utilized alone without a video capture device.
- the audio capture device may be, for example, a Montego II device, manufactured by Voyetra Turtle Beach, Inc., Yonkers, New York, and configured to generate a digital audio file in a digital audio format, such as, PCM (Pulse Code Modulation).
- Editing software 28 is operable on computer 16. In this exemplary embodiment, Adobe Premier 5.1 is utilized, though other video editing software may be used. Editing software 28 receives the captured digital video file and enables an operator to edit the digital video file by adding or deleting frames, adjusting the color, contrast, and brightness of the frames, etc. The edits are then saved to the digital video file or can be exported to AVI or MOV file types.
- Encoding software 30 is operable on computer 16. In this exemplary embodiment, RealProducer G2 is utilized, though other encoding software may be used. Encoding software 30 receives the edited digital video file and encodes the digital video file into an encoded format, such as, an RM format. Encoding software 30 may also compress the digital video file, if needed, to reduce the size of the digital video file, using a video compression algorithm, such as MPEG-1, MPEG-4, etc.
- a video compression algorithm such as MPEG-1, MPEG-4, etc.
- Markup software 32 is operable on computer 16.
- a hypertext markup language e.g., HTML, Dynamic HTML, Cold Fusion
- An operator marks up the encoded digital video file in HTML to prepare the digital video file for uploading to the network server 18.
- a code segment representing a full screen frame size, such as 640 x 480 pixels, is associated with the digital video file in the HTML code.
- the full screen frame size code segment may alternatively include other screen sizes, such as 800 x 600 pixels, 1024 x 768 pixels, 1280 x 1024 pixels, and 1600 x 1200 pixels.
- the full screen frame size code segment causes or enables a video player program, such as RealPlayer, manufactured by RealNetworks, Inc., to enlarge the streaming video to a full screen frame size, such as 640 x 480 pixels.
- a video player program such as RealPlayer, manufactured by RealNetworks, Inc.
- references herein to frame sizes in pixels are intended to include equivalent frame sizes thereto.
- a frame size of 320 x 240 pixels may include an additional number of unneeded pixels (e.g., which can be as much as 10% of the total pixels) attributed to overscan.
- one equivalent to a 320 x 240 pixel frame size is 304 x 228 pixels.
- the exact pixel count differs from the stated frame size.
- one equivalent to a 320 x 240 pixel frame size is 352 x 240.
- the uploading process utilizes uploading software 33, such as, a Web FTP (file transfer protocol) software (e.g., WS FTP PRO, manufactured by Ipswitch, Inc., Lexington, Massachusetts).
- the digital video file is uploaded to network server 18, which includes a computer configured to generate a web page on an internet-protocol network, such as the Internet or a company-wide intranet.
- a web page is a block of data written in a markup language, such as HTML, and any related files for scripts and graphics.
- Network server 18 may alternatively be coupled to a non-internet-protocol network, such as, an Ethernet, a local area network, a wide area network, a wireless network, etc.
- a user computer 34 may access the web page provided by network server 18 via a network, such as, the Internet.
- a user input device e.g., a web page button, hypertext link, etc.
- the HTML code launches a suitable video player program (e.g., RealPlayer) at user computer 34, activates the full screen frame size at user computer 34, and streams the video from the digital video file to user computer 34.
- the video player program may initially play the streaming video at a smaller frame size (e.g., 320 x 240), and the user may actuate a user input device on the video player to enlarge the streaming video to a full-screen size, such as 640 x 480.
- capture software 26, editing software 28, encoding software 30, markup software 32, and uploading software 33 may be operable on one computer or on different computers during different steps in the process.
- the encoded digital video file is stored directly to a storage device, such as, a compact disk, a digital video disk, a magnetic storage device, etc., for subsequent viewing on another computer, on a personal digital assistant (e.g., a Palm Pilot manufactured by Palm, Inc., Santa Clara, California), etc.
- digital video data is provided on a storage device (e.g., a floppy disk, a hard disk storage, etc.) which has been pre-captured.
- the pre-captured digital video data is provided in a compressed or uncompressed digital video format to encoding software 30 for subsequent processing.
- Method 50 is operable using one or more of the elements of system 10, as needed. While the steps of method 50 are explained with reference to captured video, it is understood that captured audio may be processed along with the captured video, or perhaps processed independently in a similar manner. As will be seen, the recorded video will be captured and encoded at near-optimal levels, as determined by the selected parameters in these processes, thereby preserving the highest quality video content. While exemplary values are presented herein for such parameters, it is understood that one of ordinary skill in the art will recognize other combinations of parameters based on these teachings.
- a customer provides pre-recorded video saved to a disk or other storage device.
- the method proceeds to step 58.
- video is recorded using one or more of recording devices 12 or playback device 25.
- the video is recorded into any suitable format, such as, VHS or Beta, and is played back using a television standard, such as, NTSC (National Television Standards Committee), PAL (Phase Alternative Line), SECAM (Sequentiel Couleur Avec Memoire), a digital format, such as, AVI, MOV, MPEG, a digital format compatible with the IEEE 1394 standard, or another format, etc.
- the video is captured by coupling one of recording devices 12 or playback device 25 to capture device 14, which is an external Dazzle LAV-1000 capture device in this exemplary embodiment, but may alternatively be an internal card or other capture devices, such as a Pinnacle DC10 device.
- Capture software is also utilized, such as, Amigo 2.11, Adobe Premier 5.1, or RealProducer G2.
- Capture device 14 and capture software 26 generate a digital video file based on the recorded video. If the recorded video is in an analog format, capture device 14 digitizes the analog video to create digital video data. If the recorded video is in a digital format, capture device 14 merely receives the digital video data and formats a file in the appropriate standard (e.g., AVI, MOV, MPEG-1, etc.).
- capture software 26 is set for real video capture, i.e., having a frame rate of a television or movie standard, such as, 29.97 frames per second.
- Real video capture may further have a frame rate of between 24 and 30 frames per second, or at least substantially more than the 6 to 9 frames per second conventionally used in streaming video applications. Further, the video is captured with at least approximately 76,800 pixels per frame (at least approximately 69,000 pixels taking into consideration overscan).
- the frame size of the video capture is at least 320 x 240 in this exemplary embodiment (at least 304 x 228 taking into consideration overscan), or at least more than the 160 x 120 used in conventional streaming video applications.
- Frame sizes of 480 x 320 and 640 x 480 may also be utilized in the video capture. However, particularly advantageous results are associated with the 320 x 240 capture frame size.
- a separate audio capture device is utilized in parallel with the video capture device.
- corresponding audio capture software is operable on computer 16 to digitize the audio into a digital audio format, such as PCM.
- the sampling rate is between 44 and 48 kilohertz (kHz); the bus size is 16-bit, allowing an audio resolution of 16 bits; and the audio is sampled in stereo.
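As a rough, editorial check on these settings (not patent text), the snippet below computes the uncompressed PCM data rate implied by 16-bit stereo sampling at 44-48 kHz.

```python
# Editorial check (not patent text): uncompressed PCM data rate implied by the
# audio capture settings above (16-bit samples, stereo, 44-48 kHz).
def pcm_data_rate_kbps(sample_rate_hz: int, bits_per_sample: int, channels: int) -> float:
    """Return the raw PCM bit rate in kilobits per second."""
    return sample_rate_hz * bits_per_sample * channels / 1000.0

for rate in (44_100, 48_000):
    # 44.1 kHz -> ~1411 kbps, 48 kHz -> ~1536 kbps: the raw audio alone exceeds
    # the 35-750 kbps streaming budget quoted later, so the audio is compressed
    # during encoding as well.
    print(f"{rate} Hz: {pcm_data_rate_kbps(rate, 16, 2):.0f} kbps uncompressed")
```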
- the captured video data may be stored as a data file in a storage device (e.g., a hard drive) or may be stored in memory and fed directly to an encoder.
- the captured video data may further be compressed, for example, to an MPEG-1 file before being saved to the storage device.
- the digital video file is edited using video editing software, such as, Adobe Premier 5.1.
- Adobe Premier 5.1 generates an output file in a MOV or AVI format, but may alternatively generate an output file in any digital video format.
- the edited digital video file may be stored in the storage device.
- Step 58 is optional but, if included, preferably Adobe Premier 5.1 maintains a frame size of at least 320 x 240 pixels and a real video frame rate.
- the edited digital video file is converted or encoded using a video encoding algorithm to create a streaming video file.
- the edited digital video file is first retrieved from the storage device (unless the digital video data is provided directly from capture device 14).
- the digital video file is encoded to a RealMedia format (i.e., RM) using a RealNetworks encoding algorithm.
- RM is an audiovisual file format proprietary to RealNetworks, Inc.
- Windows Media Encoder manufactured by Microsoft Corp.
- ASF Advanced Streaming Format
- ASX Advanced Streaming Format
- QuickTime manufactured by Apple Computer, Inc., Cupertino, California, may be utilized to encode the captured digital video file, for example, to an MOV format.
- Encoding may additionally include compression, if a smaller file size is desirable, as indicated by steps 62 and 64.
- the amount of compression may be selected by the operator using encoding software 30 or alternative compression software.
- the digital video file is encoded to have a data rate of between approximately 35 kbps (kilobits per second) and 750 kbps, and a frame rate of between approximately 24 fps (frames per second) and 30 fps (e.g., 29.97 fps).
- the number of pixels per frame is set to at least approximately 76,800 (again, at least approximately 69,000 pixels taking into consideration overscan) which, for a 4:3 aspect ratio, is 320 x 240 pixels (again, at least 304 x 228 pixels taking into consideration overscan), or at least more than the 160 x 120 pixels of conventional usage.
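To put these numbers in perspective, the short calculation below (an editorial illustration assuming 24-bit color for the raw frames, which the patent does not specify) estimates the compression the encoder must achieve to reach the quoted 35-750 kbps range from a 320 x 240, roughly 30 fps capture.

```python
# Editorial illustration (assumes 24-bit color for raw frames, which the patent
# does not specify): compression needed to carry a 320 x 240, 29.97 fps capture
# at the quoted 35-750 kbps streaming data rates.
def raw_video_kbps(width: int, height: int, fps: float, bits_per_pixel: int = 24) -> float:
    """Uncompressed video bit rate in kilobits per second."""
    return width * height * bits_per_pixel * fps / 1000.0

raw = raw_video_kbps(320, 240, 29.97)            # ~55,241 kbps uncompressed
for target_kbps in (35, 350, 750):
    print(f"{target_kbps:>3} kbps target -> ~{raw / target_kbps:,.0f}:1 compression")
```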
- editing, encoding, and compression are optional steps.
- the digital video file is marked up with a markup language, such as, HTML.
- a full screen frame size is associated with the digital video file.
- a full screen frame size is at least 640 x 480 pixels, and may also be 800 x 600 pixels, 1024 x 768 pixels, 1280 x 1024 pixels, 1600 x 1200 pixels, etc.
- the markup language associated with the digital video file includes a code segment that causes the digital video file to stream at the desired full screen frame size. While the markup language is used to associate the full screen frame size code segment with the digital video file in this exemplary embodiment, the full screen frame size code segment may be associated with the digital video file in another step of the method, such as the encode step 60, compression step 62, or another step.
- the digital video file is uploaded to an Internet web page using uploading software, such as, WS FTP PRO.
- a script e.g., an ASCII file (American Standard Code for Information Interchange)
- the script calls the video to stream in response to a user actuation from user computer 34.
- the script is written in a RAM format, such as from a Microsoft Notepad software program.
- the script is included in the markup language associated with the digital video file.
- an actuatable user input device e.g., a hypertext link
- HTML code e.g., a hypertext link
- a user from anywhere in the world may access network server 18 via the Internet, actuate the user input device, and call the video to stream.
- the HTML codes launch video playing software (e.g., RealPlayer) at the user computer, enlarge the viewing window of the software to full screen mode (i.e., at least 640 x 480), and begin streaming the video to the user computer.
- video playing software e.g., RealPlayer
- the user may expand the viewing screen to full screen mode by actuating an input device on the video player software.
- Other methods of expanding the viewing screen to a full screen are contemplated.
- the transmission speed of the digital video file is dependent upon the bandwidth of the user's network connection, but may range from approximately 35 kbps to 750 kbps, or as low as 28.8 kbps, with a frame rate of between approximately 24 fps and 29.97 or 30 fps.
- network server 18 is configured to query user computer 34 to ascertain the network connection used by computer 34 (e.g., 28.8 kbps modem, T1 line, ISDN, etc.). Thereafter, network server 18 determines the appropriate transmission rate based on the ascertained network connection.
- the network connection used by computer 34 e.g., 28.8 kbps modem, T1 line, ISDN, etc.
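A minimal sketch of this server-side selection follows; the connection labels and rate table are assumptions for illustration, not values taken from the patent.

```python
# Minimal sketch (assumed labels and rates, not patent values) of the server-side
# selection: map the connection type reported by the user computer to a
# transmission rate inside the range described above.
CONNECTION_RATES_KBPS = {
    "28.8k modem": 28.8,     # the text notes rates "as low as 28.8 kbps"
    "56k modem": 35,
    "ISDN": 80,
    "DSL/cable": 350,
    "T1/LAN": 750,
}

def select_transmission_rate(reported_connection: str, default_kbps: float = 35) -> float:
    """Pick a streaming data rate for the client's reported connection type."""
    return CONNECTION_RATES_KBPS.get(reported_connection, default_kbps)

print(select_transmission_rate("DSL/cable"))     # -> 350
```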
- the video camera generated an output signal of 6 MHz in NTSC format.
- a Dazzle LAV-1000S external capture device was coupled to the video camera.
- Amigo 2.11, Dazzle's capture software, was used.
- the Dazzle capture device and capture software were programmed with several parameters.
- the frame size was left at the default setting of 320 x 240 pixels.
- the frame speed was set to 29.97 frames per second.
- the bit rate was set to 3.0 Megabits (Mb) per second.
- the audio capture was set to a 44 kHz, 16-bit sampling rate.
- An MPEG-1 file was generated based on the video signal using the capture device and software programmed with these parameters.
- Adobe Premier 5.1 was utilized to receive the MPEG-1 file and export it to a MOV, AVI, or MPEG file, based on several parameters.
- the frame rate in Adobe Premier 5.1 was set to 29.97 fps.
- the frame size was set to 320 x 240.
- the "Quality" setting representing the number of colors to appear in the edited file, was set to a high setting (e.g., 1 00%) .
- Adobe Premier 5.1 generated an AVI file or an MOV file or a MPEG file, depending upon the operator selection.
- RealEncoder G2 software was used to encode the AVI or MOV file into a streaming video file in RM format.
- the RealEncoder G2 software was programmed with several parameters. The bitrate was set to 220 kbps. The frame rate was set to 30 fps.
- the "Surestream” option was selected. "Surestream” technology adjusts the playing speed of the encoded digital video file to accommodate the network connection speed of the user. For sound quality, "stereo/music", the highest quality, was selected . For image quality, “sharpest image”, the highest quality, was selected. Regarding frame size, this version of RealEncoder generated an output signal having a frame size equal to that of the frame size of the MOV or AVI input file. RealEncoder compressed the MOV or AVI input file using the RealNetworks compression algorithm. An RM file was generated based on the these parameters.
- the RM file was uploaded to an Internet server. Using Microsoft Notepad, a script was written in RAM format to 1 ) identify the location of the RM file, 2) launch RealPlayer on the user computer, 3) resize the viewing screen on the user computer to 640 x 480, and 4) begin the video stream.
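As an illustration of the metafile and upload step described above (a sketch under assumptions, not the patent's actual script), the snippet below writes a minimal RAM metafile pointing at the RM file and pushes it to the server over FTP; the host, file names, and credentials are placeholders.

```python
# Hedged sketch only (placeholder names throughout): a .ram metafile is a
# plain-text file whose contents point the player at the RM stream; requesting
# the metafile is what hands off to RealPlayer and begins streaming.  The
# patent's script also resized the viewing window to 640 x 480; that directive
# is player/markup specific and is not reproduced here.
from ftplib import FTP

RM_URL = "http://www.example.com/video/clip.rm"   # assumed location of the uploaded RM file

def write_ram_metafile(path: str, rm_url: str) -> None:
    """Write the plain-text metafile that points the player at the stream."""
    with open(path, "w", encoding="ascii") as f:
        f.write(rm_url + "\n")

def upload(host: str, user: str, password: str, local_path: str, remote_name: str) -> None:
    """Upload a file over FTP (the patent uses WS_FTP PRO; ftplib stands in here)."""
    with FTP(host) as ftp:
        ftp.login(user, password)
        with open(local_path, "rb") as f:
            ftp.storbinary(f"STOR {remote_name}", f)

if __name__ == "__main__":
    write_ram_metafile("clip.ram", RM_URL)
    # Placeholder host and credentials; replace before running.
    upload("ftp.example.com", "user", "password", "clip.ram", "clip.ram")
```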
- the result was unexpectedly high-quality, full-screen, real video frame rate, streaming video.
- the RM file was subsequently streamed to a client computer via a telephone modem and via other broadband connections. The same unexpectedly high-quality, full-screen, real video frame rate, streaming video was experienced.
- the streaming playback was intermittent due to the need to buffer to accommodate the lower bit rate of transmission.
- an NTSC analog signal is provided to a Pinnacle DC-10PLUS capture device.
- the Pinnacle capture device and associated software generate a digital video file in AVI format based on several parameters.
- the capture type is set to NTSC.
- the frame size is set to 320 x 240 pixels, or "1/4 full frame size". Brightness, sharpness, and color are adjusted, as desired.
- the compression rate is set to 2.5:1.
- the frame rate is set to 29.97. Square pixel ratio is selected. Audio is set to stereo format, 44 kHz, 16-bit sampling.
- the data rate is set to 1739 kbps.
- the capture device utilizes a Miro codec to create a digital video file in AVI format.
- a header and footer are provided at the beginning and end of the digital video file.
- the header and footer include a trademark for the assignee of the present application.
- Adobe Premier is used to render the header, footer, and watermark to the digital video file.
- a parameter within Adobe Premier is set to a frame size of 320 x 240.
- Adobe Premier further utilizes a Miro codec to create a digital video file in AVI format.
- the edited AVI file is encoded by RealProducer software.
- the following parameters are programmed in the RealProducer software.
- One set of parameters was used for a low-speed network connection at the user computer (hereinafter designated "LO"), and another set of parameters was used for a high-speed network connection at the user computer (hereinafter designated "HI").
- RealNetworks "Surestream" technology is selected. Alternatively, "single-stream" can be selected, and a RAM file can be generated to query the connection speed of the user computer and stream the video at the proper connection speed.
- the encoding speed is set to, for LO, 28 kbps or 56 kbps, and for HI, LAN, DSL, Cable Modem, or T1 .
- Sound quality is set to "voice only” or “stereo music” or “CD quality”.
- Video quality is set to "sharper image”.
- Frame rate is set to 29.97 fps.
- Target bit rate is set to 350 kbps.
- the target player is specified as RealPlayer G2.
- Frame size is set to 320 x 240. Based on these parameters, the RealEncoder software generates an RM file or other streaming video data file, which is subsequently uploaded to RealServer.
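For reference, the encoding settings above can be restated as a configuration table. The grouping into common settings and per-profile audiences is editorial; the values are the ones quoted in the text.

```python
# Restatement of the RealProducer settings above as a configuration table.
# The split into common settings and per-profile audiences is editorial; the
# values are the ones quoted in the text.
COMMON_SETTINGS = {
    "frame_size": (320, 240),
    "frame_rate_fps": 29.97,
    "target_bit_rate_kbps": 350,
    "video_quality": "sharper image",
    "sound_quality": ("voice only", "stereo music", "CD quality"),  # chosen per content
    "target_player": "RealPlayer G2",
    "stream_mode": "Surestream",        # or "single-stream" plus a RAM file query
}
TARGET_AUDIENCE = {
    "LO": ("28 kbps modem", "56 kbps modem"),   # low-speed client connections
    "HI": ("LAN", "DSL", "Cable Modem", "T1"),  # high-speed client connections
}
```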
- the exemplary embodiments disclosed herein provide greatly enhanced streaming video suitable for streaming over a limited-bandwidth network, such as the Internet.
- a limited-bandwidth network such as the Internet.
- the first discovery was that the efficiency of encoding from a captured digital video file to a streaming video file is increased with an increase in the frame size of the captured digital video file.
- conventional teachings pointed toward minimizing the capturing and encoding frame sizes (typically to 160 x 120 pixels, which has widely become an Internet standard for streaming video) to reduce the size of the resulting file
- the present inventors turned away from these teachings and increased the capturing and encoding frame sizes to 320 x 240 pixels.
- one goal of the present inventors was to achieve full-screen, real video frame rate, streaming video.
- RealProducer G2 taught away from real video streaming since digital video files that were captured at a real video frame rate (e.g., 30 fps) would be automatically reduced to a lower, non-real video frame rate (e.g., 15 fps) to reduce the size of the streaming video file. Furthermore, digital video files which were captured directly from a capture device using RealProducer G2 were encoded at a frame rate of only 6-7 fps and had no option to adjust frame size.
- a real video frame rate, e.g., 30 fps
- non-real video frame rate, e.g., 15 fps
- System 80 includes a network server 82 having a processor 84, a storage device 86, and a network interface 88.
- a capture device 90 is coupled to network server 82 and is configured to capture a video signal, as described hereinabove.
- Processor 84 controls capture device 90 and provides various parameters to capture device 90 regarding frame size, bit rate, etc.
- processor 84 may implement one or more of the methods for capturing video and generating a digital video file described hereinabove.
- Processor 84 and capture device 90 generate a digital video file in a digital video format (e.g., MPEG, AVI, etc.) and store it to storage device 86.
- a digital video format e.g., MPEG, AVI, etc.
- storage device includes such devices as magnetic tape, a hard drive, a floppy disk, magnetic disk, or other similar non-volatile storage media, but not including random access memory or other temporary memory.
- the capture process may alternatively be carried out on another computer, after which the resulting digital video file is stored in (e.g., uploaded to) storage device 86.
- Network server 82 is coupled through network interface 88 to a network 92, such as the Internet, a LAN, etc.
- Processor 84 is configured to generate a web page having a hypertext link to the digital video file stored in storage device 86.
- a network client 94 includes a processor 96, a storage device 98, an input device 100, a display 102, and a network interface 104.
- Network client 94 is operable via a user to access the web page generated by network server 82 and to actuate the hypertext link to begin downloading the digital video file from storage device 86.
- One drawback of downloading video files is that, for very large files, the delay before any portion of the digital video file can be viewed can be on the order of minutes, hours, or longer.
- in system 80, while the digital video file is being downloaded to network client 94 and stored in storage device 98, some of the digital video file which has already been downloaded and stored is being simultaneously played on display 102.
- a suitable player which supports AVI, MPEG, and other digital video formats is utilized for the video play. This procedure may be referred to as viewing/downloading.
- a first portion of the digital video file is played from storage device 98 while later portions of the digital video file are still downloading from storage device 86 via network 92 to storage device 98.
- One method of launching the player and beginning the play of the first portion is for a user to simply select these steps via input device 100 (e.g., a mouse, a keyboard, etc.) a certain time after the downloading has begun.
- input device 100 e.g., a mouse, a keyboard, etc.
- an algorithm may be provided, either attached to the digital video file (e.g., HTML, Java, a macro, etc.) or as part of the player (e.g., QuickTime, RealPlayer, etc.), which begins playing the digital video file at a predetermined time after the download to storage device 98 has begun. This predetermined time may be pre-programmed or adjusted in real-time based on inputs from network client 94 or network server 82.
- the algorithm calculates the predetermined time based on the download speed (e.g., including network connection speed of network interface 104, etc.), the viewing speed (e.g., frames per second, etc.), and the size of the digital video file. For example, if the viewing speed is four times the download speed, the algorithm monitors the amount of the file (e.g., in bytes) which is downloaded until 75% of the file is downloaded. When 75% of the file is downloaded, the algorithm begins playing the digital video file from storage device 98. By playing the file at this predetermined time, the digital video file will play substantially without delays for buffering. Of course, other predetermined times are contemplated, including those earlier and later than that set forth in this exemplary embodiment.
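A minimal sketch of this start-of-playback rule, with illustrative variable names, is shown below; it reproduces the 75% example from the text.

```python
# Minimal sketch of the start-of-playback rule described above; variable names
# are illustrative.  Playback starts once enough of the file is stored locally
# that the remaining download finishes no later than playback catches up.
def start_fraction(download_kbps: float, viewing_kbps: float) -> float:
    """Fraction of the file that must be downloaded before playback begins."""
    if download_kbps >= viewing_kbps:
        return 0.0                      # download stays ahead of playback
    return 1.0 - download_kbps / viewing_kbps

def ready_to_play(bytes_downloaded: int, file_size_bytes: int,
                  download_kbps: float, viewing_kbps: float) -> bool:
    """True once the downloaded portion reaches the computed threshold."""
    return bytes_downloaded / file_size_bytes >= start_fraction(download_kbps, viewing_kbps)

# Example from the text: viewing speed four times the download speed -> 75%.
print(start_fraction(100.0, 400.0))     # 0.75
```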
- network client 94 retains a copy of the digital video file in storage device 98 for later playing.
- the digital video data is captured in real-time and streamed in real-time across network 92 (i.e., without first storing to storage device 86) to storage device 98.
- while the embodiments illustrated in the FIGS. and described above are presently preferred, it should be understood that these embodiments are offered by way of example only.
- the steps of the exemplary embodiments contemplate recording audio and video at one time and streaming the audio and video at another time
- the audio and video may alternatively be fed through system 10 in real time, thereby facilitating real-time audio/video transmissions.
- the exemplary software programs mentioned may be replaced by newly developed versions and/or programs in the future. Accordingly, the present invention is not limited to a particular embodiment, but extends to various modifications that nevertheless fall within the scope of the appended claims.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Television Signal Processing For Recording (AREA)
- Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
Abstract
A system and method of providing a streaming video file includes providing digital video data having a capture frame size of at least 69,300 pixels per frame and converting the digital video data to a streaming video file having a converted frame size of at least 69,300 pixels per frame. According to another exemplary embodiment, a method of providing a streaming video file includes providing digital video data having a capture frame rate of at least 24 frames per second and converting the digital video data to a streaming video file having a converted frame rate of at least 24 frames per second.
Description
TITLE OF THE INVENTION
SYSTEM AND METHOD FOR PROVIDING AN ENHANCED DIGITAL VIDEO FILE
CROSS-REFERENCE TO RELATED APPLICATIONS
This application claims the benefit of U.S. Provisional Application No. 60/137,297, filed June 3, 1999, U.S. Provisional Application No. 60/155,404, filed September 22, 1999, and U.S. Provisional Application No. 60/169,559, filed December 8, 1999.
FIELD OF THE INVENTION
The present invention relates generally to video imaging. More specifically, the present invention relates to a system and method for providing high quality digital video files for streaming across a network.
BACKGROUND OF THE INVENTION
Streaming video is a technique by which video is played in real time as it is downloaded over the Internet, as opposed to storing it in a local file first. A video player decompresses and plays the data as it is transferred to a user computer over the World-Wide Web. Streaming video avoids the delay entailed in downloading an entire file and then playing it with a plug-in application. Streaming video requires a communications connection (e.g., a network, Internet, etc.) and a computer powerful enough to execute the decompression algorithm in real time.
In the field of streaming video, the primary design challenge is that the viewer desires perfect video quality over a limited-bandwidth network. Perfect video quality requires an enormous amount of digital data. Today's networks are not capable of providing life-like, full motion, full screen streaming video.
It is known to capture video using a capture device, compress the resulting captured video, store the compressed video, and send the compressed video across the Internet. However, prior attempts have failed to produce high quality streaming video that can be transmitted over the Internet. For example, prior attempts at streaming video have been unable to produce full-screen, real video frame rate video at any acceptable quality.
Several teachings have emerged that attempt to improve the quality and decrease the file size of streaming video. One teaching in the art is to reduce the number of frames per second that are being encoded, from the 25 to 30 fps of standard television to 6 or 7 fps or less for streaming video. While this reduces the amount of data that is being sent, the video appears jittery and corresponding voice appears asynchronous with the jittery video. Another teaching in the art is to capture the video at a small frame size of 160 x 120 or less. The small frame size of 160 x 120 is the widely used standard in Internet streaming video. Further teachings are directed to reducing the amount of data provided prior to compression, to reduce the resulting file size. Other teachings in the art have pointed toward compressing a digital video file as much as possible prior to transmission. Full-screen, full-motion video has historically been viewed as requiring far too much data for transmission over a limited-bandwidth network.
Accordingly, there is a need for an improved system and method for providing an enhanced digital video file for streaming across a network. Further, there is a need for a digital video file having high quality at various screen sizes with minimal quality loss when the video is expanded to full screen size. Further still, there is a need for a digital video file having a real video frame rate that can be streamed across a limited bandwidth network, such as the Internet. Further yet, there is a need for a video transmission which, once commenced, need not be stopped.
BRIEF SUMMARY OF THE INVENTION
According to an exemplary embodiment, a method of providing a streaming video file includes providing digital video data having a capture frame size of at least 69,300 pixels per frame and converting the digital video data to a streaming video file having a converted frame size of at least 69,300 pixels per frame.
According to another exemplary embodiment, a method of providing a streaming video file includes providing digital video data having a capture frame rate of at least 24 frames per second and converting the digital video data to a streaming video file having a converted frame rate of at least 24 frames per second.
According to yet another exemplary embodiment, a method of providing a streaming video file includes obtaining a source video signal having a predetermined source video parameter; capturing the source video signal while maintaining substantially the same source video parameter to provide a captured digital video file; and encoding the captured digital video file while maintaining substantially the same source video parameter to provide a streaming video file.
According to still another exemplary embodiment, a method of generating a streaming video file for streaming over the Internet includes providing digital video data having a capture frame size of at least 320 x 240 pixels; compressing the digital video data; encoding the digital video data into a streaming video file, wherein the streaming video file has a frame size of at least 320 x 240 pixels; and uploading the streaming video file to an Internet server.
According to still another exemplary embodiment, a system for providing a streaming video file includes means for providing digital video data having a capture frame size of at least 320 x 240 pixels and means for converting the digital video data to a streaming video file having a converted frame size of at least 320 x 240 pixels.
According to still another exemplary embodiment, a system for providing a streaming video file includes means for providing digital video data having a capture frame rate of at least 24 frames per second and means for converting the digital video data to a streaming video file having a converted frame rate of at least 24 frames per second.
BRIEF DESCRIPTION OF THE DRAWINGS
The invention will become more fully understood from the following detailed description, taken in conjunction with the accompanying drawings, wherein like reference numerals refer to like parts, in which: FIG. 1 is a block diagram of a system for generating an enhanced digital video file according to an exemplary embodiment;
FIG. 2 is a flowchart of a method for generating an enhanced digital video file according to the exemplary embodiment of FIG. 1; and
FIG. 3 is a block diagram of a system for playing a digital video file across a network.
DETAILED DESCRIPTION OF THE INVENTION
Referring to FIG. 1, a system 10 for generating an enhanced digital video file is shown. System 10 may be used as shown, or portions of system 10 may be integrated with other video processing systems, such as medical imaging equipment, motion picture production equipment, etc. System 10 generates a digital video file expandable to a full screen size and having a real video frame rate (i.e., life-like, smooth, not jerky, comparable with recorded video formats, such as, NTSC (National Television Standards Committee) at 29.97 frames per second (fps), PAL (Phase Alternative Line) at 25 fps, and SECAM (Sequentiel Couleur Avec Memoire) at 25 fps) with a file size that is suitable for streaming over the Internet, for such uses as high definition television, Web television, computers and servers utilized in wireless environments, etc.
As known in the art, video is recorded having certain standard recorded video parameters, such as, frame rate, and number of lines scanned. For example, it is well known that a source conforming to the NTSC (National Television Standards Committee) standard operates at 29.97 frames per second (fps), a source conforming to the PAL (Phase Alternative Line) standard operates at 25 fps, and a source conforming to the SECAM (Sequentiel Couleur Avec Memoire) standard operates at 25 fps. It is well known in the art that the NTSC standard includes two interleaved frames at 240 lines scanned, while the PAL standard is 270 lines scanned. Note that the number of lines scanned corresponds to the number of vertical pixels in a standard 320 x 240 frame size compatible with standard capture cards, such as, a Dazzle LAV-1000S capture device manufactured by Dazzle, Inc. of Fremont, California.
System 10 includes one or more sources, including recording devices 12 or playback device 25, a capture device 14, a computer 16, and a network server 18. Recording devices 12 include a camcorder 20, a digital video camera 22, and a reel-to-reel camera 24, each of which may be hand-held or mounted on a tripod or stand. System 10 may include a playback device 25 (e.g., tape player, VHS (Vertical Helix Scan) player, Beta player, DVD (Digital Versatile Disk) player, etc.). Camcorder 20 may be a VHS recorder, Beta recorder, or other camcorder, and is configured to store video on magnetic tape. Digital video camera 22 may be any type of digital video camera configured to generate video in a digital format. In this exemplary embodiment, digital video camera 22 stores the digital video data to a tape. Digital video camera 22 is configured to provide digital video data in real time or via the tape in a digital format, such as, Beta digital, AVI, MOV, MPEG (Motion Picture Experts Group), or other format compatible with the IEEE 1394 standard, etc., to capture device 14. AVI is an audio/video standard designed by Microsoft Corp., Redmond, Washington. According to one exemplary embodiment, a digital video camera including 3CCD technology is used to record the video. The 3CCD technology (3-chip charge-coupled device) includes a dichroic prism and three CCDs, each CCD being aligned to detect only the red, green, or blue color. A 3CCD camera will provide enhanced color resolution. Reel-to-reel camera 24 includes recording equipment that uses magnetic tape which must be threaded through the equipment and onto an empty reel. According to one alternative embodiment, a separate audio recording device, such as a microphone, may be utilized in conjunction with recording devices 12, in which embodiment recording devices 12 are used to record only video. Other recording devices may be used, such as, devices optimized for live videoconferencing.
Computer 16 includes a processor, memory, magnetic storage device, input/output devices and circuitry, etc. Computer 16 may include multiple computers at multiple sites, with different portions of the process described hereinafter operating on different computers.
Capture device 14 is coupled to one or more of sources 11. Capture device 14 is shown external to computer 16, but may alternatively be an internal capture device coupled within the housing of computer 16 or an internal capture device within the housing of one of recording devices 12 or playback device 25. In this exemplary embodiment, a Dazzle LAV-1000S capture device is utilized, though other capture devices may be used, such as a Pinnacle DC10PLUS or Pinnacle DC30PRO device, both manufactured by Pinnacle Systems, Inc., Mountain View, California, or a MotoDV Mobile capture device, manufactured by Digital Origin, Inc., Mountain View, California. Capture software 26, such as Amigo 2.11, manufactured by Dazzle, Inc., or Adobe Premier 5.1, manufactured by Adobe Systems Inc., San Jose, California, is operable on computer 16 to interface capture device 14 with computer 16. Other capture software may be utilized, such as RealProducer G2, manufactured by RealNetworks, Inc., Seattle, Washington.
In conjunction with capture software 26, capture device 14 is configured to receive a video signal from one of recording devices 12 or playback device 25, to digitize the video signal, and to store the video signal as a digital video file. The parameters of the video capture will be discussed below with reference to FIG. 2. The digital video file is an MPEG-1 file in this exemplary embodiment, but may alternatively be generated in other digital video formats, such as, MPEG-2, AVI, etc. Capture device 14 is a combined audio/video capture device, but may alternatively include discrete audio and video capture devices, the audio capture device configured to digitize any audio which corresponds to the video being captured by the video capture device. As a further alternative, the audio capture device may be utilized alone without a video capture device. The audio capture device may be, for example, a Montego II device, manufactured by Voyetra Turtle Beach, Inc., Yonkers, New York, and configured to generate a digital audio file in a digital audio format, such as, PCM (Pulse Code Modulation). Editing software 28 is operable on computer 16. In this exemplary embodiment, Adobe Premier 5.1 is utilized, though other video editing software may be used. Editing software 28 receives the captured digital video file and enables an operator to edit the digital video file by adding or deleting frames, adjusting the color, contrast, and brightness of the frames, etc. The edits are then saved to the digital video file or can be exported to AVI or MOV file types.
Encoding software 30 is operable on computer 16. In this exemplary embodiment, RealProducer G2 is utilized, though other encoding software may be used. Encoding software 30 receives the edited digital video file and encodes the digital video file into an encoded format, such as, an RM format. Encoding software 30 may also compress the digital video file, if needed, to reduce the size of the digital video file, using a video compression algorithm, such as MPEG-1, MPEG-4, etc.
Markup software 32 is operable on computer 16. In this exemplary embodiment, a hypertext markup language (e.g., HTML, Dynamic HTML, Cold Fusion) is utilized. An operator marks up the encoded digital video file in HTML to prepare the digital video file for uploading to the network server 18. In this exemplary embodiment, a code segment representing a full screen frame size, such as 640 x 480 pixels, is associated with the digital video file in the HTML code. The full screen frame size code segment may alternatively include other screen sizes, such as 800 x 600 pixels, 1024 x 768 pixels, 1280 x 1024 pixels, and 1600 x 1200 pixels. During a subsequent video streaming step, the full screen frame size code segment causes or enables a video player program, such as RealPlayer, manufactured by RealNetworks, Inc., to enlarge the streaming video to a full screen frame size, such as 640 x 480 pixels.
References herein to frame sizes in pixels, such as, 320 x 240 pixels, 640 x 480 pixels, are intended to include equivalent frame sizes thereto. For example, it is known that a frame size of 320 x 240 pixels may include an additional number of unneeded pixels (e.g., which can be as much as 10% of the total pixels) attributed to overscan. Thus, one equivalent to a 320 x 240 pixel frame size is 304 x 228 pixels. As a second example, when rectangular pixels are used, the exact pixel count differs from the stated frame size. Thus, one equivalent to a 320 x 240 pixel frame size is 352 x 240. Accordingly, references to frame sizes in pixels are intended to include these and other equivalent frame sizes, and the teachings herein include any and all such insubstantial variations.

The uploading process utilizes uploading software 33, such as, a Web FTP (file transfer protocol) software (e.g., WS FTP PRO, manufactured by Ipswitch, Inc., Lexington, Massachusetts). The digital video file is uploaded to network server 18, which includes a computer configured to generate a web page on an internet-protocol network, such as the Internet or a company-wide intranet. A web page is a block of data written in a markup language, such as HTML, and any related files for scripts and graphics. Network server 18 may alternatively be coupled to a non-internet-protocol network, such as, an ethernet, a local area network, a wide area network, a wireless network, etc.
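The upload step can be pictured with a short script. The following is only an illustrative sketch that uses Python's standard ftplib in place of the WS FTP PRO client mentioned above; the server address, credentials, and file names are assumptions.

```python
# Illustrative sketch of uploading the encoded file to the network server
# over FTP. ftplib stands in for the WS FTP PRO client mentioned above;
# host, credentials, and file names are assumptions for illustration.
from ftplib import FTP

def upload_video(local_path: str, remote_name: str) -> None:
    with FTP("ftp.example-server.com") as ftp:        # assumed server address
        ftp.login(user="operator", passwd="secret")   # assumed credentials
        with open(local_path, "rb") as f:
            ftp.storbinary(f"STOR {remote_name}", f)  # binary transfer of the RM file

upload_video("video.rm", "video.rm")
```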
A user computer 34 may access the web page provided by network server 18 via a network, such as, the Internet. Upon actuating a user input device (e.g., a web page button, hypertext link, etc.) associated with the uploaded digital video file, the HTML code launches a suitable video player program (e.g., RealPlayer) at user computer 34, activates the full screen frame size at user computer 34, and streams the video from the digital video file to user computer 34. Alternatively, the video player program may initially play the streaming video at a smaller frame size (e.g., 320 x 240), and the user may actuate a user input device on the video player to enlarge the streaming video to a full-screen size, such as 640 x 480. Notably, capture software 26, editing software 28, encoding software 30, markup software 32, and uploading software 33 may be operable on one computer or on different computers during different steps in the process.
According to one alternative embodiment, the encoded digital video file is stored directly to a storage device, such as, a compact disk, a digital video disk, a magnetic storage device, etc., for subsequent viewing on another computer, on a personal digital assistant (e.g., a Palm Pilot manufactured by Palm, Inc., Santa Clara, California), etc. According to another alternative embodiment, pre-captured digital video data is provided on a storage device (e.g., a floppy disk, hard disk storage, etc.). The pre-captured digital video data is provided in a compressed or uncompressed digital video format to encoding software 30 for subsequent processing.
Referring now to FIG. 2, a method 50 for generating an enhanced digital video file according to the exemplary embodiment of FIG. 1 is shown. Method 50 is operable using one or more of the elements of system 10, as needed. While the steps of method 50 are explained with reference to captured video, it is understood that captured audio may be processed along with the captured video, or perhaps processed independently in a similar manner. As will be seen, the recorded video will be captured and encoded at near-optimal levels, as determined by the selected parameters in these processes, thereby preserving the highest quality video content. While exemplary values are presented herein for such parameters, it is understood that one of ordinary skill in the art will recognize other combinations of parameters based on these teachings.
According to one exemplary embodiment, a customer provides pre-recorded video saved to a disk or other storage device. At step 52, if the video has been pre-recorded by the customer, the method proceeds to step 58. If the video has not yet been recorded, at step 54, video is recorded using one or more of recording devices 12 or playback device 25. The video is recorded in any suitable format, such as, VHS or Beta, and is played back using a television standard, such as, NTSC (National Television Standards Committee), PAL (Phase Alternating Line), SECAM (Sequentiel Couleur Avec Memoire), a digital format, such as, AVI, MOV, MPEG, a digital format compatible with the IEEE 1394 standard, or another format, etc. At step 56, the video is captured by coupling one of recording devices 12 or playback device 25 to capture device 14, which is an external Dazzle LAV-1000 capture device in this exemplary embodiment, but may alternatively be an internal card or other capture devices, such as a Pinnacle DC10 device.
Capture software is also utilized, such as, Amigo 2.11, Adobe Premier 5.1 or RealProducer G2. Capture device 14 and capture software 26 generate a digital video file based on the recorded video. If the recorded video is in an analog format, capture device 14 digitizes the analog video to create digital video data. If the recorded video is in a digital format, capture device 14 merely receives the digital video data and formats a file in the appropriate standard (e.g., AVI, MOV, MPEG-1, etc.). According to one exemplary embodiment, capture software 26 is set for real video capture, i.e., having a frame rate of a television or movie standard, such as, 29.97 frames per second. Real video capture may further have a frame rate of between 24 and 30 frames per second, or at least substantially more than the 6 to 9 frames per second conventionally used in streaming video applications. Further, the video is captured with at least approximately 76,800 pixels per frame (at least approximately 69,000 pixels taking into consideration overscan). For a 4:3 aspect ratio, the frame size of the video capture is at least 320 x 240 in this exemplary embodiment (at least 304 x 228 taking into consideration overscan), or at least more than the 160 x 120 used in conventional streaming video applications. Frame sizes of 480 x 320 and 640 x 480 may also be utilized in the video capture. However, particularly advantageous results are associated with the 320 x 240 capture frame size.
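The pixel-count figures above follow directly from the frame dimensions; the short calculation below is only a sanity check of those numbers, not a step of the method.

```python
# Sanity check of the pixel counts cited above (illustrative only).
sizes = {
    "conventional streaming": (160, 120),   # 19,200 pixels
    "capture (this method)":  (320, 240),   # 76,800 pixels
    "overscan-adjusted":      (304, 228),   # 69,312 pixels (~10% fewer)
    "full screen":            (640, 480),   # 307,200 pixels
}
for name, (w, h) in sizes.items():
    print(f"{name:24s} {w} x {h} = {w * h:,} pixels")
```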
In an alternative embodiment, a separate audio capture device is utilized in parallel with the video capture device. In the alternative embodiment, corresponding audio capture software is operable on computer 16 to digitize the audio into a digital audio format, such as PCM. The sampling rate is between 44 and 48 kilohertz (kHz); the bus size is 16-bit, allowing an audio resolution of 16 bits; and the audio is sampled in stereo. These parameters may also be set using the video capture software in an embodiment wherein video and audio are captured using one capture device.
The captured video data may be stored as a data file in a storage device (e.g., a hard drive) or may be stored in memory and fed directly to an encoder. The captured video data may further be compressed, for example, to an MPEG-1 file before being saved to the storage device.
At step 58, the digital video file is edited using a video editing software, such as, Adobe Premier 5.1. Adobe Premier 5.1 generates an output file in a MOV or AVI format, but may alternatively generate an output file in any digital video format. The edited digital video file may be stored in the storage device. Step 58 is optional; if it is included, Adobe Premier 5.1 preferably maintains a frame size of at least 320 x 240 pixels and a real video frame rate.
At step 60, the edited digital video file is converted or encoded using a video encoding algorithm to create a streaming video file. The edited digital video file is first retrieved from the storage device (unless the digital video data is provided directly from capture device 14). In this exemplary embodiment, the digital video file is encoded to a RealMedia format (i.e., RM) using a RealNetworks encoding algorithm. RM is an audiovisual file format proprietary to RealNetworks, Inc. As a further alternative, Windows Media Encoder, manufactured by Microsoft Corp., may be utilized to encode the captured digital video file, for example, to an ASF format (Advanced Streaming Format) or ASX format. Further still, QuickTime, manufactured by Apple Computer, Inc., Cupertino, California, may be utilized to encode the captured digital video file, for example, to an MOV format. Encoding may additionally include compression, if a smaller file size is desirable, as indicated by steps 62 and 64. The amount of compression may be selected by the operator using encoding software 30 or alternative compression software. During the encoding process, the digital video file is encoded to have a data rate of between approximately 35 kbps (kilobits per second) and 750 kbps, and a frame rate of between approximately 24 fps (frames per second) and 30 fps (e.g., 29.97 fps). The number of pixels per frame is set to at least approximately 76,800 (again, at least approximately 69,000 pixels taking into consideration overscan) which, for a 4:3 aspect ratio, is 320 x 240 pixels (again, at least 304 x 228 pixels taking into consideration overscan), or at least more than the 160 x 120 pixels of conventional usage. However, editing, encoding, and compression are optional steps.
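Since step 60 is characterized entirely by parameter ranges, a small check routine makes those ranges concrete. This is a hypothetical helper for illustration only; the function name and the idea of validating settings are assumptions, while the numeric thresholds come from the ranges stated above.

```python
# Hypothetical helper (not part of the specification): check whether a set
# of encoding settings falls inside the ranges described for step 60.
def meets_enhanced_encoding_params(width: int, height: int,
                                   fps: float, kbps: float) -> bool:
    pixels_ok = width * height >= 69_000   # ~76,800, or ~69,000 allowing for overscan
    fps_ok = 24 <= fps <= 30               # real video frame rate
    rate_ok = 35 <= kbps <= 750            # stated streaming data-rate range
    return pixels_ok and fps_ok and rate_ok

# Example: the exemplary settings used throughout the description.
print(meets_enhanced_encoding_params(320, 240, 29.97, 220))  # True
print(meets_enhanced_encoding_params(160, 120, 15, 34))      # False (conventional)
```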
At step 66, the digital video file is marked up with a markup language, such as, HTML. At step 68, a full screen frame size is associated with the digital video file. A full screen frame size is at least 640 x 480 pixels, and may also be 800 x 600 pixels, 1024 x 768 pixels, 1280 x 1024 pixels, 1600 x 1200 pixels, etc. In this exemplary embodiment, the markup language associated with the digital video file includes a code segment that causes the digital video file to stream at the desired full screen frame size. While the markup language is used to associate the full screen frame size code segment with the digital video file in this exemplary embodiment, the full screen frame size code segment may be associated with the digital video file in another step of the method, such as the encode step 60, compression step 62, or another step.
At step 70, the digital video file is uploaded to an Internet web page using uploading software, such as, WS FTP PRO. At step 72, a script (e.g., an ASCII (American Standard Code for Information Interchange) file) is associated with the marked-up digital video file. The script calls the video to stream in response to a user actuation from user computer 34. The script is written in a RAM format, for example using the Microsoft Notepad software program. The script is included in the markup language associated with the digital video file. In this exemplary embodiment, an actuatable user input device (e.g., a hypertext link) is associated with the HTML code. Thus, a user from anywhere in the world may access network server 18 via the Internet, actuate the user input device, and call the video to stream. Upon actuation, the HTML code launches video playing software (e.g., RealPlayer) at the user computer, enlarges the viewing window of the software to full screen mode (i.e., at least 640 x 480), and begins streaming the video to the user computer. Alternatively, the user may expand the viewing screen to full screen mode by actuating an input device on the video player software. Other methods of expanding the viewing screen to a full screen are contemplated. The transmission speed of the digital video file is dependent upon the bandwidth of the user's network connection, but may range from approximately 35 kbps to 750 kbps, or as low as 28.8 kbps, with a frame rate of between approximately 24 fps and 29.97 or 30 fps.
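The "script in a RAM format" referred to at step 72 is, in practice, a small text metafile that points the player at the streaming file. The sketch below, written in Python for consistency with the other examples here, writes a hypothetical .ram metafile and a matching hypertext link; the URLs and file names are assumptions, not values from the specification.

```python
# Illustrative sketch (assumed names and URLs): write a RAM metafile that
# identifies the location of the RM file, plus an HTML hypertext link that
# a user can actuate to call the video to stream.
def write_ram_metafile(stream_url: str, path: str = "video.ram") -> None:
    # A .ram file is plain text containing the address of the streaming file.
    with open(path, "w") as f:
        f.write(stream_url + "\n")

def hypertext_link(ram_url: str, label: str = "Watch the video") -> str:
    return f'<a href="{ram_url}">{label}</a>'

write_ram_metafile("rtsp://media.example-server.com/video.rm")  # assumed URL
print(hypertext_link("http://www.example-server.com/video.ram"))
```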
According to one alternative embodiment, network server 18 is configured to query user computer 34 to ascertain the network connection used by computer 34 (e.g., 28.8 kbps modem, T1 line, ISDN, etc.). Thereafter, network server 18 determines the appropriate transmission rate based on the ascertained network connection.
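A simple way to picture this alternative embodiment is a lookup from the ascertained connection type to a transmission rate inside the 35 to 750 kbps range discussed above. The mapping below is an illustrative assumption; the specification does not assign particular rates to particular connection types.

```python
# Hypothetical sketch: pick a transmission rate from the ascertained
# connection type. The specific rates are illustrative assumptions that
# stay within the ~35-750 kbps range described above.
RATE_BY_CONNECTION_KBPS = {
    "28.8k modem": 35,
    "56k modem": 50,
    "ISDN": 100,
    "DSL/cable": 350,
    "T1/LAN": 750,
}

def choose_transmission_rate(connection_type: str) -> int:
    # Fall back to the most conservative rate if the type is unknown.
    return RATE_BY_CONNECTION_KBPS.get(connection_type, 35)

print(choose_transmission_rate("DSL/cable"))  # 350
```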
EXAMPLE A
A Sony DCR VX-1000 digital video camera, having 3CCD technology, manufactured by Sony Electronics, Inc., Park Ridge, N.J., was utilized to record a video signal. The video camera generated an output signal of 6 MHz in NTSC format.
A Dazzle LAV-1000S external capture device was coupled to the video camera. Amigo 2.11, Dazzle's capture software, was used. The Dazzle capture device and capture software were programmed with several parameters. The frame size was left at the default setting of 320 x 240 pixels. The frame speed was set to 29.97 frames per second. The bit rate was set to 3.0 Megabits (Mb) per second. The audio capture was set to a 44 kHz, 16-bit sampling rate. An MPEG-1 file was generated based on the video signal using the capture device and software programmed with these parameters.
When the captured MPEG-1 file was provided to RealEncoder G2, the resulting encoded file failed to retain the real video frame rate. Therefore, Adobe Premier 5.1 was utilized to receive the MPEG-1 file and export it to a MOV, AVI, or MPEG file, based on several parameters. The frame rate in Adobe Premier 5.1 was set to 29.97 fps. The frame size was set to 320 x 240. The "Quality" setting, representing the number of colors to appear in the edited file, was set to a high setting (e.g., 100%). Adobe Premier 5.1 generated an AVI file, an MOV file, or an MPEG file, depending upon the operator selection.
RealEncoder G2 software was used to encode the AVI or MOV file into a streaming video file in RM format. The RealEncoder G2 software was programmed with several parameters. The bitrate was set to 220 kbps. The frame rate was set to 30 fps. The "Surestream" option was selected. "Surestream" technology adjusts the playing speed of the encoded digital video file to accommodate the network connection speed of the user. For sound quality, "stereo/music", the highest quality, was selected. For image quality, "sharpest image", the highest quality, was selected. Regarding frame size, this version of RealEncoder generated an output signal having a frame size equal to that of the frame size of the MOV or AVI input file. RealEncoder compressed the MOV or AVI input file using the RealNetworks compression algorithm. An RM file was generated based on these parameters.
The RM file was uploaded to an Internet server. Using Microsoft Notepad, a script was written in RAM format to 1) identify the location of the RM file, 2) launch RealPlayer on the user computer, 3) resize the viewing screen on the user computer to 640 x 480, and 4) begin the video stream. The result was unexpectedly high-quality, full-screen, real video frame rate, streaming video. The RM file was subsequently streamed to a client computer via a telephone modem and via other broadband connections. The same unexpectedly high-quality, full-screen, real video frame rate, streaming video was experienced. The streaming playback was intermittent due to the need to buffer to accommodate the lower bit-rate of transmission.
EXAMPLE B
According to another example, an NTSC analog signal is provided to a Pinnacle DC-10PLUS capture device. The Pinnacle capture device and associated software generate a digital video file in AVI format based on several parameters. The capture type is set to NTSC. The frame size is set to 320 x 240 pixels, or "1/4 full frame size". Brightness, sharpness, and color are adjusted, as desired. The compression rate is set to 2.5:1. The frame rate is set to 29.97. Square pixel ratio is selected. Audio is set to stereo format, 44 kHz, 16-bit sampling. The data rate is set to 1739 kbps. The capture device utilizes a Miro codec to create a digital video file in AVI format.
Optionally, a header and footer are provided at the beginning and end of the digital video file. The header and footer include a trademark for the assignee of the present application. Adobe Premier is used to render the header, footer, and watermark to the digital video file. A parameter within Adobe Premier is set to a frame size of 320 x 240. Adobe Premier further utilizes a Miro codec to create a digital video file in AVI format.
The edited AVI file is encoded by RealProducer software. The following parameters are programmed in the RealProducer software. One set of parameters is used for a low-speed network connection at the user computer (hereinafter designated "LO"), and another set of parameters is used for a high-speed network connection at the user computer (hereinafter designated "HI"). RealNetworks "Surestream" technology is selected. Alternatively, "single-stream" can be selected, and a RAM file can be generated to query the connection speed of the user computer and stream the video at the proper connection speed. The encoding speed is set to 28 kbps or 56 kbps for LO, and to LAN, DSL, cable modem, or T1 for HI. Sound quality is set to "voice only", "stereo music", or "CD quality". Video quality is set to "sharper image". Frame rate is set to 29.97 fps. Target bit rate is set to 350 kbps. The target player is specified as RealPlayer G2. Frame size is set to 320 x 240. Based on these parameters, the RealEncoder software generates an RM file or other streaming video data file, which is subsequently uploaded to RealServer.
The exemplary embodiments disclosed herein provide greatly enhanced streaming video suitable for streaming over a limited-bandwidth network, such as the Internet. Several discoveries have enabled various aspects of this technology. The first discovery was that the efficiency of encoding from a captured digital video file to a streaming video file is increased with an increase in the frame size of the captured digital video file. Thus, while conventional teachings pointed toward minimizing the capturing and encoding frame sizes (typically to 160 x 120 pixels, which has widely become an Internet standard for streaming video) to reduce the size of the resulting file, the present inventors turned away from these teachings and increased the capturing and encoding frame sizes to 320 x 240 pixels. Second, one goal of the present inventors was to achieve full-screen, real video frame rate, streaming video. Conventional teachings would point toward encoding at a frame size of 640 x 480 pixels to achieve full-screen streaming video. However, with today's technology, enlarging the frame size of a captured digital video file during encoding to 640 x 480 pixels (for example, from 160 x 120 pixels) causes an enormous increase in the amount of data in the resulting encoded digital video file and requires enormous bandwidth to stream. Therefore, the present inventors discovered that encoding at 320 x 240 pixels (or its equivalent) provided greatly improved results when doubled to full screen for viewing.
These conventional teachings were evidenced in the capabilities of the encoder used at the time of invention, namely, RealProducer G2. RealProducer G2 taught away from real video streaming since digital video files that were captured at a real video frame rate (e.g., 30 fps) would be automatically reduced to a lower, non-real video frame rate (e.g., 15 fps) to reduce the size of the streaming video file. Furthermore, digital video files which were captured directly from a capture device using RealProducer G2 were encoded at a frame rate of only 6-7 fps and had no option to adjust frame size. Therefore, to obtain a real video frame rate, the inventors followed the steps in EXAMPLE A above to achieve the first high quality, full-screen, real frame rate streaming video file.
Referring now to FIG. 3, a system 80 for playing a digital video file across a network is shown, and a corresponding method is described. System 80 includes a network server 82 having a processor 84, a storage device 86, and a network interface 88. A capture device 90 is coupled to network server 82 and is configured to capture a video signal, as described hereinabove. Processor 84 controls capture device 90 and provides various parameters to capture device 90 regarding frame size, bit rate, etc. For example, one or more of the methods for capturing video and generating a digital video file described hereinabove may be implemented by processor 84, storage device 86, and capture device 90. Processor 84 and capture device 90 generate a digital video file in a digital video format (e.g., MPEG, AVI, etc.) and store it to storage device 86. As used in this description of FIG. 3, the term "storage device" includes such devices as magnetic tape, a hard drive, a floppy disk, magnetic disk, or other similar non-volatile storage media, but not including random access memory or other temporary memory. The capture process may alternatively be carried out on another computer, after which the resulting digital video file is stored in (e.g., uploaded to) storage device 86.
Network server 82 is coupled through network interface 88 to a network 92, such as the Internet, a LAN, etc. Processor 84 is configured to generate a web page having a hypertext link to the digital video file stored in storage device 86. A network client 94 includes a processor 96, a storage device 98, an input device 100, a display 102, and a network interface 104. Network client 94 is operable via a user to access the web page generated by network server 82 and to actuate the hypertext link to begin downloading the digital video file from storage device 86.
One drawback of downloading video files is that, for very large files, the delay before any portion of the digital video file can be viewed can be on the order of minutes, hours, or longer. Thus, according to one advantageous aspect of system 80, while the digital video file is being downloaded to network client 94 and stored in storage device 98, some of the digital video file which has already been downloaded and stored is being simultaneously played on display 102. A suitable player which supports AVI, MPEG, and other digital video formats is utilized for the video play. This procedure may be referred to as viewing/downloading. Stated another way, a first portion of the digital video file is played from storage device 98 while later portions of the digital video file are still downloading from storage device 86 via network 92 to storage device 98. One method of launching the player and beginning the play of the first portion is for a user to simply select these steps via input device 100 (e.g., a mouse, a keyboard, etc.) a certain time after the downloading has begun. Alternatively, an algorithm may be provided, either attached to the digital video file (e.g., HTML, Java, a macro, etc.) or as part of the player (e.g., QuickTime, RealPlayer, etc.), which begins playing the digital video file at a predetermined time after the download to storage device 98 has begun. This predetermined time may be pre-programmed or adjusted in real-time based on inputs from network client 94 or network server 82. According to one example, the algorithm calculates the predetermined time based on the download speed (e.g., including network connection speed of network interface 104, etc.), the viewing speed (e.g., frames per second, etc.), and the size of the digital video file. For example, if the viewing speed is four times the download speed, the algorithm monitors the amount of the file (e.g., in bytes) which is downloaded until 75% of the file is downloaded. When 75% of the file is downloaded, the algorithm begins playing the digital video file from storage device 98. By playing the file at this predetermined time, the digital video file will play substantially without delays for buffering. Of course, other predetermined times are contemplated, including those earlier and later than that set forth in this exemplary embodiment.
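The 75% example generalizes to a simple threshold: playback can begin once the fraction of the file still to be downloaded can finish within the time the already-downloaded portion takes to play. The sketch below is an illustrative reading of that calculation; the function names and the handling of the "download keeps up" case are assumptions.

```python
# Illustrative sketch (assumed helper names): compute the fraction of the
# file that should be downloaded before playback starts so that, at a
# constant download speed, playback does not outrun the download.
def start_fraction(download_bps: float, viewing_bps: float) -> float:
    if viewing_bps <= download_bps:
        return 0.0                              # download keeps up; start immediately
    return 1.0 - download_bps / viewing_bps

def bytes_before_play(file_size_bytes: int,
                      download_bps: float, viewing_bps: float) -> int:
    return int(file_size_bytes * start_fraction(download_bps, viewing_bps))

# The example above: the viewing speed is four times the download speed,
# so playback begins once 75% of the file has been downloaded.
print(start_fraction(download_bps=1.0, viewing_bps=4.0))                   # 0.75
print(bytes_before_play(100_000_000, download_bps=1.0, viewing_bps=4.0))   # 75000000
```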
Thus, one can view a digital video file shortly after clicking on the hypertext link and before the entire digital video file has downloaded to storage device 98. Once the entire digital video file is finished playing, network client 94 retains a copy of the digital video file in storage device 98 for later playing.
According to one alternative, the digital video data is captured in real-time and streamed in real-time across network 92 (i.e., without first storing to storage device 86) to storage device 98.
While the embodiments and applications of the invention illustrated in the FIGS. and described above are presently preferred, it should be understood that these embodiments are offered by way of example only. For example, while the steps of the exemplary embodiments contemplate recording audio and video at one time and streaming the audio and video at another time, the audio and video may alternatively be fed through system 10 in real-time, thereby facilitating real-time audio/video transmissions. Furthermore, the exemplary software programs mentioned may be replaced by newly developed versions and/or programs in the future. Accordingly, the present invention is not limited to a particular embodiment, but extends to various modifications that nevertheless fall within the scope of the appended claims.
Claims
1. A method of providing a streaming video file, comprising: providing digital video data having a capture frame size of at least 69,300 pixels per frame; and converting the digital video data to a streaming video file having a converted frame size of at least 69,300 pixels per frame.
2. The method of claim 1, wherein the capture frame size has an aspect ratio of 4:3 and the converted frame size has an aspect ratio of 4:3.
3. The method of claim 2, wherein the capture frame size is at least 304 x 228 pixels and the converted frame size is at least 304 x 228 pixels.
4. The method of claim 3, wherein the capture frame size is approximately 320 x 240 pixels and the converted frame size is approximately 320 x 240 pixels.
5. The method of claim 1, wherein the step of providing includes capturing a video signal.
6. The method of claim 5, wherein the step of providing includes digitizing the video signal to generate the digital video data.
7. The method of claim 6, wherein the step of providing includes storing the captured video data as a data file in a storage device, and wherein the step of converting includes retrieving the stored data file from the storage device.
8. The method of claim 1, wherein the step of providing includes retrieving the digital video data from a storage device.
9. The method of claim 1, further comprising compressing the digital video data.
10. The method of claim 9, wherein the digital video data is compressed to an MPEG file format.
11. The method of claim 1, wherein the streaming video file is converted to an RM format or an ASF format.
12. The method of claim 1, wherein the converted frame size is approximately 320 x 240 pixels.
13. The method of claim 1, wherein the digital video data has a capture frame rate of at least 24 frames per second and the streaming video file has a converted frame rate of at least 24 frames per second.
14. The method of claim 1, further comprising streaming the streaming video file across a network.
15. The method of claim 14, wherein the network is the Internet.
16. A method of providing a streaming video file, comprising: providing digital video data having a capture frame rate of at least 24 frames per second; and converting the digital video data to a streaming video file having a converted frame rate of at least 24 frames per second.
17. The method of claim 16, wherein the capture frame rate is between 29 and 30 frames per second and the converted frame rate is between 29 and 30 frames per second.
18. The method of claim 16, wherein the step of providing includes capturing a video signal.
19. The method of claim 17, wherein the step of providing includes digitizing the video signal to generate the digital video data.
20. The method of claim 18, wherein the step of providing includes storing the captured video data as a data file in a storage device, and wherein the step of converting includes retrieving the stored data file from the storage device.
21. The method of claim 16, wherein the step of providing includes retrieving the digital video data from a storage device.
22. The method of claim 16, further comprising compressing the digital video data.
23. The method of claim 21, wherein the digital video data is compressed to an MPEG file format.
24. The method of claim 16, wherein the streaming video file is converted to an RM format or an ASF format.
25. The method of claim 16, wherein the digital video data has a capture frame size of at least 69,300 pixels per frame and the streaming video file has a converted frame size of at least 69,300 pixels per frame.
26. The method of claim 25, wherein the capture frame size has an aspect ratio of 4:3 and the converted frame size has an aspect ratio of 4:3.
27. The method of claim 26, wherein the capture frame size is at least 302 x 228 pixels and the converted frame size is at least 302 x 228 pixels.
28. The method of claim 27, wherein the capture frame size is approximately 320 x 240 and the converted frame size is approximately 320 x 240 pixels.
29. The method of claim 16, further comprising streaming the streaming video file across a network.
30. The method of claim 29, wherein the network is the Internet.
31. A method of providing a streaming video file, comprising: obtaining a source video signal having a predetermined source video parameter; capturing the source video signal while maintaining substantially the same source video parameter to provide a captured digital video file; and encoding the captured digital video file while maintaining substantially the same source video parameter to provide a streaming video file.
32. The method of claim 31, wherein the source video parameter includes the frame rate.
33. The method of claim 32, wherein the source video frame rate is at least 24 frames per second.
34. The method of claim 32, wherein the source video frame rate is a real video frame rate.
35. The method of claim 31, wherein the source video parameter includes the number of scanned lines of video per frame.
36. The method of claim 35, wherein the number of scanned lines of video per frame is at least 240.
37. The method of claim 31, wherein the streaming video file has a capture frame size of at least 304 x 228 pixels.
38. The method of claim 37, wherein the streaming video file has a capture frame size of approximately 320 x 240 pixels.
39. The method of claim 31, further comprising editing the captured digital video file using video editing software.
40. The method of claim 31, wherein the step of encoding includes compressing the captured digital video file.
41. The method of claim 31, wherein the captured digital video file is in an MPEG file format.
42. The method of claim 31, wherein the source video signal is provided from a video playback device.
43. A method of generating a streaming video file for streaming over the Internet, comprising: providing digital video data having a capture frame size of at least 320 x 240 pixels; compressing the digital video data; encoding the digital video data into a streaming video file, wherein the streaming video file has a frame size of at least 320 x 240 pixels; and uploading the streaming video file to an Internet server.
44. The method of claim 43, wherein the streaming video file has a real video frame rate.
45. The method of claim 44, further comprising associating a hypertext link with the streaming video file.
46. The method of claim 45, further comprising running a video player program on an Internet client computer.
47. The method of claim 46, further comprising configuring the video player program for full-screen streaming video.
48. A system for providing a streaming video file, comprising: means for providing digital video data having a capture frame size of at least 320 x 240 pixels; and means for converting the digital video data to a streaming video file having a converted frame size of at least 320 x 240 pixels.
49. The system of claim 48, wherein the digital video data has a capture frame rate of at least 24 frames per second and the streaming video file has a converted frame rate of at least 24 frames per second.
50. The system of claim 48, further comprising means for capturing a video signal.
51. The system of claim 48, wherein the means for converting includes means for encoding the digital video file into an RM file format.
52. A system for providing a streaming video file, comprising: means for providing digital video data having a capture frame rate of at least 24 frames per second; and means for converting the digital video data to a streaming video file having a converted frame rate of at least 24 frames per second.
53. The system of claim 52, wherein the capture frame size is at least 302 x 228 pixels and the converted frame size is at least 302 x 228 pixels.
54. The system of claim 52, further comprising means for capturing a video signal.
55. The system of claim 52, wherein the means for converting includes means for encoding the digital video data into an RM file format.
Applications Claiming Priority (7)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13729799P | 1999-06-03 | 1999-06-03 | |
US137297P | 1999-06-03 | ||
US15540499P | 1999-09-22 | 1999-09-22 | |
US155404P | 1999-09-22 | ||
US16955999P | 1999-12-08 | 1999-12-08 | |
US169559P | 1999-12-08 | ||
PCT/US2000/015405 WO2000076218A1 (en) | 1999-06-03 | 2000-06-02 | System and method for providing an enhanced digital video file |
Publications (1)
Publication Number | Publication Date |
---|---|
EP1188318A1 true EP1188318A1 (en) | 2002-03-20 |
Family
ID=27384989
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP00938126A Withdrawn EP1183870A1 (en) | 1999-06-03 | 2000-06-02 | System and method for streaming an enhanced digital video file |
EP00944619A Withdrawn EP1188318A1 (en) | 1999-06-03 | 2000-06-02 | System and method for providing an enhanced digital video file |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP00938126A Withdrawn EP1183870A1 (en) | 1999-06-03 | 2000-06-02 | System and method for streaming an enhanced digital video file |
Country Status (4)
Country | Link |
---|---|
EP (2) | EP1183870A1 (en) |
JP (2) | JP2003501968A (en) |
AU (3) | AU5321100A (en) |
WO (3) | WO2000076219A1 (en) |
Families Citing this family (22)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8464302B1 (en) | 1999-08-03 | 2013-06-11 | Videoshare, Llc | Method and system for sharing video with advertisements over a network |
US20020056123A1 (en) * | 2000-03-09 | 2002-05-09 | Gad Liwerant | Sharing a streaming video |
US6937814B1 (en) * | 2000-04-14 | 2005-08-30 | Realnetworks, Inc. | System and method for play while recording processing |
JP2002271766A (en) * | 2001-03-07 | 2002-09-20 | Sony Corp | Information processing unit and method, recording medium, and program |
JP2002271768A (en) * | 2001-03-07 | 2002-09-20 | Sony Corp | Information processing unit and method, recording medium, and program |
EP1349061A1 (en) * | 2002-03-27 | 2003-10-01 | Hewlett-Packard Company | Server based hardware control for internet applications |
WO2007045051A1 (en) | 2005-10-21 | 2007-04-26 | Honeywell Limited | An authorisation system and a method of authorisation |
WO2008144803A1 (en) | 2007-05-28 | 2008-12-04 | Honeywell International Inc | Systems and methods for configuring access control devices |
WO2008144804A1 (en) | 2007-05-28 | 2008-12-04 | Honeywell International Inc | Systems and methods for commissioning access control devices |
WO2010039598A2 (en) | 2008-09-30 | 2010-04-08 | Honeywell International Inc. | Systems and methods for interacting with access control devices |
WO2010099575A1 (en) | 2009-03-04 | 2010-09-10 | Honeywell International Inc. | Systems and methods for managing video data |
WO2010106474A1 (en) | 2009-03-19 | 2010-09-23 | Honeywell International Inc. | Systems and methods for managing access control devices |
EP2452489B1 (en) * | 2009-07-08 | 2020-06-17 | Honeywell International Inc. | Systems and methods for managing video data |
US9280365B2 (en) | 2009-12-17 | 2016-03-08 | Honeywell International Inc. | Systems and methods for managing configuration data at disconnected remote devices |
US8707414B2 (en) | 2010-01-07 | 2014-04-22 | Honeywell International Inc. | Systems and methods for location aware access control management |
US8787725B2 (en) | 2010-11-11 | 2014-07-22 | Honeywell International Inc. | Systems and methods for managing video data |
US9894261B2 (en) | 2011-06-24 | 2018-02-13 | Honeywell International Inc. | Systems and methods for presenting digital video management system information via a user-customizable hierarchical tree interface |
US10362273B2 (en) | 2011-08-05 | 2019-07-23 | Honeywell International Inc. | Systems and methods for managing video data |
WO2013020165A2 (en) | 2011-08-05 | 2013-02-14 | HONEYWELL INTERNATIONAL INC. Attn: Patent Services | Systems and methods for managing video data |
US9344684B2 (en) | 2011-08-05 | 2016-05-17 | Honeywell International Inc. | Systems and methods configured to enable content sharing between client terminals of a digital video management system |
CN103856543B (en) * | 2012-12-07 | 2019-02-15 | 腾讯科技(深圳)有限公司 | A kind of method for processing video frequency, mobile terminal and server |
US10523903B2 (en) | 2013-10-30 | 2019-12-31 | Honeywell International Inc. | Computer implemented systems frameworks and methods configured for enabling review of incident data |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5481275A (en) * | 1992-11-02 | 1996-01-02 | The 3Do Company | Resolution enhancement for video display using multi-line interpolation |
US5621660A (en) * | 1995-04-18 | 1997-04-15 | Sun Microsystems, Inc. | Software-based encoder for a software-implemented end-to-end scalable video delivery system |
KR19990072122A (en) * | 1995-12-12 | 1999-09-27 | 바자니 크레이그 에스 | Method and apparatus for real-time image transmission |
US6011537A (en) * | 1997-01-27 | 2000-01-04 | Slotznick; Benjamin | System for delivering and simultaneously displaying primary and secondary information, and for displaying only the secondary information during interstitial space |
- 2000-06-02: EP application EP00938126A filed, published as EP1183870A1; status: withdrawn
- 2000-06-02: AU application AU53211/00A filed, published as AU5321100A; status: abandoned
- 2000-06-02: JP application JP2001502364A filed, published as JP2003501968A; status: pending
- 2000-06-02: WO application PCT/US2000/015406 filed, published as WO2000076219A1; status: application filing
- 2000-06-02: EP application EP00944619A filed, published as EP1188318A1; status: withdrawn
- 2000-06-02: WO application PCT/US2000/015408 filed, published as WO2000076220A1; status: search and examination
- 2000-06-02: WO application PCT/US2000/015405 filed, published as WO2000076218A1; status: application discontinuation
- 2000-06-02: AU application AU58689/00A filed, published as AU5868900A; status: abandoned
- 2000-06-02: JP application JP2001502362A filed, published as JP2003533066A; status: pending
- 2000-06-02: AU application AU53210/00A filed, published as AU5321000A; status: abandoned
Non-Patent Citations (1)
Title |
---|
See references of WO0076218A1 * |
Also Published As
Publication number | Publication date |
---|---|
WO2000076220A1 (en) | 2000-12-14 |
EP1183870A1 (en) | 2002-03-06 |
WO2000076219A1 (en) | 2000-12-14 |
AU5321000A (en) | 2000-12-28 |
JP2003501968A (en) | 2003-01-14 |
JP2003533066A (en) | 2003-11-05 |
AU5868900A (en) | 2000-12-28 |
AU5321100A (en) | 2000-12-28 |
WO2000076218A1 (en) | 2000-12-14 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2000076218A1 (en) | System and method for providing an enhanced digital video file | |
JP3495767B2 (en) | Digital video editing apparatus and method | |
US20030185301A1 (en) | Video appliance | |
US7720352B2 (en) | Systems and computer program products to facilitate efficient transmission and playback of digital information | |
US6801968B2 (en) | Streaming-media input port | |
US7532231B2 (en) | Video conference recorder | |
US20020154691A1 (en) | System and process for compression, multiplexing, and real-time low-latency playback of networked audio/video bit streams | |
US20030156649A1 (en) | Video and/or audio processing | |
US20070160142A1 (en) | Camera and/or Camera Converter | |
US6580756B1 (en) | Data transmission method, data transmission system, data receiving method, and data receiving apparatus | |
WO2002100112A1 (en) | System and method for rapid video compression | |
US10945004B2 (en) | High-quality, reduced data rate streaming video production and monitoring system | |
CA2496782C (en) | Audio visual media encoding system | |
KR20020081519A (en) | Method streaming moving picture video on demand | |
US20050039211A1 (en) | High-quality, reduced data rate streaming video production and monitoring system | |
JPH0865663A (en) | Digital image information processor | |
JP2004507958A (en) | Dynamic quality adjustment based on changes in streaming constraints | |
US20040196377A1 (en) | Data recording in communications system | |
US6128435A (en) | Good quality video for the internet at very low bandwidth | |
JP2004349743A (en) | Video stream switching system, method, and video image monitoring and video image distribution system including video stream switching system | |
Haskell et al. | Introduction to digital multimedia, compression, and mpeg-2 | |
Hergert | Video Technologies for the Web | |
WO2000076221A1 (en) | System and method for video playback over a network | |
JP2004328796A (en) | Digital image information processing apparatus and digital image information processing method | |
JP2005237020A (en) | Digital video processing apparatus and method |
Legal Events
- PUAI: Public reference made under article 153(3) EPC to a published international application that has entered the European phase (Free format text: ORIGINAL CODE: 0009012)
- 17P: Request for examination filed (Effective date: 20011213)
- AK: Designated contracting states (Kind code of ref document: A1; Designated state(s): AT BE CH CY DE DK ES FI FR GB GR IE IT LI LU MC NL PT)
- AX: Request for extension of the European patent (Free format text: AL; LT; LV; MK; RO; SI)
- 17Q: First examination report despatched (Effective date: 20021202)
- STAA: Information on the status of an EP patent application or granted EP patent (Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN)
- 18D: Application deemed to be withdrawn (Effective date: 20060102)