AU2006202296A1 - Digital video storage - Google Patents

Digital video storage

Info

Publication number
AU2006202296A1
Authority
AU
Australia
Prior art keywords
video
nvs
segment
storage space
video data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
AU2006202296A
Inventor
Andrew Kisliakov
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Canon Inc
Original Assignee
Canon Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Canon Inc filed Critical Canon Inc
Priority to AU2006202296A priority Critical patent/AU2006202296A1/en
Publication of AU2006202296A1 publication Critical patent/AU2006202296A1/en
Abandoned legal-status Critical Current

Landscapes

  • Television Signal Processing For Recording (AREA)

Description

S&F Ref: 743166

AUSTRALIA

PATENTS ACT 1990 COMPLETE SPECIFICATION FOR A STANDARD PATENT

Name and Address of Applicant: Canon Kabushiki Kaisha, of 30-2, Shimomaruko 3-chome, Ohta-ku, Tokyo, 146, Japan
Actual Inventor(s): Andrew Kisliakov
Address for Service: Spruson & Ferguson, St Martins Tower, Level 31 Market Street, Sydney NSW 2000 (CCN 3710000177)
Invention Title: Digital video storage

The following statement is a full description of this invention, including the best method of performing it known to me/us:

DIGITAL VIDEO STORAGE

Field of the Invention

The present invention relates to a method, apparatus and system for writing video frames represented by video data sourced from a video source, to a storage space. The present invention also relates to a computer program product including a computer readable medium having recorded thereon a computer program for writing video frames represented by video data sourced from a video source, to a storage space.
Background

There are many products in the marketplace that perform networked digital video recording. Many of these products may be built from readily available components, including a Transmission Control Protocol/Internet Protocol (TCP/IP) network, one or more digital video cameras and one or more storage servers. The storage servers are computer systems running software that receives images from the video cameras and stores the data on a storage medium such as a hard disk drive. Video data captured by the cameras is typically stored in the storage servers within video files containing temporally contiguous data corresponding to a single stream of video data. That is, each video file typically stores video data captured by a single camera. Typically, the storage servers periodically decide to close a currently active video file and begin writing video data to a new video file. This decision may be made when the size of a video file exceeds a certain limit, or when the duration of the video data (or media) contained within the video file exceeds a certain time.
Because the above storage servers typically operate with a finite amount of storage space, each storage server must have a strategy to ensure that enough free storage space is available on the storage server to facilitate continuous storage of new data. A simple and somewhat effective strategy is to delete the oldest available video file whenever the free storage space goes below a threshold value. However, such a strategy suffers from the disadvantage that a lot more video data may be deleted in a single operation than is strictly necessary.
Another strategy for ensuring that enough free storage space is available to facilitate continuous storage of new data may be to open an existing video file and reduce the quality of some of the video data contained therein. Video quality may be reduced by increasing the compression factor of individual video frames represented by the video data, by reducing the resolution of individual video frames (i.e., compressing the video data), or by deleting certain video frames in order to reduce the frame rate of the video data. When any of these methods are applied to a standard video file, hard disk fragmentation occurs, and large gaps may appear within the video file that cannot be reclaimed as free storage space by an associated file system. The storage server storing such a video file may defragment the video file after executing the above methods, but the process of defragmentation may be slow and consume precious CPU resources.
Thus, a need clearly exists for an improved method, apparatus and system for writing video frames represented by video data, to a storage space.
Summary

It is an object of the present invention to substantially overcome, or at least ameliorate, one or more disadvantages of existing arrangements.
According to one aspect of the present invention there is provided a method of writing video frames represented by video data sourced from a video source, to a storage space, said method comprising the steps of: assigning a priority value to one of the video frames for use in subsequent degradation of the video data; and writing the video frame to a segment of the storage space, the segment being selected based on the assigned priority value.
According to another aspect of the present invention there is provided a system for writing video frames represented by video data sourced from a video source, to a storage space, said system comprising: an assigning module for assigning a priority to one of the video frames for use in subsequent degradation of the video data; a writing module for writing the video frame to a segment of the storage space, the segment being selected based on the assigned priority.
Other aspects of the invention are also disclosed.
Brief Description of the Drawings

Some aspects of the prior art and one or more embodiments of the present invention will now be described with reference to the drawings and appendices, in which: Fig. 1 is a schematic diagram of a digital video recording system upon which embodiments described may be practiced; Fig. 2 is a schematic block diagram of a general-purpose computer upon which a networked video recording (NVR) server described herein may be practiced; Fig. 3 is a flow diagram showing a method of recording video frames represented by video data sourced from a video source; Fig. 4 shows the layout of a video file created in accordance with the preferred embodiment; Fig. 5 shows a method of writing a video frame to a video file as executed in the method of Fig. 3; Fig. 6 shows a method of freeing storage space on the hard disk drive of the NVR server of Fig. 2; and Fig. 7 is a flow diagram showing a method of writing video frames represented by video data sourced from a video source, to a storage space.
Detailed Description including Best Mode

Where reference is made in any one or more of the accompanying drawings to steps and/or features, which have the same reference numerals, those steps and/or features have for the purposes of this description the same function(s) or operation(s), unless the contrary intention appears. It is to be noted that the discussions contained in the "Background" section and that above relating to prior art arrangements relate to discussions of documents or devices which form public knowledge through their respective publication and/or use.
Such should not be interpreted as a representation by the present inventor(s) or patent applicant that such documents or devices in any way form part of the common general knowledge in the art.
For ease of explanation the following description has been divided into Sections 1.0 to 4.0, each section including associated sub-sections.
1.0 DIGITAL VIDEO DATA RECORDING SYSTEM OVERVIEW

A method 500 (see Fig. 5) of writing a video frame represented by video data sourced from a video source in the form of a camera server 109-111 (see Fig. 1), to a storage space in the form of a video file, is described below with reference to Figs. 1 to 6. In one embodiment, the method 500 may be implemented within a digital video recording system 100, such as that shown in Fig. 1. As used below, the term "recording" when used in relation to video data refers to "accessing and storing" the video data.
The digital video recording system 100 comprises video cameras 112, 113, 114 and 115 connected to a computer network 220, via an associated camera server 109, 110 and 111. The computer network 220 may be implemented using a TCP/IP protocol stack and the size of the computer network 220 may range from a local area network (LAN) comprising a small number of nodes to a wide area network (WAN) such as the Internet.
One or more of the cameras 112-115 may be configured within an associated camera server 109-111, such that the camera and camera server are a single unit.
In accordance with the described embodiment, each of the camera servers 109-111 forms the video source. Alternatively, if the camera and camera server are a single unit, the single unit forms the video source. Still further, in another embodiment, one of the cameras 112-115 may have a processor and memory configured therein such that the camera forms the video source.
The video data is output by the camera servers 109-111 as a stream of video data.
The stream of video data may represent one or more frames of video data where each frame of video data (or video frame) represents a digital image. The camera servers 109- 111 optionally comprise sensor inputs to which sensors may be connected. If a sensor connected to a camera server 109-111 is activated, then the camera server 109-111 may be configured to generate an event notification.
The system 100 also comprises Network Video Recording (NVR) servers 200A, 200B, 200C and 200D, which may be used for monitoring the output of video data from any one of the camera servers 109-111 and for processing and storing the video data as one or more video frames. The NVR servers 200A, 200B, 200C and 200D may also be used for accessing video data, for event handling and for control of the video recording system 100. The NVR servers 200A to 200D will hereinafter be generically referred to as the NVR server 200, excepting where explicitly distinguished.
The video data captured by one or more of the cameras 112-115 may be uploaded by one of the associated camera servers 109-111, via the computer network 220, from any one of the camera servers 109-111 to the NVR server 200. The NVR server 200 may communicate with the camera servers 109-111 using a communications protocol such as the HTTP-based Webview Livescope™ protocol used by Canon™ video cameras. The video data may be received by the NVR server 200 in Motion-JPEG image format, where each video frame represented by the video data is a JPEG image and may be viewed independently of any other frames of video data. The video data may be processed by the NVR server 200 and stored in free storage space of the hard disk drive 210 of the NVR server 200 (see Fig. 2) as one or more video frames, so that the video data may be viewed by a user using a display 214 (see Fig. 2) of the NVR server 200.
As seen in Fig. 2, the NVR server 200 may be implemented using a computer system, such as that shown in Fig. 2, comprising a computer module 201, input devices such as a keyboard 202 and a mouse pointer device 203, and output devices including a printer 215, a display device 214 and loudspeakers 217. An external Modulator-Demodulator (Modem) transceiver device 216 may be used by the computer module 201 for communicating to and from the computer network 220 via a connection 221. The network 220 may be a wide-area network (WAN), such as the Internet or a private WAN.
Where the connection 221 is a telephone line, the modem 216 may be a traditional "dial-up" modem. Alternatively, where the connection 221 is a high capacity (eg: cable) connection, the modem 216 may be a broadband modem. A wireless modem may also be used for wireless connection to the network 220.
The computer module 201 typically includes at least one processor unit 205, and a memory unit 206 for example formed from semiconductor random access memory (RAM) and read only memory (ROM). The module 201 also includes a number of input/output interfaces including an audio-video interface 207 that couples to the video display 214 and loudspeakers 217, an I/O interface 213 for the keyboard 202 and mouse 203, and an interface 208 for the external modem 216 and printer 215. In some implementations, the modem 216 may be incorporated within the computer module 201, for example within the interface 208. The computer module 201 also has a local network interface 211 which, via a connection 223, permits coupling of the computer module 201 to a local computer network 222, known as a Local Area Network (LAN). As also illustrated, the local computer network 222 may also couple to the wide area network 220 via a connection 224, which would typically include a so-called "firewall" device or similar functionality. The interface 211 may be formed by an Ethernet™ circuit card, a wireless Bluetooth™ or an IEEE 802.11 wireless arrangement.
The interfaces 208 and 213 may afford both serial and parallel connectivity, the former typically being implemented according to the Universal Serial Bus (USB) standards and having corresponding USB connectors (not illustrated). Storage devices 209 are provided and typically include a hard disk drive (HDD) 210. Each of these storage devices 209, including the hard disk drive 210, has a finite amount of storage space configured thereon. Other devices such as a floppy disk drive and a magnetic tape drive (not illustrated) may also be used. An optical disk drive 212 is typically provided to act as a non-volatile source of data. Portable memory devices, such as optical disks (eg: CD-ROM, DVD), USB-RAM, and floppy disks for example may then be used as appropriate sources of data to the computer system 200.
The components 205 to 213 of the computer module 201 typically communicate via an interconnected bus 204 and in a manner which results in a conventional mode of operation of the computer module 201 known to those in the relevant art. Examples of computers on which the described arrangements can be practised include IBM-PCs and compatibles, Sun Sparcstations, Apple Mac™ or alike computer systems evolved therefrom.
The methods described herein may be implemented as software, such as one or more application programs executable within the computer module 201. In particular, the steps of the described methods may be effected by instructions in the software that are carried out within the computer module 201. The instructions may be formed as one or more code modules, each for performing one or more particular tasks. The software may also be divided into two separate parts, in which a first part and the corresponding code modules performs the described methods and a second part and the corresponding code modules manage a user interface between the first part and the user. The software may be stored in a computer readable medium, including the storage devices described below, for example.
The software may be loaded into the computer module 201 from the computer readable medium, and then be executed by the computer system 201. A computer readable medium having such software or computer program recorded on it is a computer program product.
The use of the computer program product in the computer module 201 preferably effects an advantageous apparatus for implementing the described methods.
Typically, the application programs discussed above are resident on the hard disk drive 210 and read and controlled in execution by the processor 205. Intermediate storage of such programs and any data fetched from the networks 220 and 222 may be accomplished using the semiconductor memory 206, possibly in concert with the hard disk drive 210. In some instances, the application programs may be supplied to the user encoded on one or more CD-ROM and read via the corresponding drive 212, or alternatively may be read by the user from the networks 220 or 222. Still further, the software can also be loaded into the computer module 201 from other computer readable media. Computer readable media refers to any storage medium that participates in providing instructions and/or data to the computer module 201 for execution and/or processing. Examples of such media include floppy disks, magnetic tape, CD-ROM, a hard disk drive, a ROM or integrated circuit, a magneto-optical disk, or a computer readable card such as a PCMCIA card and the like, whether or not such devices are internal or external of the computer module 201. Examples of computer readable transmission media that may also participate in the provision of instructions and/or data include radio or infra-red transmission channels as well as a network connection to another computer or networked device, and the Internet or Intranets including e-mail transmissions and information recorded on Websites and the like.
The second part of the application programs and the corresponding code modules mentioned above may be executed to implement one or more graphical user interfaces (GUIs) to be rendered or otherwise represented upon the display 214. Through manipulation of the keyboard 202 and the mouse 203, a user of the computer module 201 and the application may manipulate the interface to provide controlling commands and/or input to the applications associated with the GUI(s).
The camera servers 109-111 have a similar configuration to the computer module 201. The camera servers 109-111 include a memory (e.g., memory 206) and a processor (e.g., a processor 205). However, the hardware configuration of the camera servers 109-111 will not be explained in further detail herein.
As described above, the NVR server 200 may be used for monitoring and handling events from sensors (e.g., sensors attached to the camera servers 109-111). One of these events may include motion detection using a motion detector (not shown) connected to one or more of the camera servers 109-111 directly. Further events include heat/smoke detection using a heat/smoke detector, and a door opening/closing using a limit switch, for example.
The methods described herein may alternatively be implemented in dedicated hardware such as one or more integrated circuits performing the functions or sub functions of the described methods. Such dedicated hardware may include graphic processors, digital signal processors, or one or more microprocessors and associated memories.
2.0 The Networked Video Recording Server

As described above, the system 100 also comprises the NVR server 200, which may be used for monitoring the output of video data from any one of the camera servers 109-111 and for processing and storing the video data as one or more digital video frames.
The NVR server 200 may also be used for accessing video data, for event handling and for control of the video recording system 100.
2.1 Video processing and recording

Each video frame received by the NVR server 200 from a camera server 109-111 follows a pre-defined path of processing. A method 300 of recording video frames represented by video data sourced from a video source in the form of one of the camera servers 109-111, will now be described with reference to Fig. 3. Each of the steps in the method 300 is described below in a subsequent sub-section.
The method 300 may be implemented as software resident on the hard disk drive 210 and being controlled in its execution by the processor 205. The steps of the method 300 are typically implemented in separate threads in order to prevent delays in one step of the method 300 from holding up processing of other digital video frames at other steps in the method 300.
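The text does not detail how the per-step threading is arranged; the following minimal Python sketch shows one conventional way to decouple the receive, analyse, prioritise and record steps with queues so that a slow step (for example, a disk write) does not hold up earlier steps. The names, the placeholder worker functions and the frame dictionaries are illustrative assumptions, not part of the described NVR server.

    import queue
    import threading

    def stage(worker, in_q, out_q):
        """Run worker on every item from in_q, forwarding results to out_q."""
        def loop():
            while True:
                item = in_q.get()
                if item is None:              # sentinel: shut this stage down
                    if out_q is not None:
                        out_q.put(None)
                    break
                result = worker(item)
                if out_q is not None:
                    out_q.put(result)
        t = threading.Thread(target=loop)
        t.start()
        return t

    # Queues decouple the steps so that a slow disk write does not delay
    # reception of the next frame from the camera server.
    received, analysed, prioritised = queue.Queue(), queue.Queue(), queue.Queue()

    threads = [
        stage(lambda f: {**f, "motion": 0.5}, received, analysed),            # analyse (step 303)
        stage(lambda f: {**f, "priority": 0}, analysed, prioritised),         # prioritise (step 305)
        stage(lambda f: print("recorded frame", f["n"]), prioritised, None),  # record (step 307)
    ]

    for n in range(3):                        # stand-in for frames arriving at step 301
        received.put({"n": n, "jpeg": b"..."})
    received.put(None)                        # shut the pipeline down
    for t in threads:
        t.join()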
2.1.1 Receive Frame

The method 300 begins at step 301, where the processor 205 of the NVR server 200 performs the step of receiving video data representing a video frame from a video source in the form of a camera server 109-111, via the network 220. Upon receiving the video frame, the NVR server 200 allocates a buffer in the memory unit 206 large enough to store the entire video frame. The NVR Server 200 then fills the allocated buffer with a portion of the video data representing the video frame received from the network 220.
2.1.2 Analyse Video Frame

At the next step 303, the processor 205 of the NVR server 200 performs the step of analysing the video frame received at step 301. The NVR Server 200 may be configured to use a combination of different analysis methods at step 303 to determine certain metrics regarding the video frame. The video frame analysis methods used at step 303 may be configured according to a schedule, which states which video frame analysis methods are active at any given time.
The metrics resulting from the video frame analysis may be stored as real values within the range [0.0, 1.0]. One video frame analysis method is motion detection, which may be performed using a motion detection method that evaluates the difference between two successive video frames. In this instance, each video frame is assigned a motion value which represents the difference between the video frame and a previous video frame.
Another video frame analysis method is frame quality analysis, which is performed using a frame quality method that considers internal video frame qualities such as image sharpness. Each video frame is assigned a quality value obtained using this frame quality method.
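The text leaves the exact motion and quality formulas open; the sketch below shows one plausible way to produce both metrics as real values in [0.0, 1.0], using mean absolute pixel differences as a stand-in for real motion detection and sharpness analysis. The frame representation (flat lists of 8-bit samples) and the function names are assumptions for illustration only.

    def motion_value(frame, prev_frame):
        """Mean absolute pixel difference with the previous frame, scaled to [0.0, 1.0]."""
        if prev_frame is None:
            return 0.0
        diff = sum(abs(a - b) for a, b in zip(frame, prev_frame))
        return min(1.0, diff / (255.0 * len(frame)))

    def quality_value(frame):
        """Crude sharpness estimate: mean absolute difference between neighbouring samples."""
        if len(frame) < 2:
            return 0.0
        grad = sum(abs(frame[i + 1] - frame[i]) for i in range(len(frame) - 1))
        return min(1.0, grad / (255.0 * (len(frame) - 1)))

    # Frames are flat lists of 8-bit luma samples here, purely for illustration.
    prev, cur = [10, 10, 10, 10], [10, 60, 10, 200]
    print(motion_value(cur, prev), quality_value(cur))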
The above list of video frame analysis methods is not exhaustive, as there are many other methods of analysing video frames. The NVR server 200 may be extended to support customised video frame analysis methods through additional software modules which may be executed by the processor 205 as necessary.
In one embodiment, a camera server 109-111 may perform the video frame analysis at step 303. If so, the NVR server 200 may use the results determined by the camera server 109-111 instead of the NVR server 200 performing the analysis.
2.1.3 Prioritise Frame

The method 300 continues at the next step 305, where, before recording the video frame received at step 301, the processor 205 of the NVR server 200 performs the step of assigning a priority value to the video frame for use in subsequent degradation of the video data representing the video frame. As described above, the method 300 is preferably implemented as software resident on the hard disk drive 210 and being controlled in its execution by the processor 205. In particular, step 305 may be implemented as software in the form of an "assigning module". The priority value is always an integral value, but the range of priority values may be variable.
The NVR server 200 may be configured to use a combination of different prioritisation methods to assign a final priority value to the video frame. If more than one method is used, the results obtained using each method may be weighted and then added together to obtain a single priority value.
As will be described in detail below, priority ranges, prioritisation methods and weightings used to assign the priority value to the video frame at step 305 may be configured in a similar manner to the video frame analysis methods described above.
The following sections describe some prioritisation methods which may be used to assign a priority value to the video frame at step 305. All equations described below use a value R to represent the range of available priority values, a value n to represent the number of a video frame as received by the NVR server 200 from the camera server 109-111, and a value p_n to represent the priority value assigned to the nth video frame.
2.1.3.1 Round-robin based priority assignment

One method which may be used at step 305 for assigning a priority value to the video frame received at step 301 is referred to as the round-robin based priority assignment method. The round-robin based priority assignment method uses Equation 1, described below, to assign a priority value to the nth video frame received by the NVR server 200.
p_n = n mod R

Equation 1: Round-robin priority assignment

Equation 1 provides an even distribution of video frame priority values to the video frames received by the NVR server 200. However, as will be described below, when video frames having priority values assigned to them using the round-robin priority assignment method are later deleted in order of priority, the remaining video frames are clustered together with gaps of increasing size between clusters, resulting in an uneven frame rate.
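A minimal sketch of Equation 1 follows, with R = 4 chosen arbitrarily for illustration; the second print shows the clustering effect described above when the lowest-priority frames are deleted first.

    R = 4

    def round_robin_priority(n):
        # Equation 1: p_n = n mod R
        return n % R

    frames = list(range(8))
    print([round_robin_priority(n) for n in frames])          # [0, 1, 2, 3, 0, 1, 2, 3]

    # Deleting the lowest priorities first (as the disk management method below does)
    # leaves clusters: dropping priorities 0 and 1 keeps frames 2, 3, 6 and 7.
    print([n for n in frames if round_robin_priority(n) >= 2])   # [2, 3, 6, 7]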
2.1.3.2 Round-robin based priority assignment with look-up table

Another method which may be used at step 305 for assigning a priority value to the video frame received at step 301 is referred to as a "round-robin based priority assignment with look-up table" method. The round-robin based priority assignment with look-up table method uses Equation 2, below, to assign a priority value to the nth video frame received by the NVR server 200, where t_R represents a constant priority look-up table having a number of entries which is equal to R.
p_n = t_R[n mod R]

Equation 2: Round-robin priority assignment with look-up table

The values in each entry of the look-up table are within the range 0 to R − 1. The values in the look-up table may be allocated so as to maximise the distance between two successive priority values. This ensures that, as video frames are later deleted in order of priority, the resulting frame rate will remain smoother than using the round-robin priority assignment method of Section 2.1.3.1.
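A sketch of Equation 2 follows under the assumption of a small table for R = 4 that alternates low and high priorities; the patent does not give concrete table values. Deleting the lowest priorities now leaves evenly spaced frames, which is the smoothing effect described above.

    R = 4
    t_R = [0, 2, 1, 3]        # one plausible table; the text gives no concrete values

    def lut_priority(n):
        # Equation 2: p_n = t_R[n mod R]
        return t_R[n % R]

    frames = list(range(8))
    print([lut_priority(n) for n in frames])                  # [0, 2, 1, 3, 0, 2, 1, 3]

    # Dropping priorities 0 and 1 now keeps every second frame (1, 3, 5, 7),
    # so the surviving frame rate is more even than with plain round robin.
    print([n for n in frames if lut_priority(n) >= 2])        # [1, 3, 5, 7]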
2.1.3.3 Motion based priority assignment

Another method which may be used at step 305 for assigning a priority value to the video frame received at step 301 is referred to as the "motion detection" method. The motion detection method uses Equation 3, below, to assign a priority value p_n to the nth video frame received by the NVR server 200, where m_n represents the value obtained using the motion detection method.

p_n = floor(R · m_n) if m_n < 1
p_n = R − 1 if m_n = 1

Equation 3: Motion level based priority assignment

Since motion detection typically depends on surrounding frames, another prioritisation method for use in the method 300 may be configured to take into account the motion values of surrounding frames.
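A sketch of Equation 3 as reconstructed above, with R = 5 chosen arbitrarily; the quality based method of the next section (Equation 4) takes the same form with q_n in place of m_n.

    import math

    R = 5                      # arbitrary number of priority levels for illustration

    def motion_priority(m_n):
        # Equation 3: p_n = floor(R * m_n), with m_n = 1.0 clamped to R - 1
        return R - 1 if m_n >= 1.0 else math.floor(R * m_n)

    print([motion_priority(m) for m in (0.0, 0.19, 0.5, 0.99, 1.0)])   # [0, 0, 2, 4, 4]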
2.1.3.4 Quality based priority assignment

Another method which may be used at step 305 for assigning a priority to the video frame received at step 301 is referred to as the "quality analysis" method. The quality analysis method uses Equation 4, below, to assign a priority value p_n to the nth video frame received by the NVR server 200, where q_n represents the value obtained using the quality analysis method.

p_n = floor(R · q_n) if q_n < 1
p_n = R − 1 if q_n = 1

Equation 4: Quality based priority assignment

2.1.4 Record Frame

After determining a priority value for the video frame received at step 301, at the next step 307, the processor 205 of the NVR server 200 performs the step of recording the video frame to a storage space on the NVR server 200. In accordance with the preferred embodiment, the video frame is written to a video file stored on the hard disk drive 210 of the NVR server 200. In this embodiment, the video file forms the storage space.
However, in another embodiment the video frames may be stored in free storage space of the hard disk drive 210 without using a video file. In this instance, the hard disk drive 210 forms the storage space.
Separate video files may be used to record video frames represented by video data received from different video sources the camera servers 109-111). Video files are limited by size or time, and when the size or duration of a video file exceeds a given threshold, that video file is closed and a new video file is started. Information regarding the video source, start time and end time of the video frames received by the NVR server 200 in a given video file may be encoded in the video file name.
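The text does not specify the file-name encoding or the exact size and duration limits, so the following sketch simply illustrates the idea with made-up thresholds and a hypothetical naming scheme.

    from datetime import datetime

    MAX_BYTES = 256 * 1024 * 1024       # example thresholds, not taken from the text
    MAX_SECONDS = 15 * 60

    def video_file_name(source_id, start, end):
        # Encode source, start time and end time in the file name (one possible scheme).
        fmt = "%Y%m%d%H%M%S"
        return f"{source_id}_{start.strftime(fmt)}_{end.strftime(fmt)}.mov"

    def needs_rollover(current_size, start, now):
        # Close the current file once either the size or the duration limit is exceeded.
        return current_size > MAX_BYTES or (now - start).total_seconds() > MAX_SECONDS

    print(video_file_name("camera_server_109",
                          datetime(2006, 5, 30, 9, 0, 0),
                          datetime(2006, 5, 30, 9, 15, 0)))
    print(needs_rollover(300 * 1024 * 1024, datetime(2006, 5, 30, 9, 0, 0),
                         datetime(2006, 5, 30, 9, 10, 0)))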
In the preferred embodiment, the video file is created as a sparse video file in order to efficiently use the hard disk drive 210 of the NVR server 200. Since sparse video files are not supported by all file systems, the NVR server 200 creates video files on file systems known to support sparse video files. For example, NTFS (the standard file system used by Windows XP™) may be used for creating the video files.
The video file is preferably saved in a standard format that may be easily viewed on a number of platforms, such as Apple QuickTime™. The layout of a video file 400 created in accordance with the preferred embodiment is shown in Fig. 4. The video file 400 consists of two main sections: the index 401 and the media data section 403. The index 401 and the media data section 403 are interchangeable. The index 401 is located at the beginning of the video file 400 and a pre-determined number of bytes is reserved for the index 401. The media data section 403 occupies the remaining portion of the video file 400 and another pre-determined number of bytes are reserved for the media data section 403.
The media data section 403 of the video file 400 is divided into a plurality of segments (e.g., 405), each of which corresponds to a single priority value. Accordingly, prior to executing the method 300 of recording the video frame in the video file 400, the processor 205 may be configured to perform the step of dividing the video file 400 (i.e., the storage space) into a plurality of segments. Each segment contains a header (e.g., 407), which is a data structure that indicates the validity, priority, capacity and size of the associated segment. The free space of each segment may be calculated by subtracting the size of the used area of the segment from the capacity of the segment. Headers of the segments are updated whenever the segments are created, split or have frames written to a corresponding one of the segments.
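A sketch of the per-segment header as an in-memory structure follows; the field names and the dataclass form are assumptions, since the text names the fields (validity, priority, capacity, size) but not their on-disk layout.

    from dataclasses import dataclass

    @dataclass
    class SegmentHeader:
        valid: bool       # cleared when the segment is later deleted to reclaim space
        priority: int     # the single priority value this segment holds frames for
        capacity: int     # bytes reserved for the segment within the media data section
        used: int         # bytes already written into the segment

        @property
        def free(self) -> int:
            # Free space is the capacity minus the size of the used area.
            return self.capacity - self.used

    hdr = SegmentHeader(valid=True, priority=2, capacity=1_000_000, used=350_000)
    print(hdr.free)       # 650000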
When the video file is first created, the media data section 403 consists of a single, undefined segment. During step 307, the processor 205 performs the sub-step of writing the video frame received by the NVR server 200 from the camera server 109-111 at step 301 to a segment of a video file (i.e., a storage space). The segment is selected by the processor 205 based on the assigned priority value (i.e., the priority value assigned to the video frame at step 305).
A method 500 of writing video frames (e.g., the video frame received at step 301 of the method 300) represented by video data sourced from a video source in the form of one of the camera servers 109-111, to storage space in the form of a video file, as executed at step 307, will now be described with reference to Fig. 5. The method 500 may be implemented as software resident on the hard disk drive 210 and being controlled in its execution by the processor 205. In particular, the method 500 may be implemented as a software module in the form of a "writing module".
The method 500 begins at step 501, where if the processor 205 determines that the video frame received at step 301 is the first video frame to be written to the video file, then the method 500 proceeds to step 503. In this instance, the media data section 403 consists of a single undefined segment. Otherwise, the method 500 proceeds to step 509.
At step 503, the processor 205 performs the step of selecting the single segment within the video file. Then at the next step 505, the processor 205 performs the step of assigning the priority value of the video frame to the segment selected in step 503 by modifying the header of the selected segment. Following step 505, the method 500 proceeds to step 507.
At step 509, if the processor 205 determines that a segment exists within the video file that has the same priority value as the video frame and also has enough free space to contain the video frame, then the method 500 proceeds to step 511. Otherwise, the method 500 proceeds to step 513. At step 511, the processor 205 performs the step of selecting the segment discovered at step 509. Following step 511, the method 500 proceeds to step 507.
At step 513, the processor 205 performs the step of selecting the segment in the video file that has the largest amount of free storage area. Also at step 513, the processor 205 performs the sub-step of dividing the selected segment into a plurality (preferably two) of further segments, where each of the further segments has a free storage area.
The segment selected at step 513 is divided in such a fashion as to make the amount of free storage area in both segments as close to equal as is reasonable. The free storage area may not be equal if, for example, the file system of the NVR server 200 supports a minimum block size for sparse files, where it is more practical to align segments along block boundaries. In step 513, the processor 205 also performs the step of assigning the priority value of a second segment of the plurality of further segments (i.e., the segment resulting from the divided segment) to the priority value of the video frame by modifying the header of the second segment.
At the next step 515, if the processor 205 determines that the newly created second segment contains enough free storage area to store the video frame, then the method 500 proceeds to step 517. Otherwise, the method 500 proceeds to step 519. At step 517, the processor 205 performs the step of selecting the newly created second segment. Following step 517, the method 500 proceeds to step 507. At step 519, the processor 205 performs the step of closing the video file and creating a new video file. The method 500 then returns to step 501.
At step 507, the processor 205 performs the step of writing the video frame to the segment of the video file (i.e., the storage space) selected at step 503, the segment selected at step 511 or the segment selected at step 517. Accordingly, the processor 205 performs the step of writing the video frame to a segment of the video file (i.e., the storage space), where the segment is selected based on the assigned priority value (i.e., the priority value assigned to the video frame at step 305).
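The segment-selection logic of method 500 (steps 501 to 519) can be condensed as in the following sketch; segments are plain dictionaries here purely for illustration, and header updates, index maintenance and the creation of the new video file at step 519 are omitted.

    def free(seg):
        return seg["capacity"] - seg["used"]

    def select_segment(segments, priority, frame_size):
        # Steps 501-505: a brand-new file holds one undefined segment.
        if len(segments) == 1 and segments[0]["priority"] is None:
            segments[0]["priority"] = priority
            return segments[0]
        # Steps 509-511: reuse a segment with this priority and enough room.
        for seg in segments:
            if seg["priority"] == priority and free(seg) >= frame_size:
                return seg
        # Step 513: split the segment with the most free space roughly in half.
        largest = max(segments, key=free)
        give = free(largest) // 2
        second = {"priority": priority, "capacity": give, "used": 0}
        largest["capacity"] -= give
        segments.append(second)
        # Steps 515-519: use the new segment, or signal that a new file is needed.
        return second if free(second) >= frame_size else None

    segments = [{"priority": None, "capacity": 1_000_000, "used": 0}]
    seg = select_segment(segments, priority=2, frame_size=40_000)
    seg["used"] += 40_000                     # step 507: write the frame into the segment
    print(seg, len(segments))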
3.0 Disk management

The NVR server 200 continually monitors the available storage space on the hard disk drive 210. If the available storage space drops below a pre-defined threshold, the processor 205 of the NVR server 200 performs the step of deleting one or more video frames represented by video data from segments of one or more video files recorded on that hard disk drive 210. A method 600 of freeing storage space on the hard disk drive 210 will now be described below with reference to Fig. 6. The method 600 may be implemented as software resident on the hard disk drive 210 and being controlled in its execution by the processor 205.
The method 600 begins at step 601, where if the processor 205 determines that the free storage space on the hard disk drive 210 is below a pre-determined threshold value, then the method 600 proceeds to step 603. Otherwise, the method 600 concludes.
At step 603, the processor 205 performs the step of examining the last modification time of each video file stored on the hard disk drive 210 to determine an oldest video file.
The processor 205 then performs the step of opening the oldest video file and examining the segments contained therein. At the next step 605, if the processor 205 determines that the video file opened at step 603 contains only one segment, then the method 600 proceeds to step 607. Otherwise, the method 600 proceeds to step 609. At step 607, the processor 205 performs the step of deleting the video file opened at step 603. The method 600 then returns to step 601.
At step 609, the processor 205 performs the steps of examining all the segments contained in the video file opened at step 603 and determining a lowest priority value of all these segments. Then at the next step 611, the processor 205 performs the step of deleting all segments having an associated priority value of the same magnitude as the lowest priority value determined at step 609. Accordingly, by deleting the segments of the video file, one or more video frames are deleted from the video file so as to degrade the video data representing the video frames received by the NVR server 200 from the camera server 109-111. Alternatively, the processor 205 may perform the step of compressing one or more video frames by reducing the resolution of the video frames in a segment to degrade the video data received by the NVR server 200 from one of the camera servers 109-111.
When a segment is deleted, the contents of that deleted segment are set to zero, causing the file system of the NVR server 200 to recover all the storage space used up by the deleted segment. The header of the deleted segment is retained, but a flag is set in that header to indicate that the deleted segment is no longer valid. The index (e.g., the index 401) of the video file is also rebuilt in order to eliminate references to all video frames within the deleted segments. Following step 611, the method 600 returns to step 601.
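A rough sketch of steps 603 to 611 of method 600 follows; file-system operations are stubbed out, the data structures are illustrative only, and step 601's free-space check would wrap this in a loop.

    def reclaim(video_files):
        """video_files: list of dicts with 'mtime' and 'segments' (each with 'priority', 'valid')."""
        if not video_files:
            return
        oldest = min(video_files, key=lambda f: f["mtime"])          # step 603: oldest file
        live = [s for s in oldest["segments"] if s["valid"]]
        if len(live) <= 1:                                           # steps 605-607: one segment left
            video_files.remove(oldest)
            return
        lowest = min(s["priority"] for s in live)                    # step 609: lowest priority
        for seg in live:                                             # step 611: invalidate those segments
            if seg["priority"] == lowest:
                seg["valid"] = False    # zero the contents, keep the header, rebuild the index

    files = [{"mtime": 1, "segments": [{"priority": p, "valid": True} for p in (0, 1, 2)]},
             {"mtime": 2, "segments": [{"priority": 0, "valid": True}]}]
    reclaim(files)
    print(files[0]["segments"])   # the priority-0 segment of the oldest file is now invalid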
The number of priority values and, hence, segments used within a video file may vary. When selecting the number of segments used, the expected frame rate of the video file is considered, as well as resulting frame rates once one or more segments have been deleted. For example, if the video data received by the NVR server 200 from one of the camera servers 109-111 is to be recorded at twenty five (25) frames per second and the frame rate of the generated video file is to be no less than five (5) frames per second, then a suitable value for the number of segments in the video file is five (5). However, if the video data received by the NVR server 200 from one of the camera servers 109-111 is to be recorded at twenty-four (24) frames per second and the frame rate of any generated video file is to be no less than six (6) frames per second, a suitable value for the number of segments in the video file is four (4).

4.0 Prioritisation Methods

Each of the methods for assigning a priority value to the video frames described in section 2.1.3 has its own advantages and disadvantages.
The round robin method described in section 2.1.3.1 works well with values of R of one (1), two (2) or three (3). However, with a larger number of segments, as segments are deleted from the video file, the remaining frames are temporally clustered with increasing gaps between the clusters.
The round robin method with look-up table described in section 2.1.3.2 attempts to solve the problem described in the above paragraph. However, a look-up table must be generated for each conceivable value of R used by the NVR server 200 in assigning priority values to the video frames.
The motion and quality based methods described in sections 2.1.3.3 and 2.1.3.4 do not ensure an even distribution of video frames within a video file. While this may be desirable, an uneven distribution of video frames may have the effect of deleting entire blocks of video data while leaving other blocks unchanged. The motion and quality based methods may be combined with one of the round robin methods in order to reduce such problems.
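One plausible reading of such a combination, following the weighted-sum description in Section 2.1.3, is sketched below; the weights and the particular pairing of round robin with the motion based method are assumptions for illustration.

    R = 5

    def combined_priority(n, m_n, weights=(0.5, 0.5)):
        # Weighted sum of a round-robin priority (Section 2.1.3.1) and a
        # motion based priority (Section 2.1.3.3), rounded back to an integer.
        round_robin = n % R
        motion = R - 1 if m_n >= 1.0 else int(R * m_n)
        p = weights[0] * round_robin + weights[1] * motion
        return min(R - 1, int(round(p)))

    print([combined_priority(n, m) for n, m in [(0, 0.9), (1, 0.1), (2, 0.6)]])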
A method 700 of writing video frames represented by video data sourced from a video source formed by one of the camera servers 109-111, to a storage space formed by a video file, will now be described with reference to Fig. 7. The method 700 may be implemented as software resident on the hard disk drive 210 and being controlled in its execution by the processor 205.
The method 700 begins at the first step 701, where the processor 205 performs the step of assigning a priority value to one of the video frames for use in subsequent degradation of the video data. The priority value is assigned to the video frame at step 701 in a similar manner to step 305 described above, using any one of the priority assignment methods (or prioritisation methods) described above in Sections 2.1.3.1 to 2.1.3.4.
At the next step 703, the processor 205 performs the step of writing the video frame to a segment of the storage space, the segment being selected based on the assigned priority value. Step 703 may be implemented in a similar manner to the method 500 as described above.
Industrial Applicability

It is apparent from the above that the arrangements described are applicable to the computer and data processing industries.
The foregoing describes only some embodiments of the present invention, and modifications and/or changes can be made thereto without departing from the scope and spirit of the invention, the embodiments being illustrative and not restrictive. For example, in an alternative embodiment, the NVR server 200 maintains each segment in a separate video file. In this instance, the index (e.g., the index 401) is stored in a separate file, while each video file containing video data representing video frames may contain an independent index that allows navigation within that video file. Such an embodiment may be suitable when using file systems that do not support sparse files, and may exist on the same NVR server, if the NVR server itself is required to support different kinds of file systems.
In the context of this specification, the word "comprising" means "including principally but not necessarily solely" or "having" or "including", and not "consisting only of". Variations of the word "comprising", such as "comprise" and "comprises", have correspondingly varied meanings.

Claims (8)

  2. The method of claim 1, further comprising the step of dividing the storage space into a plurality of segments.
  3. The method of claim 2, further comprising the steps of: selecting one of the segments; and dividing the selected segment into a plurality of further segments, each of said plurality of further segments having a free storage area.
  4. The method of claim 1, further comprising the step of deleting a video frame from the segment so as to degrade the video data.

  5. The method of claim 1, further comprising the step of compressing one or more video frames in the segment to degrade the video data.
  6. The method of claim 1, wherein the video source is a camera.
  7. The method of claim 1, wherein the video source is a camera server.
  8. The method of claim 1, wherein the storage space is a video file.
  9. A system for writing video frames represented by video data sourced from a video source, to a storage space, said system comprising: an assigning module for assigning a priority to one of the video frames for use in subsequent degradation of the video data; a writing module for writing the video frame to a segment of the storage space, the segment being selected based on the assigned priority.

  10. A method of writing video frames represented by video data sourced from a video source, to a storage space, said method being substantially as herein before described with reference to any one of the embodiments as the embodiment is shown in the accompanying embodiments.
  11. A system for writing video frames represented by video data sourced from a video source, to a storage space, said system being substantially as herein before described with reference to any one of the embodiments as the embodiment is shown in the accompanying embodiments.

DATED this Twenty Ninth Day of May 2006
CANON KABUSHIKI KAISHA
Patent Attorneys for the Applicant
SPRUSON & FERGUSON
AU2006202296A 2006-05-30 2006-05-30 Digital video storage Abandoned AU2006202296A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
AU2006202296A AU2006202296A1 (en) 2006-05-30 2006-05-30 Digital video storage

Publications (1)

Publication Number Publication Date
AU2006202296A1 true AU2006202296A1 (en) 2007-12-20

Family

ID=38835143

Family Applications (1)

Application Number Title Priority Date Filing Date
AU2006202296A Abandoned AU2006202296A1 (en) 2006-05-30 2006-05-30 Digital video storage

Country Status (1)

Country Link
AU (1) AU2006202296A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111488772A (en) * 2019-01-29 2020-08-04 杭州海康威视数字技术股份有限公司 Method and apparatus for smoke detection
CN111488772B (en) * 2019-01-29 2023-09-22 杭州海康威视数字技术股份有限公司 Method and device for detecting smoke

Legal Events

Date Code Title Description
MK1 Application lapsed section 142(2)(a) - no request for examination in relevant period