WO2004068855A1 - MPEG adaptive motion digital video security system (SCSS) - Google Patents

MPEG adaptive motion digital video security system (SCSS)

Info

Publication number
WO2004068855A1
WO2004068855A1 (PCT/US2003/003076; published as WO 2004/068855 A1)
Authority
WO
WIPO (PCT)
Prior art keywords
video
data
security
file
request
Prior art date
Application number
PCT/US2003/003076
Other languages
English (en)
Inventor
Mark C. Koz
Stephen G. Haigh
William B. Brown
Original Assignee
Futuretel Digital Imaging, Llc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Futuretel Digital Imaging, Llc filed Critical Futuretel Digital Imaging, Llc
Priority to PCT/US2003/003076 priority Critical patent/WO2004068855A1/fr
Priority to AU2003210799A priority patent/AU2003210799A1/en
Publication of WO2004068855A1 publication Critical patent/WO2004068855A1/fr


Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/231Content storage operation, e.g. caching movies for short term storage, replicating data over plural servers, prioritizing data for deletion
    • GPHYSICS
    • G08SIGNALLING
    • G08BSIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B13/00Burglar, theft or intruder alarms
    • G08B13/18Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
    • G08B13/189Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
    • G08B13/194Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
    • G08B13/196Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
    • G08B13/19602Image analysis to detect motion of the intruder, e.g. by frame subtraction
    • G08B13/19613Recognition of a predetermined image pattern or behaviour pattern indicating theft or intrusion
    • G08B13/19654Details concerning communication with a camera
    • G08B13/19656Network used to communicate with a camera, e.g. WAN, LAN, Internet
    • G08B13/19665Details related to the storage of video surveillance data
    • G08B13/19667Details related to data compression, encryption or encoding, e.g. resolution modes for reducing data volume to lower transmission bandwidth or memory requirements
    • G08B13/19669Event triggers storage or change of storage policy
    • G08B13/19671Addition of non-video data, i.e. metadata, to video stream
    • G08B13/19673Addition of time stamp, i.e. time metadata, to video stream
    • G08B13/19678User interface
    • G08B13/19682Graphic User Interface [GUI] presenting system data to the user, e.g. information on a screen helping a user interacting with an alarm system
    • G08B13/19689Remote control of cameras, e.g. remote orientation or image zooming control for a PTZ camera
    • G08B13/19691Signalling events for better perception by user, e.g. indicating alarms by making display brighter, adding text, creating a sound
    • G08B13/19693Signalling events for better perception by user using multiple video sources viewed on a single or compound screen
    • G08B13/19697Arrangements wherein non-video detectors generate an alarm themselves
    • H04N21/234Processing of video elementary streams, e.g. splicing of video streams, manipulating MPEG-4 scene graphs
    • H04N21/2343Processing of video elementary streams involving reformatting operations of video signals for distribution or compliance with end-user requests or end-user device requirements
    • H04N21/234309Reformatting by transcoding between formats or standards, e.g. from MPEG-2 to MPEG-4 or from Quicktime to Realvideo
    • H04N21/234381Reformatting by altering the temporal resolution, e.g. decreasing the frame rate by frame skipping
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41Structure of client; Structure of client peripherals
    • H04N21/422Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
    • H04N21/4223Cameras
    • H04N21/60Network structure or processes for video distribution between server and client or between remote clients; Control signalling between clients, server and network components; Transmission of management data between server and client, e.g. sending from server to client commands for recording incoming content stream; Communication details between server and client
    • H04N21/61Network physical structure; Signal processing
    • H04N21/6106Network physical structure; Signal processing specially adapted to the downstream path of the transmission network
    • H04N21/6125Network physical structure; Signal processing specially adapted to the downstream path of the transmission network involving transmission via Internet
    • H04N7/00Television systems
    • H04N7/16Analogue secrecy systems; Analogue subscription systems
    • H04N7/173Analogue secrecy systems; Analogue subscription systems with two-way working, e.g. subscriber sending a programme selection signal
    • H04N7/17309Transmission or handling of upstream communications
    • H04N7/17318Direct or substantially direct transmission and handling of requests
    • H04N7/18Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N7/181Closed-circuit television [CCTV] systems for receiving images from a plurality of remote sources

Definitions

  • SCSS MPEG Adaptive Motion Digital Video
  • The present invention relates generally to the technical field of electronic capture, storage, transmission and retrieval of time-synchronized video, audio and related event-data. More particularly, it relates to capture, storage, transmission and retrieval of such time-synchronized data using motion-compensated digital video compression, and most particularly to security systems incorporating a multiplicity of quasi-autonomous, remote-controlled video/audio/event capture devices (intelligent pan/tilt/zoom digital video cameras) networked to a second group of multi-user GUI client workstations for data presentation and control, through one or more high-level servers providing system control and support, relational database storage, retrieval, archiving and control through an integrated software system.
  • quasi-autonomous, remote-controlled video/audio/event capture devices (intelligent pan/tilt/zoom digital video cameras)
  • The invention's enhanced GUI, the motion-adapted, time-synchronized digital data compression, and the system's software and relational database enable multiple users to retrieve, display, transmit, store, and process selected real-time and historical time segments of such digital video, audio and event-data, using standard data protocols transmitted through a simple CAT-5 cable network, by simple point-and-click selection of control metaphors on the GUI monitor's display.
  • Motion JPEG is clumsy for security systems because of unacceptable trade-offs resulting from the massive data files created.
  • The exceedingly long files generated from days, weeks or even months of video recording little or nothing of interest either require that huge volumes of storage media (typically tape) be handled, transported and archived in large warehouses, or that video frame rate and/or resolution be severely limited.
  • Motion JPEG additionally imposes crippling limitations on ease of access in that it can only be viewed serially, much like a VCR: one must repeatedly rewind and then play forward, at normal speed or fast forward, but still play sequentially to get from one point in time to another.
  • Prior art JPEG systems simply stream the video data into a massive file. For example, it would not be unusual to have a one-month-long video file with no marks to correlate it to the time and events which occurred during that one-month interval. Previously there has been no provision to time stamp events and the video data simultaneously. One must deal with a huge, unwieldy linear stream of (apparently) unrelated data.
  • An object of the present invention is to provide a video security system transmitting, storing and retrieving real-time, full-motion, DVD-quality video and audio throughout the entire system. Another object of the present invention is to provide point-and-click 3-axis camera control.
  • An object of the present invention is to provide high-capacity, low-cost, efficient digital video storage.
  • Another object of the present invention is to provide user-friendly control and monitoring.
  • Another object of the present invention is to provide an "intelligent" video camera with onboard microprocessor control and memory enabling local software control of camera motion, pattern recognition, (insert other features).
  • Yet another object of the present invention is to provide a server communicating with a multiplicity of intelligent video cameras, acting as the video/audio storage and database server for the security system.
  • Another object of the present invention is to provide a server for storing real-time compressed video and audio data from video cameras to hard disks, storing associated real-time alarm and alert events, synchronized to the compressed video and audio, in a relational database, in which the data includes MPEG-1, MPEG-2, MPEG-4 and H.264 compressed video and audio data.
  • Yet another object of the present invention is to provide security system software architecture for the server, acting as system controller.
  • Another object of the present invention is to provide a graphical user interface (GUI) with a rectangular grid of video images from independently selectable camera views on a primary graphics display.
  • graphical user interface (GUI)
  • Another object of the present invention is to provide multiple slave monitors, driven from one master SmartViewer station, each slave displaying full-size, real-time video windows.
  • Another object of the present invention is to provide access to the alarm/alert database and support historical review of archived video and audio from the server.
  • Another object of the present invention is to provide random access for historical review, frame-by-frame or jog-shuttle type control, and playback of video at play speeds from 1/30th real-time to 50x real-time in the forward or reverse direction.
  • Yet another object of the present invention is to provide communication between remote cameras and server by 100BaseT network over CAT-5 cabling.
  • Another object of the present invention is to provide autonomous, remote video capture units that seamlessly integrate with motion sensors, access controls, face recognition systems, and other event sensors.
  • Another object of the present invention is to provide a secure, encrypted architecture: cameras, network, and server.
  • Another object of the present invention is to provide a time-stamped event log synchronized to stored video/audio.
  • Another object of the present invention is to provide remote access via LAN and wireless, including 802.16 and 802.11xx, for example.
  • Another object of the present invention is to provide simple, low-cost installation on CAT-5 network cable.
  • Another object of the present invention is to provide "spoof-proof" and "hacker-proof" capability, provided by encryption of encoded data in the remote video cameras, encryption of encoded data in a Server Unit serving the cameras, and encryption of encoded data transmitted over a network connecting the video cameras, the Server Unit and display stations including remote Client Units.
  • Another object of the present invention is to provide optimized disk I/O utilization for time-sensitive I/O requests by dynamic buffer priority assignment.
  • Another object of the present invention is to provide time-critical I/O requests automatically re-prioritized by default or application-specific priority routines.
  • Another object of the present invention is to provide near-real-time digital video playback at any speed.
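The dynamic buffer priority assignment described in the objects above could be sketched as a re-prioritizable request queue; the priority levels, request names, and predicate interface below are illustrative assumptions, not details taken from the patent:

```python
import heapq
import itertools

# Illustrative priority levels: smaller number = served sooner.
PRIORITY_ARCHIVE_WRITE = 10   # routine disk write of camera data
PRIORITY_HISTORY_READ = 5     # client reviewing archived video
PRIORITY_TIME_CRITICAL = 0    # near-real-time playback request

class IoScheduler:
    def __init__(self):
        self._queue = []
        self._seq = itertools.count()  # tie-breaker keeps FIFO order per level

    def submit(self, request, priority):
        heapq.heappush(self._queue, (priority, next(self._seq), request))

    def escalate_time_critical(self, predicate):
        """Re-prioritization routine: promote matching pending requests."""
        self._queue = [
            (PRIORITY_TIME_CRITICAL if predicate(r) else p, s, r)
            for p, s, r in self._queue
        ]
        heapq.heapify(self._queue)

    def next_request(self):
        return heapq.heappop(self._queue)[2]

sched = IoScheduler()
sched.submit("write:scam01:block7", PRIORITY_ARCHIVE_WRITE)
sched.submit("read:scam02:16h", PRIORITY_HISTORY_READ)
sched.submit("read:scam01:live-pause", PRIORITY_HISTORY_READ)
# A viewer pauses live video: its pending read becomes time-critical.
sched.escalate_time_critical(lambda r: r == "read:scam01:live-pause")
```

With this ordering, the escalated near-real-time read is dispatched before the routine archive write, which matches the stated goal of optimizing disk I/O for time-sensitive requests.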
  • The Futuretel SmartCam MPEG2-based digital video surveillance system is a client/server based live video streaming system with multi-channel DVR (Digital Video Recording) and playback. In order to support video surveillance applications, it supports the following playback features:
  • The first area is the video, audio and sensor/signal data acquisition domain.
  • The second area is the overall system administration/control, data archival storage and distribution of the acquired video, audio and sensor/signal data.
  • The final domain of the security system invention is the operator interface, display and presentation function.
  • The functionality is divided primarily among the three major sub-systems: the video camera subsystem, the central server, and the control and display stations.
  • Digital data acquisition (digital video/audio capture, alarm event capture, and system event capture and logging) comprises related functions that can be divided generally into three major functional areas.
  • The hardware of the system generally divides its functional behavior among three major groups of spaced-apart but interconnected locations: the first group being composed of distributed sub-nets of one or more SmartCam security data acquisition units; the second group being a SmartServer (generally, although not limited to, a single centralized server) connected through an interconnecting gateway to the SmartCam sub-nets and also to the third functional group, a second distributed sub-net of SmartViewer display/control workstations.
  • The first subnet group or groups are "intelligent" video/audio/data capture units: e.g., the SmartCam camera unit of the present invention.
  • A SmartCam unit typically includes a high-quality pan/tilt/zoom digital video camera with integrated power and drive electronics, integrated MPEG2 encoders, an integrated microcomputer (a microprocessor with RAM, ROM, DSP, and non-volatile flash memory), an optional hard disk for local storage, and an interface that connects it to its sub-net and to local sub-units such as additional cameras and sensors.
  • A SmartCam provides for expansion boards to give the enhanced features in accordance with the present invention.
  • The SmartCam interface also connects it to the SmartServer through the system network.
  • The SmartServer acts primarily as a central storage file server and database for the bulk of the video/audio and event sensor data generated by the SmartCam units.
  • The SmartServer central storage and database includes a software application for monitoring real-time camera feeds and reviewing archived video by one or more client workstations in the second distributed sub-net of SmartViewer display/control workstations.
  • One version of the security system in accordance with the present invention uses a single server with multiple 100BaseT NICs for the SmartCam subnets.
  • Because SmartCam cameras use UDP broadcast across the subnet, each camera sees the network traffic of the other cameras. To minimize data loss from the cameras, they are connected to their own sub-net, with managed traffic flow, and limited to no more than 35 Mbps per subnet.
  • The server(s) route UDP traffic from the camera subnet to the client subnet. Traffic from the client subnet to the camera subnet is not allowed. Data on the client subnet includes re-broadcast UDP camera data and historical data retrieved from the servers, as requested by the clients.
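Given the 35 Mbps per-subnet ceiling stated above, the number of cameras a subnet can carry follows from the average per-camera stream rate (the ~2.2 Mbps average figure appears in the buffering discussion later in this document). A hypothetical budget check, with illustrative names:

```python
# Budget check for a SmartCam subnet: the patent caps managed UDP traffic
# at 35 Mbps per subnet; at an average stream rate of about 2.2 Mbps per
# camera, that bounds the camera count per subnet.

SUBNET_CAP_MBPS = 35.0    # stated per-subnet traffic limit
AVG_STREAM_MBPS = 2.2     # average per-camera MPEG2 stream rate (assumed)

def max_cameras(cap_mbps=SUBNET_CAP_MBPS, per_camera_mbps=AVG_STREAM_MBPS):
    # Largest whole number of cameras fitting under the cap.
    return int(cap_mbps // per_camera_mbps)

def subnet_ok(n_cameras, per_camera_mbps=AVG_STREAM_MBPS):
    # True when the aggregate average rate stays within the cap.
    return n_cameras * per_camera_mbps <= SUBNET_CAP_MBPS
```

Under these assumptions a single camera subnet accommodates about 15 cameras; a 16th would push the aggregate average rate past the 35 Mbps ceiling.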
  • the server keeps an index of file seek positions and their time codes for MPEG video key (I) frames.
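The seek-position index described above can be sketched as sorted (time code, byte offset) pairs, one per MPEG key (I) frame, enabling random access into a stored segment. The structure and numbers below are illustrative, not the patent's actual on-disk format:

```python
import bisect

class KeyFrameIndex:
    def __init__(self):
        self._times = []    # seconds since segment start, ascending
        self._offsets = []  # file seek position of each I frame

    def add(self, time_code, offset):
        self._times.append(time_code)
        self._offsets.append(offset)

    def seek_position(self, time_code):
        # Offset of the last I frame at or before time_code: decoding must
        # start at an I frame, then roll forward to the exact frame.
        i = bisect.bisect_right(self._times, time_code) - 1
        return self._offsets[max(i, 0)]

# Example: I frames every half second (byte offsets are illustrative).
idx = KeyFrameIndex()
for t, off in [(0.0, 0), (0.5, 180_000), (1.0, 362_000), (1.5, 540_000)]:
    idx.add(t, off)
```

An index like this is what makes the jog-shuttle and variable-speed playback features practical: any requested time maps in O(log n) to a decodable starting point in the file.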
  • The server stores the MPEG data files in "manageable" size segments, typically 1-hour segments of approximately 1 GB each. Time information identifying the segment is encoded in the file name, allowing for easy human retrieval of data files; i.e., a file name of scam01-2002022216 is human-readable as the video transport stream captured by smartcam01 on Feb. 22, 2002 at 16 hours UTC.
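The segment-naming convention above (camera id, then the UTC year/month/day/hour) can be generated and parsed directly; the helper names here are illustrative:

```python
from datetime import datetime, timezone

def segment_name(camera_id, start_utc):
    # e.g. ("scam01", 2002-02-22 16:00 UTC) -> "scam01-2002022216"
    return "%s-%s" % (camera_id, start_utc.strftime("%Y%m%d%H"))

def parse_segment_name(name):
    # Split on the last "-" so camera ids containing "-" would still parse.
    camera_id, stamp = name.rsplit("-", 1)
    start = datetime.strptime(stamp, "%Y%m%d%H").replace(tzinfo=timezone.utc)
    return camera_id, start

cam, start = parse_segment_name("scam01-2002022216")
# cam is "scam01"; start is 2002-02-22 16:00 UTC
```

Encoding the UTC hour in the name keeps the files both machine-sortable and, as the patent notes, human-readable without consulting the database.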
  • The system time base is UTC internally and is converted to local time in the client GUI.
  • The present SCSS invention is a LAN-based video, audio and data capture, distribution and archival system based on motion-adaptive MPEG2 transport streams (compressed data).
  • the system has three major functions that can be divided generally into three related areas.
  • the first area is the video, audio and sensor/signal data acquisition domain.
  • the second area is the control, storage and distribution ofthe acquired video, audio and sensor/signal data.
  • the final domain ofthe security system invention is the display and presentation function.
  • the hardware ofthe system generally divides its functional behavior among three major groups of spaced apart, but interconnected locations; the first group being composed of distributed sub-nets of one or more SmartCam security data acquisition units, the second group being a SmartServer (generally, although not limited to, a single centralized server) connected through an interconnecting gateway to the SmartCam sub-nets and also to the third functional group, a second distributed sub-net of SmartViewer display/control workstations.
  • SmartServer generally, although not limited to, a single centralized server
  • the SmartServer acts primarily as a central storage file server and database for the bulk of the video/audio and event sensor data generated by the SmartCam units.
  • the SmartServer central storage and database includes a software application for monitoring real time camera feeds and reviewing archived video by or more client workstations in the second distributed subnet of SmartViewer display/control workstations.
  • One version of security system in accordance with the present invention uses a single server with multiple 100BaseT NIC's for the SmartCam subnets)
  • SmartCam cameras use UDP broadcast across the subnet, each camera sees the network traffic ofthe other cameras. To minimize data loss from the cameras, they are connected on their own sub net, with managed traffic flow and limited to no more than 35Mbps for the subnet.
  • the server(s) routes UDP traffic from the camera subnet to the Client subnet. Traffic from the client subnet to the cameras subnet is not allowed.
  • Data on the client subnet includes re-broadcast UDP camera data and historical data retrieved from the servers at the request of the clients.
  • the System Time (time base) is based on UTC internally and converted to local time in the client GUI.
  • Camera, Server, Client architecture: The SmartCam camera unit, SmartServer server and SmartViewer client architecture and a novel dynamic buffering structure provide heretofore-unavailable security surveillance performance.
  • Camera video and audio are digitized and compressed in an MPEG encoder, buffered in an encoder streamer buffer and sent to the network, which has its own network driver buffer.
  • a network read has some internal buffering, followed by a write buffer to optimize disk I/O.
  • Disk write and read buffers of 2 MByte per channel are found to work best on IDE RAID0 and RAID5 configurations. 2 MByte of buffering at an average data rate of 2.2 Mbps gives about 7 seconds of buffering.
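The buffering figure quoted above is a simple bits-over-rate calculation; a quick sketch confirms it:

```python
# Back-of-envelope check of the quoted buffering figure:
# a 2 MByte per-channel disk buffer at a 2.2 Mbps average stream rate.
BUFFER_BYTES = 2 * 1024 * 1024      # 2 MByte write/read buffer per channel
AVG_RATE_BPS = 2.2e6                # 2.2 Mbps average camera data rate

def buffer_seconds(buffer_bytes: int, rate_bps: float) -> float:
    """Seconds of video a byte buffer holds at a given bit rate."""
    return buffer_bytes * 8 / rate_bps

print(round(buffer_seconds(BUFFER_BYTES, AVG_RATE_BPS), 1))  # → 7.6
```

At roughly 7.6 seconds per channel, this matches the "about 7 seconds" stated in the text.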
  • In the SmartViewer Client there is a network read buffer, a decoder input buffer, an MPEG decoder, and decoded video and audio frame output buffers.
  • One of the primary and unique capabilities of the SmartCam system is the ability to pause live video and quickly play back video from system storage to within 10 seconds of real time. Since the data path latency from camera, through the buffering, hard disk and network, is about 30 seconds, there is provided a novel dynamic disk I/O pipeline bypass buffer.
  • This buffer allows the SmartViewer Client to request data from the Server that has not yet been written to disk, or is not yet available for read back from the hard disk. It is a memory buffer that saves an extended portion of the contents of the disk write buffer, for a period after submitting a disk write block to the server scheduler long enough for the normal disk write path to complete its write, usually about 30 seconds.
  • SmartViewer operations
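The bypass buffer described above can be sketched as a time-limited in-memory cache consulted before the normal disk read path. The class and method names below are illustrative assumptions, not the patent's own implementation:

```python
import time
from collections import OrderedDict

class BypassBuffer:
    """Sketch of the dynamic disk I/O pipeline bypass buffer: submitted
    write blocks stay in memory long enough (~30 s) for the normal disk
    write path to complete, so a client can read data not yet on disk."""

    def __init__(self, retain_seconds=30.0):
        self.retain_seconds = retain_seconds
        self._blocks = OrderedDict()   # block_id -> (submit_time, data)

    def submit_write(self, block_id, data, now=None):
        # Retain a copy of the block when it is handed to the scheduler.
        now = time.monotonic() if now is None else now
        self._blocks[block_id] = (now, data)
        self._expire(now)

    def read(self, block_id, disk_read, now=None):
        """Serve from memory if the block is still retained, otherwise
        fall back to the normal disk read path (disk_read callback)."""
        now = time.monotonic() if now is None else now
        self._expire(now)
        entry = self._blocks.get(block_id)
        if entry is not None:
            return entry[1]
        return disk_read(block_id)

    def _expire(self, now):
        # Drop blocks older than the retention window (oldest first).
        while self._blocks:
            block_id, (t, _) = next(iter(self._blocks.items()))
            if now - t <= self.retain_seconds:
                break
            del self._blocks[block_id]

buf = BypassBuffer(retain_seconds=30.0)
buf.submit_write("blk1", b"gop-data", now=0.0)
print(buf.read("blk1", lambda _id: b"from-disk", now=10.0))  # b'gop-data'
```

Once the retention window passes, the same read transparently falls through to disk, which is what lets playback come within ~10 seconds of real time despite the ~30 second write-path latency.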
  • the SmartViewer control GUI communicates with the individual decoder video windows over a network friendly protocol such as CORBA, enabling video decode windows under the control of the GUI to run either locally or remotely.
  • Each camera broadcasts MPEG2 transport streams at a maximum data rate of 3Mbps (375KB/sec, 1.35GB/hour)
  • Each camera broadcasts over a 100BaseT network.
  • 100BaseT has about 30Mb/s of useful throughput; therefore a maximum of 16 cameras can be connected on a 100BaseT subnet.
  • MPEG2 is a high latency process.
  • the encoder and the decoder typically buffers about
  • Viewer workstations run the client application(s).
  • the client application will run on 1280x1024 or higher resolution screens.
  • Each MPEG2 video image is square pixel CCIR601, i.e. 640x480.
  • this delay is less than 1 second.
  • the client side (Smart Viewer) application supports up to 9 or 16 simultaneous near real time decodes into appropriately scaled down video windows.
  • Frame update rates on all video windows are typically from at least 5fps to full rate.
  • the images are scaled by ½ in x and y, i.e. 320x240. 9 of these half-size images will occupy 960x720 pixels plus GUI borders and decorations. 16 video images are scaled to a quarter of the screen in x and y, i.e. 320x256. Any camera can be assigned to any one of the 9 or 16 video windows.
  • Any of the 9 or 16 video windows can be zoomed up by 2x back to full CCIR601 size (to occupy the space of 4 half-size windows).
  • Audio playback can be from a designated active camera or from a mix of several cameras.
  • the client GUI makes it easy to select forward or backward play of the stored video from any camera.
  • Video can be stepped backwards or forward by single frames, or played backward or forward at any speed ranging from zero, to real time, to 50x real time.
  • Playback in either direction at any speed is essentially smooth from the user's perspective. Any or all of the 9 or 16 video windows run in either real time mode or historical review mode. All video windows in historical review mode can share a common time base.
  • the system supports a networked main GUI client application utilizing several viewer station computers and displays.
  • One viewer station computer is able to run the main GUI while several slave computers provide video window display capability at higher resolution and frame rate than that available on the GUI computer. This feature facilitates broadcast production or control room video wall type applications in larger installations.
  • the server(s) route UDP traffic from the camera subnet to the client subnet, but not back.
  • Data on the client subnet includes re-broadcast UDP camera data and historical data from the servers, requested by the clients.
  • data from the servers is many times real time, hence the need for the 1 Gbps client network.
  • the server keeps an index of file seek positions and their time codes for MPEG video key (I) frames.
  • the server stores the MPEG data files in "manageable" size chunks, typically 1 hour chunks of approximately 1GB each.
  • the time information is encoded in the file name, allowing for easy human retrieval of data files, e.g.: scam01-2002022216. Decoded, that's smartcam01 on Feb. 22, 2002 at 16 hours UTC. The time base is UTC internally and converted to local time in the client GUI.
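The file-naming scheme above ("scam01-2002022216" meaning smartcam01, Feb. 22, 2002, 16:00 UTC) implies a camera-id plus YYYYMMDDHH pattern. A small decoder sketch, with the pattern inferred from that single example:

```python
import re
from datetime import datetime, timezone

def parse_segment_name(name: str):
    """Decode a SmartCam data-file name of the assumed form
    'scamNN-YYYYMMDDHH' into (camera unit, UTC start time)."""
    m = re.fullmatch(r"scam(\d{2})-(\d{4})(\d{2})(\d{2})(\d{2})", name)
    if not m:
        raise ValueError(f"unrecognized file name: {name}")
    cam, y, mo, d, h = m.groups()
    start = datetime(int(y), int(mo), int(d), int(h), tzinfo=timezone.utc)
    return f"smartcam{cam}", start

unit, start = parse_segment_name("scam01-2002022216")
print(unit, start.isoformat())  # smartcam01 2002-02-22T16:00:00+00:00
```

Because the name alone carries camera identity and UTC hour, segments remain retrievable by a human even without the index database.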
  • the SmartCam client server software and hardware architecture provide previously unattainable “trick mode” and “off speed playback” capability in a client server video surveillance system.
  • Video and audio data is provided to the client in PES (packetized elementary stream) format, in integral units of GOPs (groups of pictures) for video and frames for audio.
  • Video and audio are separated and sent separately for two reasons: 1) it is easier to send integral units of GOPs and audio frames when they are separated than if they were multiplexed together; 2) audio can be sent on a higher priority channel than video, which is useful since data loss in audio is far more noticeable and unpleasant than data loss in video.
  • Requests for data from the server are in the form of XML structured messages. The message format specifies the start and end time of the requested data segment as well as the stream id and an additional mode parameter used for play speeds faster than 1.5x real time.
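The patent names the fields of the request (start time, end time, stream id, optional mode) but not the XML element names, so the names in this sketch are illustrative assumptions:

```python
import xml.etree.ElementTree as ET

def build_data_request(stream_id, start_utc, end_utc, mode=None):
    """Build an XML data-request message carrying the fields described
    in the text. Element/attribute names here are hypothetical."""
    req = ET.Element("DataRequest", streamId=stream_id)
    ET.SubElement(req, "Start").text = start_utc
    ET.SubElement(req, "End").text = end_utc
    if mode is not None:
        # Only used for play speeds faster than 1.5x real time.
        ET.SubElement(req, "Mode").text = mode
    return ET.tostring(req, encoding="unicode")

msg = build_data_request("scam01", "2002-02-22T16:00:00Z",
                         "2002-02-22T16:05:00Z", mode="fast-forward-4x")
print(msg)
```

Using a structured text message here keeps the client/server interface transport-neutral and easy to extend with further parameters.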
  • Playback speed optimization can also be performed by filtering out pictures that will not be displayed, on the server side.
  • In IBP-frame MPEG, B pictures are discarded first: they rely on I and P pictures for decoding, but no other pictures depend on them, so dropping them breaks nothing.
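Server-side picture filtering for fast playback can be sketched as below. The speed thresholds and the rule of keeping only I pictures at very high speed-ups are illustrative assumptions; the text only specifies that B pictures go first:

```python
def filter_pictures_for_speed(pictures, speedup):
    """Drop pictures that will not be displayed at high play speeds.
    B pictures go first (nothing depends on them); at higher speed-ups
    only I (key) pictures are kept. Thresholds are illustrative."""
    if speedup <= 1.5:
        return list(pictures)                        # send everything
    if speedup <= 4:
        return [p for p in pictures if p != "B"]     # I and P only
    return [p for p in pictures if p == "I"]         # key frames only

gop = ["I", "B", "B", "P", "B", "B", "P", "B", "B"]
print(filter_pictures_for_speed(gop, 3))   # ['I', 'P', 'P']
print(filter_pictures_for_speed(gop, 20))  # ['I']
```

Filtering on the server also reduces network load, since undisplayable pictures are never transmitted.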
  • Trick mode playback, which includes slow and fast smooth-motion playback in forward or reverse direction at any of the specified speeds, and single-frame jog-shuttle, is provided by a double-buffered decode system as shown, where the UTC time code is maintained for each decoded frame (video and audio).
  • Double buffering allows the MPEG decoder to decode an MPEG video GOP in the normal forward direction (the only way it can be done) while the display takes video frames from a previously decoded buffer in reverse order for reverse play. Maintaining decoded frames in memory also makes single stepping in either direction relatively simple.
  • the display manager manages display of decoded frames of video and audio based on direction of play, ratio of real time to playback speed, current time and time code of the decoded frame.
  • the client playback controller manages the whole process and keeps the MPEG decoder input buffer refreshed based on the direction and speed of play. It may also pause playback and signal a pause to the user through the GUI if data is not yet available from the server.
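The double-buffered reverse-play scheme described above can be sketched with a generator: the decoder always decodes each GOP forward, while the display drains the previously decoded buffer in reverse. The identity `decode_gop` used in the demo stands in for a real MPEG decoder:

```python
def reverse_play(gops, decode_gop):
    """Sketch of double-buffered reverse playback: decode each GOP in
    the normal forward direction, then emit the frames of the
    previously decoded buffer in reverse display order."""
    display_buffer = None
    for gop in reversed(gops):            # walk GOPs backwards in time
        decoded = decode_gop(gop)         # forward decode (the only way)
        if display_buffer is not None:
            yield from reversed(display_buffer)
        display_buffer = decoded
    if display_buffer is not None:
        yield from reversed(display_buffer)

# Two GOPs of three frames each, identified by (gop, frame) time codes;
# decode_gop is the identity here, standing in for the MPEG decoder.
gops = [[(0, 0), (0, 1), (0, 2)], [(1, 0), (1, 1), (1, 2)]]
frames = list(reverse_play(gops, decode_gop=lambda g: g))
print(frames)  # [(1, 2), (1, 1), (1, 0), (0, 2), (0, 1), (0, 0)]
```

Because one buffer is being decoded while the other is being displayed, reverse play stays smooth even though MPEG itself can only be decoded forward.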
  • System events synchronized to audio and video.
  • the present invention provides a convenient way to retrieve information provided by system events (i.e., alarms, sensor signals and data, system conditions, personnel access, on-site personnel complement list) synchronized to related video sequences, because they are inherently time-synchronized when recorded and stored with the time-stamped video.
  • Multi-media playback: Video and audio clips can be saved directly to DVD-R and played on a conventional DVD player.
  • the SmartServer Configuration manager provides Simple web tools (available in English,
  • An Event Database is enabled by a high-performance SQL database that records events such as alarms and alerts, system status, officer watch log, etc. The relational database allows full SQL query searches and includes multi-language support and secure access through SSL. Alternatively, the system provides a redundant parallel database option.
  • the SCSS SmartCam video storage and retrieval security system provides Digital Video/Audio/Event data capture, storage, transmission and display of Real Time and Historical audio, events and full-motion, DVD quality video data, throughout the entire system from capture to display.
  • SCSS Highly scalable distributed architecture, Industry standard protocols.
  • the SCSS is a scalable, distributed architecture that uses common Internet protocol standards for transcoding and communication, with powerful, time-synchronized data retrieval and review capability.
  • Video, audio and alarm events are readily captured, stored, accessed and reviewed.
  • the SCSS integrates with motion sensors, access controls, face recognition systems, data acquisition and other event sensors.
  • the digital video frame data record automatically provides encoded video data, encoded audio data, and event data with a synchronous time-stamp.
  • the capture, transmission, storage and retrieval of data inherently provides time synchronization between real-time and recorded video/audio and event data at every point in the system.
  • Transport stream data files (typically about one hour long) are identifiable by a human-readable File-Name-Identifier, e.g., a file name easily communicating time-origin.
  • Secure, encrypted data architecture.
  • SmartCam Cameras, network, Viewers and Server are all protected by built-in encryption.
  • System Events Are Time Stamped. All system events are stored in the System Event log and are time-stamped because they are synchronized to the digital video-transport-stream data from the camera(s). Events and video/audio data are digitally stored and logged for easy retrieval. Typical types of events logged and synchronized to SmartCam video: system, camera, and non-camera events.
  • the physical cabling for the SmartCam network communications in accordance with the present invention is simpler than in prior security systems. These previous systems typically require multiple sets of separate cables, e.g., video coax plus twin lead for access control devices. In addition, previous systems typically require multiple sets of USB cables, and sets of twin lead for audio, phone, power, etc.
  • the SmartCam server connects to a number of remote video cameras through one portion of the system network. These typically include a plurality of remote SmartCam intelligent video camera units.
  • SmartCamTM SCSS provides high 'spoof-proof' and 'hacker-proof' barriers against hostile attack.
  • the SCSS continuously monitors data streams from SmartCam cameras. If one is logged off (removed by external agent), that log-off is also recognized as a stored time-stamped event. The logged-off camera has to go through complete identity verification before it is logged on again.
  • An external agent could be a natural cause, e.g., lightning strike, power failure, etc.
  • SmartCam video-transport data streams are continuously monitored; interruptions are recognized and logged as another event which the system can use to set an alarm, notify personnel and "de-certify" camera data following the interruption, until the SmartCam is verified by a trusted independent Certification.
  • Levels of Certification can be selected: from Certification auto-generated by the SmartCam system, to Certification by another trusted source, up to requiring a hard boot-up Admin logon by the highest-level authorized user to recover.
  • the Certification origin can be allowed from anywhere through admin console or restricted to a single secure console. All these events are logged, and are recorded for review by admin before logging cameras [sensors] back into system again.
  • Video and audio is received in encrypted, compressed MPEG2 format on the SmartCam-side network interface of the SmartServer.
  • Compressed video and audio is stored to disk in real time, time stamped and logged in the Content Manager database.
  • Encrypted compressed video and audio is received by the SmartServer, where it is decrypted for storage on disk.
  • a decryption process verifies that the source of the data is the actual installed camera and that the camera time clock is identical to system time. This guards against any possibility of another MPEG2 video source mimicking a SmartCam.
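The patent does not specify how source verification is performed, only that decryption proves the data came from the installed camera and that the clocks agree. One illustrative way to express both checks is a shared-secret HMAC plus a clock-skew test; the algorithm and names below are assumptions in that spirit, not the patent's method:

```python
import hmac
import hashlib

def verify_camera_packet(payload, tag, camera_key, camera_time, system_time,
                         max_skew=1.0):
    """Illustrative check: authenticate the payload with a per-camera
    shared secret (HMAC-SHA256, chosen here purely as an example) and
    confirm the camera clock matches system time within max_skew sec."""
    expected = hmac.new(camera_key, payload, hashlib.sha256).digest()
    authentic = hmac.compare_digest(expected, tag)
    clock_ok = abs(camera_time - system_time) <= max_skew
    return authentic and clock_ok

key = b"per-camera-secret"
data = b"mpeg2-transport-packet"
tag = hmac.new(key, data, hashlib.sha256).digest()
print(verify_camera_packet(data, tag, key, 1000.0, 1000.2))        # True
print(verify_camera_packet(data, tag, b"wrong-key", 1000.0, 1000.2))  # False
```

A stream that fails either test can then be treated as a mimicking source and raise the alarm described in the next item.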
  • If SmartServer detects any irregularity with its registered SmartCams, including disconnection from the network, interruption in service, encrypted data errors, etc., it will generate an alarm in the system (see Event handling system below). As SmartServer captures camera feeds to disk, eventually the disks will become full.
  • the Content Manager is part of the system manager and manages policy on disk usage.
  • the Content Manager allows the system administrator to configure the File Store disk fullness thresholds, and how to handle purging (removing) files from the system to free up disk space.
  • File purging can be configured on a camera-by-camera basis and/or an event-history basis (least active cameras or zones get purged sooner).
  • Purged files can be saved to backup tape, another file server or deleted.
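The purge policy described above (fullness thresholds, least-active cameras purged sooner) can be sketched as a selection function. The water-mark values and the activity metric are illustrative assumptions, since the text leaves them to the administrator:

```python
def select_files_to_purge(files, capacity_bytes, high_water=0.9, low_water=0.8):
    """Sketch of the Content Manager purge policy: once the file store
    passes the high-water fullness threshold, purge segments from the
    least-active cameras first, oldest first, until usage falls below
    the low-water mark. Thresholds and metric are illustrative."""
    used = sum(f["size"] for f in files)
    if used <= high_water * capacity_bytes:
        return []                       # below threshold, nothing to do
    # Least-active cameras purged sooner; within a camera, oldest first.
    candidates = sorted(files, key=lambda f: (f["activity"], f["time"]))
    purged = []
    for f in candidates:
        if used <= low_water * capacity_bytes:
            break
        purged.append(f)
        used -= f["size"]
    return purged

files = [
    {"name": "scam01-2002022210", "size": 40, "time": 10, "activity": 5},
    {"name": "scam02-2002022210", "size": 40, "time": 10, "activity": 1},
    {"name": "scam02-2002022211", "size": 40, "time": 11, "activity": 1},
]
print([f["name"] for f in select_files_to_purge(files, capacity_bytes=125)])
```

Files selected this way would then be written to backup tape or another file server, or deleted, per the configured disposition.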
  • the file manager keeps track of all SmartCam data files on the SmartServer and files purged to backup or deleted.
  • SmartServer stores MPEG2 video and audio files in one-hour segments. Each segment is uniquely identified by its Site Id, Camera Unit Id, and Time Code. In the unlikely event of corruption of the Content Manager's database, the data files are still usable based on this information and the Content Manager database can be reconstructed.
  • SmartViewer can view SmartCam data files stored on disk by the SmartServer running either on the SCSS Camera Unit cameras or the SCSS Server Unit.
  • the simple SCSS network interface supports rapid, available historical playback of archived video and audio content because the data is stored and transmitted in the compact MPEG-2 encoded transport stream format.
  • SCSS video data retrieval and data management capability
  • the SCSS greatly improves security personnel capability.
  • Security systems are normally all closed systems. In a bank, for example, one doesn't necessarily want to have someone outside looking in. With the SCSS capability one can see what's going on inside a closed area, such as a bank, without the knowledge of any "bad guy" participants inside. If there is a need or want to show such activity to someone outside, such as a SWAT team, the SmartCam system can be configured to connect to an outside rescue team's equipment, for example with an 802.11 radio.
  • the SWAT team may then drive up near the location; the System administrator can wirelessly connect them via standard equipment such as an 802.11 radio and have data transcoded to them in MPEG-4 or H.261. Enabled by the Administrator, they can look at the SmartViewer display on a notebook computer, and even control the SmartCam Camera Unit position through a remote SmartViewer interface. If allowed access, a rescue team can see everything, e.g., where all the perpetrators are, so they can plan to set up entry, defense, or whatever.
  • a security system typically has a main camera mounted on the ceiling in a dome. Perpetrators have been known to spray paint the dome thinking that they're blinding the camera. With auxiliary SmartCam units mounted as pinhole cameras hooked up in the system, the Camera Unit cameras can have a vantage point of shooting through an essentially invisible pinhole in a wall. Perpetrators can be observed without being able to see a visible camera.
  • SCSS could run five or more pinhole cameras in a room, and not even have the dome. Now one can take multiple views, switching between alternate Camera Unit cameras.
  • the security Supervisor on Duty may be configured to have extra privileges in the system above their own authorization level. All log-in and log-out actions are logged in the event database.
  • Backup security Supervisor on Duty is similar to the above. Only certain users, marked in the user database as "can be supervisor" or "can be backup supervisor", can log in as backup Supervisor on Duty.
  • List of security staff on duty: A list of all other non-supervisor-level users logged into the system. This may include level
  • Purged data files on backup tape or backup servers can be loaded back onto hard disk and made available to one or more SmartViewer stations on demand.
  • the file manager also manages events.
  • SCSS can select specific events or groups of events or ranges easily. SCSS can go to the event and jump before it, play through it, play it backwards in real time, or show a still picture. SCSS can also do post-processing of events, for example with MPEG-7.
  • the physical architecture of one preferred embodiment of the present security system invention is organized into three major functional groups: the remote video surveillance units comprised of a SmartCam camera, camera group or camera groups; the central Server running the SmartServer system application; and one or more client workstations each running a SmartViewer application module.
  • the three groups typically are physically separated within a building or complex of buildings and communicating over a (primarily, although not exclusively) wired network.
  • the SmartServer is a centralized security system server with multiple I/O ports, high-speed mass file storage, and a database server; second, one or more sets of individual remote SmartCam video camera surveillance units, with the camera units of each set individually coupled to a local Camera Unit Network through a common communication node that connects to one of the Server I/O ports.
  • the SmartCam application provides each camera unit with the capability of performing a number of predetermined security functions.
  • Each particular camera unit's security functions are
  • the security functions built into a particular camera include motion control for camera positioning; local video, audio and event data capture; video data and post-recording data processing and storage; motion adaptive video compression; video field motion
  • the third major functional component of the present invention is a group of one or more multi-screen video operator-monitor stations supporting the SmartViewer application module and connected to another of the Server I/O ports.
  • the centralized Server (SmartServer) for the present SmartCam security system invention includes a system controller (operating system software and hardware), mass storage for recorded video, audio and event data (AVE data) from the surveillance camera (SmartCam) units, and an AVE relational database server for the video monitor (SmartViewer) workstations.
  • SmartServer hardware/software is a high performance computer with high-speed networking and large disk storage.
  • the SmartServer is configured based on the system specification for the number of SmartCam units, the number of SmartViewer stations, and required storage capacity.
  • SmartServer_1 is an embodiment of the video/audio storage and database server for the present invention. It saves MPEG2 compressed video and audio from the cameras to hard disk, saves all alarm and alert events in a real time database, and acts as a system controller. It typically is a Pentium4-based PC, running RedHat 7.2 Linux, and loaded with hard disk and network interfaces.
  • the SmartServer provides Video and audio storage and retrieval, and includes SmartStore storage manager.
  • SmartStore allows for full rate (30fps constant), or space saver (2-20fps adaptive) storage of video and audio to hard disk.
  • Alternative versions of SmartCamTM support adaptive frame rate and variable bit rate encoding for improved storage capacity. Because video and audio are stored digitally on hard disk, there is no loss of quality from copying or backing up data, repeat access or long-term storage.
  • the SmartStore storage manager provides full rate (30 fps) redundant storage of video and audio to hard disk at motion adaptive variable bit rate; controllable near-, medium- and long-term storage configuration options provide super high-quality, full rate video near term, dropping to lower frame rate and quality to save storage space for long-term archival.
  • High-quality MPEG2 (same as DVD and digital satellite) is used for recording video data, with motion video compression at 2.7 Mbps to 14 Mbps for full D1 (720x480) video.
  • the range of compression options includes Half D1 (352x480) and CIF (352x240), and encoding options at lower, space-saving bit rates.
  • SmartServer is the core of the present SmartCam security system. It provides medium- and long-term storage of archived video, audio and events, such as alarms, alerts and security staff watch logs. It provides system-level setup, configuration, management and diagnostic functions to the system administrator/security supervisor.
  • Hardware for one preferred embodiment of the SmartServer includes: SmartServer 1-S: Pentium4 2GHz, 512MB SDRAM, 120GB HDD, 1 100BaseT NIC, 1 1000BaseT NIC. 1U, 19" rack mount. Supports up to 16 SmartCam1 inputs and 1 SmartViewer.
  • SmartServer 1-M: Pentium4 2GHz or above, 512MB SDRAM or above, 120GB HDD + 480GB RAID0 HDD, optional 66GB tape backup unit, up to 4 100BaseT NICs and 1 1000BaseT NIC. 3U, 19" rack mount, dual redundant power supplies.
  • Hardware for another preferred embodiment of the SmartServer includes: dual AMD AthlonTM and Intel PentiumTM 2000 CPUs; double data rate PC2100 SDRAM, 512 to 1024 MB; multiple gigabit network interfaces; optional DVD-R/DVD writer for creating DVD videodiscs; Linux "UNIX" operating system.
  • a preferred embodiment of the present invention uses a compact CCTV video camera, MTV-54G2H, made by Mintron and available from Sentry Security Systems, Beverly, Ontario, Canada.
  • Video, audio and event selection, control and monitoring is provided by a user-friendly point-and-click SmartCam Hand Controller, by SmartViewer workstation keyboard function keys, or by point-and-click SmartViewer video screen control metaphor icons (FDI Operators Manual incorporated herein by reference).
  • the typical SmartCam camera units include a built-in remote control mechanical drive that provides wide-range pan, tilt and zoom capability through dual stepper motors and controllers, an integrated power control chip and power drivers.
  • the SmartCam Absolute Pointing System provides high accuracy location of objects in the SmartCam view field.
  • Each SmartCam Camera unit receives, stores and transmits video and sensor data it receives.
  • Video images detected in the SmartCam visual field and data received from its other sensors are compressed, recorded, stored locally (optionally) and transmitted on the network to the SmartServer (and optionally to the SmartViewer).
  • Motion Adaptive Video Compression
  • SmartCam motion adaptive digital video compression provides security systems with high data transmission rates, fast data storage and retrieval speeds, high capacity, low cost data storage, and enables efficient utilization of operation time and floor space (small foot-print).
  • Standard Signal Filtering Functions
  • MPEG filter functions in the SmartCam are provided by an industry standard chip and used by the SmartCam SmartCoder electronics to extend security system features.
  • SmartCam on-board code is encrypted, and the SmartCam processor's built-in decryption first decodes instructions from local storage before operation. Transmissions over the network are also encrypted between the camera, the network, the server and the display, especially including the output transport data stream from the SmartCam cameras.
  • the SmartCam unit processor electronics include an integrated microprocessor, DSP, RAM, ROM, flash memory, network sensor and additional video camera I/O, and a timing crystal.
  • Onboard connectors provide capability for adding an expansion board to add even more extra functionality to the SmartCam.
  • the optional hard-disk storage provides many hours of video capture (130 hours for a 30 GB disk) directly on board each camera unit.
  • the on-board Processor power and generation of compressed, motion compensated digital A/V/E data captured by the SmartCam provide previously unattainable autonomy, security and backup features that make the SmartCam camera unit and SCSS system unique.
  • Local SmartCam Data caching is provided by optional High capacity local hard disk storage.
  • the SmartCam camera unit has Generic device interfaces for connecting to local access devices and environment sensors, e.g., magnetic and optical bar-code card-readers, passive environmental condition sensors (temperature, IR, microphone, etc.)
  • a UDP O/P connection provides a high-speed, unidirectional video and audio transport data stream, (no handshake).
  • a TCP-IP I/O connection provides a lower speed, bi-directional interchange for communicating condition and status messages between the camera and the rest of the SCSS (sometimes called Data, Events and Control {D-E&C}).
  • the processor-equipped SmartCam units can be programmed for autonomous conditional actions (e.g., delay, sleep, sense, wake, store, and report). For power saving, some functions can be disabled or run in a low power, sampling mode. If a particular condition is detected, e.g., a change in the field of view or a change in a local environmental variable from one of the sensors connected to it, total SmartCam function can be restored, and the full resources of the SmartCam can execute powerful, high-level processing of its locally stored and real-time data and/or notify the server or broadcast a message through the network. Then the network (client or server) can request, e.g., the last five minutes of data before and the five minutes after the event.
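The pre/post-event retrieval described above (the five minutes before and after a detected event) amounts to a time-window query over time-stamped data; a minimal sketch, with the function name and frame representation assumed for illustration:

```python
def clip_around_event(frames, event_time, pre=300.0, post=300.0):
    """Return the frames from `pre` seconds before to `post` seconds
    after a detected event. Defaults match the five-minutes-before/
    five-minutes-after example in the text."""
    return [f for f in frames if event_time - pre <= f[0] <= event_time + post]

# Frames as (utc_seconds, payload) tuples, one per minute here.
frames = [(t, f"frame{t}") for t in range(0, 1200, 60)]
clip = clip_around_event(frames, event_time=600)
print(clip[0][0], clip[-1][0])  # 300 900
```

Because every frame carries a UTC time stamp, the same query works against the camera's local cache or against the server's stored segments.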
  • the SmartCam program code is encrypted and the processor is configured with passwords and real-time encryption-decryption capability for security.
  • the SmartCam Processor provides autonomous ability to discriminate, recognize and store motion, motion patterns, access event patterns and the like from the video images captured by the camera such as infrared wand or hand symbol strokes for access.
  • the SmartCam Processor also enables autonomous recognition of negative access symbols, e.g., signals or motion patterns that will give physical access to a local threat while simultaneously setting a silent alarm or System notification to summon backup support.
  • An optional hand controller is provided for easy selection and navigation of multiple SmartCam view scenes displayed at each ofthe SCSS SmartViewer display workstations.
  • the SCSS has a heretofore-unavailable "client/server MPEG multi-playback architecture" that permits viewing of stored, compressed digital video in multi-speed forward, reverse, and still modes without any loss or sacrifice in video quality. Supporting documents incorporated herein by reference are listed in the Table above.
  • FIG. 1 is an architectural block diagram of the present SCSS invention.
  • FIG. 2 is a block diagram of the system software architecture for the security system of Fig. 1.
  • Fig. 2A, 2B and 2C are detail views of the block diagram of fig. 2.
  • Fig. 3 is a perspective view for a hardware layout of security system 100.
  • FIG. 4 shows perspective and detailed views of a SmartCamTM video camera unit of system 100.
  • FIG. 5 is a plan view of a SmartCoder board for the SmartCamTM of fig. 4.
  • FIGS. 5A, 5B, 5C and 5D are detail views of major components for the SmartCoder board of fig. 4.
  • FIG. 6A depicts a block diagram of a preferred embodiment of the SCSS dynamic I/O buffer scheduler/manager.
  • Fig. 6B shows a detail block diagram of a preferred embodiment of the double buffered trick mode and off speed play system (DSS) for the SCSS invention in fig.
  • Fig. 6C and 6D illustrate additional detail block diagram views of the DSS shown in fig. 6B.
  • Fig. 6E depicts a block diagram of the SCSS dynamic I/O scheduler (DSS) and near-real-time read buffer (NRB) of the present invention in combination with the basic system OS I/O scheduler.
  • FIGS. 7 - 7G illustrate video display metaphors for the SCSS system of fig. 1.
  • Fig. 1. Security System Block Diagram Description. Camera groups, server and workstations.
  • Fig. 1 is a system hardware architecture diagram of an embodiment 100 ofthe present security system invention.
  • the system diagram of figure 1 is a top-level view of a preferred embodiment of a hardware architecture implementation 100 in combination with the system and application software functions described below that provide the features of the present SCSS invention.
  • Each SmartCamTM subnet 124 is a connected camera group of up to 16 separate video cameras arranged in a star configuration with 100BaseTX subnet switch 122.
  • the 16 camera subnet 124 can include multiple types of video cameras 126-156, e.g., one or more of the following types: SmartCamTM PTZ 126, SmartCamTM-F 128, SmartCamTM PTZ-ext 130, SmartCamTM-F-ext 132.
  • the subnet 124 can also include "dumb" video cameras (i.e., without added intelligence or extra capabilities) such as a conventional digital video camera 154 and a conventional analog video camera 156 if they are connected to the subnet switch through a separate SmartCoder board 158.
  • An additional 10 cameras 134-152 (not shown), selected from any of the aforementioned types 126-132 and 154, 156, can also be connected to the switch 122 to complete a 16 camera subnet.
  • An archive server 102 is connected to SCSS digital video camera subnet groups 104, 106, each servicing a Camera Unit group 124 of multiple remote SmartCamTM Camera Units 126-156. Each camera subnet 124 communicates with the SmartServer 102 by a CAT-5 USB interface 108 through the SmartServer's 100BaseT subnet switch 122.
  • The SmartServer 102 in turn communicates with one or more remote Client Unit video display and control monitors 110a-110c through a 1000BaseT switch and LAN 120.
  • The physical cabling for the SmartCam network communications (108, 120) in accordance with the present invention is simpler than that of existing security systems, which typically require multiple sets of separate cables, e.g., video coax plus twin lead for access control devices.
  • Present systems typically require multiple sets of USB cables, plus sets of twin lead for audio, phone, power, etc.
  • Fig. 2 data flow: SmartCam, SmartServer, SmartViewer.
  • FIG. 2 depicts a preferred software architecture implementation in accordance with the present invention operating in the hardware architecture 100 of figure 1.
  • The SCSS typically is configured with a SmartCam module 201 running on each of the Camera Units (cameras 126, 128, ...) in the subnets 104, 106, a SmartServer module 213 running on the Server Unit 102, and a SmartViewer module 215 running on each of the Client Unit workstations 110a-110c.
  • Network 108 employs TCP-IP protocol 202 to exchange event and control data 204 between SmartCamTM running on the Camera Units and SmartServer running on the Server Unit 102.
  • SmartCamTM uses UDP protocol 206 to communicate compressed audio-video stream data 208 from the Camera Unit camera to the SmartServer in the Server Unit.
  • Camera Unit block 200 details.
  • Referring to FIG. 2a, there is shown a detail of block diagram 200 for SmartCamTM Camera Unit 126 in camera subnet 104 of Fig. 1.
  • Camera Unit 126 represents a SmartCamTM-PTZ digital video camera unit 126 in accordance with the present invention.
  • The SmartCamTM-PTZ camera unit 126 includes a basic SmartCamTM video camera body 224 (including zoom control), a pan/tilt positioning and controller 226, and a SmartCoder 228 (including an MPEG coding chip 230).
  • The pan/tilt controller 226 receives pan and tilt body position control commands from the SmartCoder PT serial port 232 that drives the SmartCamTM body positioning mechanism (described further below) in the video camera body 224.
  • A serial-Z port 234 connects lens-positioning commands provided by SmartCoder 228 to a built-in zoom positioning mechanism (described further below) in the camera body 224.
  • Camera Command and Control data signals 204 are exchanged by TCP/IP protocol 202 over network 108 between the SmartCamTM 126 and SmartServer 102.
  • Compressed Video data signals 208 and event-sensor data signals 212 provided by the SmartCamTM 222 are sent by UDP protocol 206 through network 108 to server 102.
  • The SmartCamTM 126 is provided with a built-in SmartCamTM network interface 236 that supports data encryption 238 of the signals 204, 208, 212, providing security for the TCP/IP and UDP transmissions between remote camera 126 and the server 102.
  • the SmartCamTM 126 has an on-board configuration server 240 closely coupled to a camera control application 242.
  • Configuration server 240 provides data storage and retrieval for the compressed video/audio 208 and event/alert data signals 212 as directed by control application 242, through local bus 248 to local SmartCamTM disk storage 246.
  • Control application 242 is coupled to the SmartCamTM interface 236 by control bus 250.
  • the control application 242 communicates to MPEG coding chip 230 over bus/driver 252.
  • the MPEG chip 230 does the video compression of digital video signal 251 generated by the SmartCamTM digital camera body 224.
  • SmartCamTM autonomous/remote control positioning
  • Pan-tilt-zoom (PTZ) server control block 252 in application 242 provides the interface to the camera position mechanism through the serial ports 232 and 234.
  • "Hacker-proof" capability is provided by encryption of data streams transmitted over the network and of instructions in local storage, e.g., camera code.
  • The camera processor has to decrypt instructions from local storage before it operates.
  • The video, audio, and event data and instructions are encrypted before transmission and decrypted when they are received.
  • The external agent causing a camera to be logged off could be a natural cause, e.g., a lightning strike or power failure, or it could be a deliberate attempt by a hostile agent to invade the system.
  • The advanced logon necessary to recover can be performed from anywhere through an Administration console function.
  • The SmartCam camera has the ability to go to sleep and wake up on events, storing and retrieving whatever triggered it in the first place. The network (client or server) can then request, e.g., the last five minutes of data before and the five minutes after the event.
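The pre/post-event retrieval described above implies a rolling pre-event buffer in the camera. A minimal sketch of one way this could work (the class, method names, and five-minute horizon are illustrative assumptions, not the patent's implementation):

```python
from collections import deque

class EventBuffer:
    """Ring buffer of (timestamp, frame) pairs; keeps the last `horizon` seconds."""
    def __init__(self, horizon=300.0):
        self.horizon = horizon
        self.frames = deque()

    def push(self, ts, frame):
        self.frames.append((ts, frame))
        # discard frames that have aged past the retention horizon
        while self.frames and ts - self.frames[0][0] > self.horizon:
            self.frames.popleft()

    def around(self, event_ts, before=300.0, after=300.0):
        """Frames recorded within [event_ts - before, event_ts + after]."""
        return [f for ts, f in self.frames
                if event_ts - before <= ts <= event_ts + after]

buf = EventBuffer()
for t in range(0, 700, 10):          # simulated frame every 10 seconds
    buf.push(float(t), "frame@%d" % t)
clip = buf.around(600.0)             # retrieve footage around an event at t=600 s
```

With a 300-second horizon, only frames from t=390 s onward survive, so the clip covers 390-690 s around the event.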
  • The duplicate data caching, i.e., server and local camera storage, provides additional robust capability for data retrieval in the event of accidental or deliberate damage to one or the other.
  • The local processor is equipped with pattern recognition built into the camera unit, able to recognize access events, e.g., card swipes in an access reader, or infrared wand or hand symbol strokes for access.
  • Local processor capability includes recognition of negative access symbols that grant access but simultaneously set off all the alarms and bring backup support. For example, postulate a negative access event scenario: suppose one is being held by an armed assailant who threatens one unless a safe is opened. One can make, e.g., a backward "S" symbol by moving one's hand in the camera's field of view; this provides access but also simultaneously sets off all the silent alarms and calls the police anyway.
  • Negative symbol access pattern recognition can provide backup or rescue attempts; it could also trigger a complete system lockdown so that no one can open the facility.
  • One can give visual cues for positive or negative access while in the field of view of the SmartCam video camera. The power of this function provided by the present invention is appreciated by only a few at this time.
  • the motion-adaptive MPEG system improves over prior art systems that use motion JPEG for digital video.
  • In motion JPEG security systems, once the massive amount of data required for motion JPEG is stored, there is still no economical, efficient way to use any event information that may have been activated during the video recording, because it has no relation or connection to the JPEG video data.
  • the motion adaptive video compression security system invention creates a very elegant, efficient, and economical way to synchronize data and information.
  • Events, e.g., alarms, sensor data, personnel access, on-site personnel complement changes, and the like, are also recorded and stored for later retrieval by reference to the time-stamped historical video.
  • The time stamp provided by the digital video camera is basic, but the SmartCam video camera can provide more than just a time stamp.
  • The SmartCam camera and its processor can detect motion in the view field. With the camera set looking at a view field in a room, something moving in the room, for example an opening door, is detected as an event. The detection of door motion is automatically synchronized to the recorded video.
  • Areas ofthe view field can be masked.
  • a camera set up to look outside a building with a bush in the field of view that is constantly moving from the wind can be provided with a mask to ignore the motion of the bush.
  • Sensors that can be connected to the camera's generic interface include, e.g., fire sensors, temperature sensors, and building controls.
  • With a temperature sensor mounted in a fan bearing assembly of an HVAC system, an over-temperature condition on the bearing can cause the camera to pan, tilt, and zoom to look at the bearing, zoom in to give a high-resolution image of it, and then set off an alarm that automatically alerts maintenance personnel, for example by calling their pagers. They can then go to any Internet terminal to actually see the bearing or the scene shown by the camera.
  • The SmartCam video camera has superior infrared capability: what your eye sees as total darkness, it sees as fully illuminated. It has high sensitivity, about 0.001-lumen sensitivity.
  • the camera is optionally equipped with illuminators such as conventional lights, IR illuminators, or recently developed white LED lights, that light up an entire room, in color.
  • A SmartCam video camera equipped with one single white LED light can image a scene in full color.
  • The SmartCam video camera has a port for connecting to five additional video inputs. For example, it can connect to a fingerprint camera, an iris-reading camera for reading iris patterns, and the like. Their outputs can also be switched to the server.
  • There are two encoders on the SmartCam camera board. Two cameras can have their video scenes encoded at the same time, with their output streams simultaneously delivered to the server over the network.
  • The SmartCam video camera can thus be encoding, transmitting, and storing video and events from two cameras at the same time through the single camera controller board.
  • In the SmartCam Camera Unit there is sufficient processing power to do local biometrics.
  • The SmartCam can do fingerprint recognition processing on board.
  • There is enough storage capacity in the single board Camera Unit to store an entire employee fingerprint database, to compare with the fingerprint data captured by the camera.
  • A Camera Unit can store the fingerprints of 100 employees who are supposed to come through a particular access door, take the video of their fingerprints as they enter, do the comparison right on the camera board, and accept or reject entry automatically.
  • The SmartCam has an input to the video iris at the MB1 connector, so the camera can allow access for the employee by itself, without human intervention.
  • That database is stored in local flash memory, up to 128 MB (or more, depending on the flash memory package selected) in the flash cartridge on the board.
  • The processors on the board include a 400 MIPS RISC processor, an additional processor (e.g., a DSP chip), DMA circuitry, a dual encoder chip, and the video input port MB1.
  • the Camera Unit only needs one board.
  • the single board Camera Unit implementation has 40 GOPS of processing power.
  • The single board implementation can do full dual camera, access management, and biometrics processing (e.g. fingerprint recognition, iris recognition, histographic display processing), while processing video images with at least 720x480 pixels, standard SD density.
  • the SmartCam architecture provides for an expansion board in the Camera Unit for enhanced capability of normal applications and for additional applications.
  • The SmartCam PC Board has two expansion connectors that allow all of the PC Board data buses (audio, video, and PCI) to connect to another card that is the same size as the first.
  • The second card provides expansion capability, e.g., adding in full biometrics processing.
  • Full biometrics processing expands on fingerprint recognition to include, for example, face recognition; for really complex applications, an MRI can be processed locally.
  • the SmartCam can run an MPEG-2 high-definition TV output.
  • a SmartCam Camera Unit can run a stand-alone multimedia kiosk for a venue like the Olympics, e.g., a camera, a 16 by 9 display with a browser running in the MIPS processor.
  • the dual board implementation provides high-definition TV and advanced biometrics, e.g. face recognition.
  • The SmartCam video camera can do complex biometrics processing, e.g. face recognition, target recognition, MPEG-21 processing, MPEG-7 content searching, and HDTV input and output (e.g., HD density of 3.2 million pixels vs. 800,000 pixels for the CCIR.601 SD standard).
  • An alternative advanced generation SmartCam HDTV camera has even more capability.
  • The SmartCam HDTV camera uses two CCDs with a single prism: one CCD with a green filter, the other CCD with a red-blue half-resolution mask filter.
  • The filtered and masked CCDs provide full color when combined with the prism.
  • The red-blue CCD filter has half the resolution of the green to match the spatial resolution and sensitivity characteristics of the human eye (57 percent green, 33 percent red, and the balance blue).
  • SmartCamTM video cameras 126-158 may have a wide range of performance.
  • SmartCamTM Camera Units range from the complete PTZ SmartCamTM capability, including pan/tilt/zoom stepper motor driven cameras 126 with full onboard microprocessor intelligence and local storage, to "dumb" digital cameras (SmartCamTM-C 154, with little or no added onboard intelligence) connected through a "SmartCoder" board 158.
  • Conventional analog video cameras 156 are usable with the security system of the present invention if they interface through another SmartCoder board 158.
  • SmartCam Camera Unit options include:
  • SmartCam1-PTZ: a Pan/Tilt/Zoom camera with embedded SmartCoder (MPEG compressor, 100BaseT network).
  • SmartCam1-F: a fixed position version of SmartCam1-PTZ. It uses a similar camera mount cradle but is not motorized. It is positioned by hand.
  • SmartCam1-PTZ-Ext: an exterior (outdoor, weatherproof housing) version of SmartCam1-PTZ.
  • SmartCam1-F-Ext: an exterior (outdoor, weatherproof housing) version of SmartCam1-F.
  • SmartCam1-PTZ-C: a stripped-down version of SmartCam1-PTZ with the camera and motor system of SmartCam1-PTZ, but without the SmartCoder MPEG module. It connects using conventional coax and RS232 serial cables.
  • The SmartCoder MPEG module takes several configurations.
  • SmartCoder1 is the MPEG2 encoder module used in SmartCam1, but in a stand-alone configuration. It is used for interfacing existing, conventional video cameras and other video sources into the present invention's security systems network.
  • SmartCoder2 is configured for HD resolution or MPEG4.
  • SmartCoder configurations include:
  • SmartCoder1-XS: 1 channel MPEG2 encoder with 100BaseT NIC and RS232 device control port in a plastic box with external power and optional hard drive.
  • The box is the SmartCam base unit and is about 6"x7"x2".
  • SmartCoder1-S: 2 channel MPEG2 encoder with 2 100BaseT NICs and internal power supply in a 1U, 19" rack mount, metal case, approximately 9" deep.
  • SmartCoder1-M: 6-channel MPEG2 encoder with internal 100BaseT switch and 1 100BaseT NIC. 1U, 19" rack mount, metal case, approximately 24" deep.
  • SmartCoder1-L: modular system supporting up to 16 MPEG2 boards in a 5U, 19" rack, metal case.
  • A selected real-time compressed video stream 209 from SmartCamTM units can optionally be passed directly to SmartViewer stations through parallel UDP channel 214 by a built-in server function (Linux mrouted 216) communicating through the network/switch 120 with the SmartViewer stations 110.
  • Fig. 2b illustrates a block diagram, in accordance with the present invention, of the SmartServer 102 shown in Fig. 1.
  • The SmartServer 102 of the present invention provides users with greatly improved capability to play MPEG video backwards, forward, in still, and in slow motion in either direction. These capabilities were previously extremely difficult or impossible with prior methods; the SmartServer makes it easy to do what has sometimes been referred to as playing "trick modes".
  • the SmartServer also provides the capability to associate and select related video streams on user command, for delivery to the SmartViewer stations for playing them together, e.g. on adjacent monitors.
  • The SmartServer architecture can be implemented either in hardware or software. It was originally implemented as a prototype with an 850 board. However, a current implementation of the SmartCam/SmartServer (with a single, fast PentiumTM IV processor) can do simultaneous decodes of up to 8 compressed video streams. It is expected that performance will scale as processor speeds continue to increase.
  • SmartServer 102 receives data from SmartCam 126 through SmartCam network interface 250.
  • Incoming UDP data is routed in parallel by the network interface 250 in two directions: through a standard Linux mrouted function 254 directly to the SmartViewer block, and through decryption/verification block 256 to demux block 260.
  • The decryption/verification block 256 decrypts and verifies the incoming transport stream and feeds the demux block 260 in audio/video block 258.
  • the demux block 260 separates the decrypted transport file stream into video data stream 270 and audio data stream 272.
  • The two data streams 270 and 272 are fed through respective video and audio file segmenting blocks 274 and 276 that segment the file streams into corresponding synchronized one-hour segments with unique file identification.
  • Segmented video data 278 files and segmented audio data 280 files are stored on the server hard disk in separate files: video file segments in video file 282 and audio file segments in audio file 284.
  • File segmenting blocks 274 and 276 also output segmented event and sensor data files, with related unique file identification, that are stored on a third server disk file 286 of the SmartServer File Store. When demanded by a server client, video, audio, and data files 288, 289 and 290 are retrieved by data retrieval block 292.
  • The decryption/verification block 256 also communicates real-time event data 294 to event data block 296.
  • Event data block 296 has an event logger 224 and configuration manager 222.
  • the event data block 296 operates an SQL database 297 for handling configuration-state, event data retrieval.
  • the SQL database 297 exchanges SQL commands and data 298 regarding configuration state and events with event logger 224 to store and identify event data received from the SmartCam 126.
  • the SQL database 297 also exchanges SQL commands and data regarding configuration state and events with the SmartViewer through its network interconnection 112 using TCP/IP protocol.
  • Transcoding for the data server and multimedia is incorporated in the system software as described in U.S. patent 6,188,428, "Transcoding video file server and methods for its use", which is incorporated herein by reference.
  • In the SmartServer archival store, every event is automatically time-stamped to its corresponding video frame.
  • the time-stamp is created by the time encoded with the video frame data.
  • the stored data is organized for convenience as three kinds of files: video, audio, and data.
  • the continuous streaming video/audio/event-data is divided into one-hour segments.
  • the files are segmented into one-hour segments for easier storage and retrieval.
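The one-hour segmentation with unique file identification might be implemented along these lines. This is a minimal sketch; the file-naming scheme, function name, and camera-id parameter are illustrative assumptions, not the patent's actual format:

```python
import time

def segment_id(camera_id, ts, kind):
    """Map an epoch timestamp to its one-hour segment file name.

    `kind` distinguishes the three stored file types: video, audio, data.
    """
    hour_start = int(ts) - int(ts) % 3600          # truncate to the containing hour
    stamp = time.strftime("%Y%m%d-%H00", time.gmtime(hour_start))
    return "cam%03d-%s.%s" % (camera_id, stamp, kind)

# Every timestamp inside the same hour maps to the same segment file,
# so video, audio, and event data stay synchronized by construction.
name = segment_id(126, 1041379200 + 1800, "video")   # 2003-01-01, 00:30 UTC
```

Because the segment name is a pure function of camera and hour, retrieval by time stamp reduces to computing the name and opening the file.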
  • the motion adaptive audio/video transport streaming is generated in the camera.
  • the SCSS system includes a very powerful, sophisticated and flexible alarm and alert handling system.
  • the event handling system logs all system events, such as motion, sound, door, window, smoke and fire alarms, alerts such as access to secure areas, internal events such as SmartCam encryption and network errors, and administrative events such as security log in, log out, administrator accesses, configuration changes, etc.
  • The SCSS uses a standard relational database for event handling, preferably selected from one of a number of conventional full-featured, highly reliable SQL database applications.
  • A relational database provides features for "data mining", e.g., finding all occasions when a particular combination of alarms occurred and a certain staff member was on duty, or plotting a graph of the occurrence of a certain alarm at each hour of the day for the last 2 years.
  • Each event record contains the following information: o Current alarm state (For each event state, New, Acknowledged and Reset (see below), the following information is saved) o Event code o Event code optional argument
  • Each approved user of the system has a numeric identifier between 1 and 2**32. Users, even ex-employees, are never removed from the system; they are just marked as "defunct" and their code is never reused.
  • User names are displayed on the user interface via the language database to support alternate character sets, i.e. Western, Chinese, Kanji, etc.
  • Each user's record in the user database includes an authorization level. The number of authorization levels and the descriptions are configurable in the user database.
  • a typical setup may be:
  • 0: no authorization (may be used for system administrators or people who need to be tracked in the system but do not have authorization in the event system)
  • 1: low level user
  • Each user has system administration levels which are different from the authorization levels used for event handling (system administrators have special privileges allowing them to change system settings). As discussed above, every type of event in the event configuration database has authorization levels for acknowledge and reset operations. If a user with insufficient authorization attempts to acknowledge or reset an event, the action will fail.
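The per-event authorization checks and the New/Acknowledged/Reset lifetime described in this section can be sketched together as a small state machine. The event names, level numbers, and record layout below are illustrative assumptions, not the patent's schema:

```python
# Per-event-type authorization levels (illustrative configuration data).
ACK_LEVEL = {"motion detected": 1, "fire detected": 2}
RESET_LEVEL = {"motion detected": 1, "fire detected": 3}

class Event:
    """Event record with the New -> Acknowledged -> Reset lifetime."""
    def __init__(self, code):
        self.code = code
        self.state = "New"
        self.history = [("New", None)]       # per-transition records

    def acknowledge(self, user_level):
        # Fails if the event is not New or the user lacks authorization.
        if self.state != "New" or user_level < ACK_LEVEL[self.code]:
            return False
        self.state = "Acknowledged"
        self.history.append(("Acknowledged", user_level))
        return True

    def reset(self, user_level):
        # Reset can require a higher level than acknowledge.
        if self.state != "Acknowledged" or user_level < RESET_LEVEL[self.code]:
            return False
        self.state = "Reset"
        self.history.append(("Reset", user_level))
        return True

e = Event("fire detected")
e.acknowledge(2)      # succeeds: level 2 may acknowledge a fire event
e.reset(2)            # fails: resetting a fire event requires level 3
```

This mirrors the configuration discussed in the text, where a non-supervisor may acknowledge an event but only a supervisor may reset it.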
  • An event has a "lifetime” in the Evolution system.
  • an event such as motion detection occurs
  • an event record is created by the event logger and marked with the state "new".
  • this may sound an alarm buzzer or voice, flash a red alarm light on the SmartViewer user interface or call the police.
  • the event will be logged into the event database as described below.
  • An approved security officer will "Acknowledge" the event. This may be accomplished by entering a password into a SmartViewer station or, if available, touching a fingerprint recognition pad.
  • The state of the "Current event state" tag in the database is changed to "Acknowledged" and an additional record is added to the event record to save the information relating to the acknowledge, as described below.
  • The event is either moved to an "acknowledged list" or its color is changed from red to yellow.
  • Resetting an event is similar to acknowledging an event. Both event acknowledge and reset can be configured, on an event-by-event basis, to only allow operators of a given authorization level to act on that event. So, for example, the configuration manager can be set to allow non-supervisor level operators to acknowledge a certain event, but require a supervisor to reset it. Like acknowledge, resetting an event changes the value of the "Current Event State" record to "Reset" and adds information to the event record containing the following data:
  • Event Code optional argument: It is sometimes convenient to group together related events, such as "server disk 80% full", using event ids with optional arguments.
  • Optional arguments follow a format, familiar to computer programmers, called the "printf" format.
  • the English text version of this event string will be "server disk %d full” and the optional argument will be the actual percentage.
  • the user interface will convert the %d into a decimal number for display.
  • the argument types can be numbers or strings (text).
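The printf-style event strings and the language database could look roughly like this sketch. The event code number, the French text, and the use of `%%` to render the literal percent sign are my assumptions for illustration:

```python
# Per-language format strings keyed by event code (illustrative data).
LANG_DB = {
    "en": {54000: "server disk %d%% full"},
    "fr": {54000: "disque serveur plein a %d%%"},
}

def render(event_code, args, lang="en"):
    """Expand a compact (event code, arguments) record into display text.

    Uses printf-style expansion; only the small code + args pair is stored,
    so the same record renders in any supported language.
    """
    return LANG_DB[lang][event_code] % args

msg = render(54000, (80,))
```

Storing only the code and arguments keeps the event log compact and language-neutral; translation happens at display time, as the text describes.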
  • The event code is a 32-bit number (about 4 billion possible codes) that is translated to a human readable text message (and optionally a voice message) by the language database. This allows the database to be read by operators in any of the available languages in the Evolution system without any change to the database. Language selection is chosen on the SmartViewer user interface, and it is possible to support multiple languages on different SmartViewer stations in the same facility, simultaneously.
  • Event source id:
  • Each event source, such as a camera, a motion detector, an access control pad, or the system itself, has a unique Event Source Id number from 1 to 2**32.
  • The event sources are divided into groups: 0 for the system, 1-999,999 for cameras, 1,000,000 to 1,999,999 for sensors, etc.
  • The Event Source Id is converted into text via the language database.
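The source-id grouping above can be expressed as a simple range classifier. The "other" fallback for ids outside the listed ranges is my assumption:

```python
def source_group(source_id):
    """Classify an Event Source Id by the ranges given in the text:
    0 = system, 1-999,999 = cameras, 1,000,000-1,999,999 = sensors."""
    if source_id == 0:
        return "system"
    if 1 <= source_id <= 999999:
        return "camera"
    if 1000000 <= source_id <= 1999999:
        return "sensor"
    return "other"          # assumed fallback for unlisted ranges

group = source_group(126)   # a camera-range id
```

Because the group is implicit in the numeric range, no extra column is needed in the event record to distinguish cameras from sensors.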
  • the SmartViewer user interface can be set to UTC or Local time display, but this has no effect on the SmartServer. It always works in UTC.
  • Standard system configuration The SmartCam system 100 comes pre-configured with standard events such as “motion detected”, “fire detected”, “door opened”, “camera network failure”, “supervisor login”, etc. in all supported languages. Standard events use event codes 0-999,999. The system administrator can add site-specific events above 1,000,000.
  • This event is unique to ABC Bank Branch 123 and has the code 0054257.
  • Code numbers are created by the system by adding 1 to the last event code.
  • Time codes in the Evolution system are based on Universal (formerly known as Greenwich) Time Codes. This time code is the same all over the world, does not change between summer and winter, and is unambiguous. Internally, time codes are saved in seconds and milliseconds (1/1000th second) since midnight, Jan 1st, 1970 in Greenwich, England. Time is displayed in the user interface in the standard format YYYY:MM:DD-HH:MM:SS in the 24hr clock system. All event transitions (New, Acknowledged, Reset) are tagged with this time code.
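A sketch of rendering the stored (seconds, milliseconds) UTC time code in the display format given above. The milliseconds field is accepted but not shown, matching the stated display format; the function name is my own:

```python
import time

def display(seconds, millis):
    """Render a stored UTC time code as YYYY:MM:DD-HH:MM:SS (24hr clock).

    `seconds` and `millis` are the internally stored values since
    midnight, Jan 1st, 1970 UTC; milliseconds are kept for ordering
    but omitted from the on-screen format.
    """
    return time.strftime("%Y:%m:%d-%H:%M:%S", time.gmtime(seconds))

label = display(1041379200, 0)   # midnight, Jan 1st, 2003 UTC
```

Using `gmtime` rather than `localtime` keeps the server working purely in UTC, as the text requires; any local-time conversion belongs in the SmartViewer.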
  • Zone id: A zone is a grouping of cameras and other sensors into a logical group, usually a physically separate area within a building, such as "main lobby", "vault", etc.
  • Zones are created by the system administrator based on the physical layout of the facility. It is not necessary to create zones, but they can be a useful tool for managing related events.
  • A unique Zone Id, a number between 1 and 2**32, is assigned to each zone. Zone Ids are converted into text by the language database.
  • SmartViewer Client Unit (monitoring and control) stations detail data flow.
  • SmartViewer1 is the user interface to the present invention's security system. It provides a graphical user interface with a 4x4 grid (16 camera views) of video images on the primary graphics display. It also supports 4 slave monitors, each displaying 4 full-size, real-time video windows.
  • The user interface provides PTZ (Pan Tilt Zoom) control for each SmartCam on the present invention's security network.
  • The user interface also provides access to the alarm/alert database and supports historical review of archived video and audio from the SmartServer.
  • The historical review feature allows for random access, frame-by-frame, or jog-shuttle type control and playback of video at play speeds of 1/30th real time to 50X real time in the forward or reverse direction.
  • SmartViewer is a Pentium4 based PC running RedHat 7.2 Linux, with a high performance 2048x1536 pixel resolution AGP video card and up to 4 four-channel MPEG2 real time decoder cards.
  • SmartViewer1 components may include: Pentium4 2GHz processor, 512MB SDRAM, a 60GB ATA100 HDD, a 1000BaseT NIC, an Nvidia GeForce2 Titanium AGP4X, and 4 TL850(FT) MPEG2 decoders.
  • a standard commercial chip provides 2D and 3D Identify and reference filtering functions for the system.
  • The SCSS system automatically generates an event log that is time-base-stamped directly to the video. From the SmartViewer GUI interface, one can go directly to the video associated with an alarm or alert event and see what happened at (and around) that event. No existing system provides such time synchronization between recorded video, audio and an associated alarm or alert event.
  • UDP protocol, i.e., unidirectional transmission, is used on the network between the camera and the server for the video and audio transport streaming.
  • TCP-IP protocol, i.e., bi-directional transmission, is used on the network between the camera and the server for event-data and control.
  • TCP-IP is also used between the server and the GUI interface monitors.
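The UDP side of this split (unacknowledged, fire-and-forget transport-stream packets) can be illustrated with a self-contained loopback sketch; the packet contents are a hypothetical placeholder, and the OS assigns the port:

```python
import socket

# Receiver stands in for the server's stream port; UDP delivery is not
# acknowledged, matching the unidirectional transport described above.
rx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
rx.bind(("127.0.0.1", 0))                  # port 0 = OS-assigned, for this demo
rx.settimeout(5.0)
addr = rx.getsockname()

# Sender stands in for the camera pushing one 188-byte MPEG-2 TS packet.
tx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
packet = b"\x47" + b"\x00" * 187           # 0x47 is the TS sync byte
tx.sendto(packet, addr)

data, _ = rx.recvfrom(2048)
tx.close()
rx.close()
```

Control and event traffic, by contrast, would use a TCP connection (e.g. `socket.create_connection`), since commands and event records must be delivered reliably and in both directions.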
  • SmartViewers include a configuration browser 218 communicating with the SmartServer 102 through an Apache Web server link 220 to exchange database commands 221 and retrieve stored audio/video data 223.
  • the SmartCam system can also do post processing of events, for example, with MPEG 7.
  • the SmartViewer 241 also has a SmartCam network interface 250 with a decryption/verification block 256 that receives the routed UDP data 252 from routed block 254 in the server 102 over the separate network 112.
  • the routed data 252 is decrypted and verified by the SmartViewers' decryption/verification block 256 and fed to a stream routing block 243 which feeds demux 247 inside HDTV decode block 245.
  • Each driver 255, 257, 259, 261 drives 4 displays to give a total of 16 displays.
  • The API driver combination is replicated four times. These are HDTV decoders for up to 16 windows.
  • Demux 247 splits the routed data 252 into MPEG audio streams 252 and MPEG video stream 251.
  • The video streams 252 and 251 are recoded by video PIP 253.
  • The audio and video streams 252, 251 are combined by four stream multiplexer 253 to feed video drivers 255, 257, 259, 261.
  • Each video driver is connected to one of the four video display units 114-120, each of which can display up to 16 windows.
  • Referring to FIG. 3, there is shown a perspective view of a hardware implementation, in accordance with the present invention, of the SCSS system shown in figures 1 and 2.
  • Fig. 3 represents a hardware implementation of the software architecture shown in figure 2 and the system diagram shown in figure 1.
  • Two SmartCam subnets are shown connected to the archive server disposed in a server rack for example in one floor of a building.
  • the server connects through the 1000BaseT switch and the Ethernet connection to a separate SmartViewer station located, for example, on a different floor of the same building, or even in a nearby building, removed from the archive server and camera subnets by a distance limited only by the particular capability of the Ethernet connection.
  • Each 16-camera subnet 124 connects to the archive server 102 over the 100BaseTX network 108 by a single CAT-5 cable 302.
  • a single cable 302 can serve up to 16 cameras for example on one floor of a building, or one large room of one floor.
  • the archive server 102 and 1000 BaseT switch are typically located in close proximity to one another, for example on a server room rack 302 with the switch communicating to remote SmartViewer station 110.
  • the SmartViewer station 110 acts as a master station that can drive up to the four separate monitors 114,116,118, 120 for displaying multiple video windows or full- size video images (as described further below).
  • the security system 200 can be populated with a plurality 212 of additional SmartViewer master stations 110 at other remote locations by simple scaling of the switch 112 interconnections.
  • Fig. 4 SCSS SmartCamTM Camera unit.
  • Figs. 4A-4E depict several alternative perspective views of a SmartCam camera unit 400.
  • Camera 400 is one member of one of the camera subnets 124, 204 shown in fig. 1, fig. 2 and fig. 3.
  • the SmartCam unit 400 is a remote pan/tilt/zoom video camera with an embedded Web server, described below.
  • Fig. 4A shows one camera body 400 with zoom lens 402 projecting out from it. Lens 402 focuses along zoom axis 403 by a programmable zoom factor according to the specifications of camera body 400.
  • the body 400 is supported and positioned by pan/tilt mechanism 404.
  • the pan/tilt mechanism is mounted on base unit 406, which houses the camera electronics module, hidden from view in fig. 4 but described further below.
  • Each camera electronics module separately receives power and exchanges video, audio, data and control signals with the server 102 through its connection with subnet 100BaseT switch 122 and network 108.
  • Cable bundle 408 provides electrical connection between the camera electronics module inside base unit 406 and the camera body 400, and provides the electronic interface from the server 102 to the camera body and the positioning mechanism 404 through the system network cables 302 and switches 122.
  • Electrical connections 410 communicate power, control and digital video signals between the camera body 400 and the base unit 406.
  • the mechanism 404 includes a bifurcated yoke 412 mounted to rotate about a first rotating axis (indicated by arrow 414) for panning the camera lens around a horizontal plane orthogonal to the first axis when the base unit is mounted with axis 414 oriented vertically.
  • the yoke is fitted with a second rotating axis assembly 416 orthogonal to the first axis.
  • the second assembly 416 supports and positions the camera body so the lens can be angled to view up and down over a range of elevation (pitch angle) relative to the horizontal plane.
  • the Camera Unit 400, a SmartCam PTZ video camera unit, is a high-resolution pan/tilt/zoom video camera with an embedded MPEG2 encoder and a 100BaseT network interface.
  • SmartCam provides new levels of image quality, features and affordability for professional video surveillance and security applications. Based on a super sensitive CCD imager and professional grade optics, SmartCam provides crystal clear images at levels as low as 0.02 lux at zoom ratios of up to 220X.
  • a high quality, stepper motor drive system provides years of trouble free use with high positional resolution and accuracy.
  • the advanced user interface provides user-friendly control of camera position and zoom via a point and click interface.
  • the embedded MPEG2 encoder produces DVD-quality, real-time, full-motion video at data rates from 2.2 to 14 Mbps.
  • the image quality available to security staff is far higher than conventional analog and motion JPEG systems.
  • MPEG2 is the most efficient storage format for high quality video and audio; hence its use in DVD and digital satellite broadcasting.
  • An optional internal hard disk drive provides for backup video storage in the camera in case of damage or destruction of the control file server.
  • An optional, high quality directional microphone allows for recording of sound with the video.
  • SmartCam also provides a variety of general-purpose digital and analog input and output signals for low cost connections to motion sensors, door and window contact closure switches, electronic door locks, temperature, fire and smoke sensors, etc.
  • Microphone Specifications: Electret condenser microphone elements deliver sonic clarity and high output. Super-cardioid polar pattern picks up only those sounds in the view of the camera and rejects any disturbing ambient noise.
  • Variable or fixed bit rates, 2.2 to 14 Mbps, at full D1 resolution • Half D1 and SIF optional
  • Onboard Storage Specifications Up to 160 GB of in camera storage for local storage of up to 12 hours at full-size and full frame rate.
  • IP Based Can be viewed from any Internet connection, with proper security passwords.
  • Embedded Linux for high reliability and security.
  • Pan and Tilt - Point and Click Interface that allows the user to control pan and tilt of the camera by clicking on the image. The location that was selected is now "centered" in the image.
  • Zoom Factor - Point and Click Interface that allows the user to adjust the zoom from 1x to 220x.
  • Video Input - 5 auxiliary inputs, the video-input format (Composite, S-Video, RGB, YUV) can be selected through the use of a menu pull down.
  • Image Quality The image quality can be adjusted through the use of a menu pull down.
  • Image Resolution The displayed image resolution can be selected from six resolutions (160 x 120, 320 x 240, 480 x 360, 640 x 480, 720 x 483 and 720 x 580) through the use of a menu pull down.
  • Compression Selection - Compression parameters (bit rate, GOP size, etc.) can be selected through the use of a menu pull down.
  • SmartCam unit 400 connects to the server 102 through the network 108.
  • the SmartCam video unit has an Ethernet port (described below) that carries the network 108 UDP data and TCP/IP data streams simultaneously.
  • the unit 400 also provides RS 232/RS 422 serial port (port RSP-1) access to other sensors or access ports.
  • In each camera unit 400 there is provided an access port (not shown), port MB1, to connect up to five additional camera ports.
  • one SmartCam 400 connects to a fingerprint camera, an iris camera for reading retinal patterns, etc.
  • SmartCam 400 outputs can also be switched through the Ethernet port EP-1.
  • the SmartCam 400 unit can optionally run two cameras at once with two built in encoders provided. With two cameras selected both video streams can be encoded at the same time, with their output streams delivered simultaneously through the Ethernet port EP-1.
  • the one SmartCam unit can be encoding, transmitting, and storing video and events from two video camera bodies at the same time through the single SmartCam controller board.
  • In FIG. 5 there is shown a plan view of an implementation of a SmartCoder board 500 in accordance with the present invention, part of the SmartCam system shown in figure 4.
  • the SmartCoder board 500 provides features and functions described in U.S. patent
  • SmartCoderTM is a network appliance that performs real-time network video processing functions.
  • the SmartCoderTM board 500 has the same form factor as a 3.5-inch hard disk drive. This small form is accomplished by utilizing several state of the art system-on-a-chip ICs.
  • SmartCoderTM utilizes a fast CPU 502 (300Mhz 64Bit MIPS CPU) running the Linux operating system. SmartCoder is controlled remotely via a 100Mbit Ethernet network interface 504. SmartCoder can stream compressed video to IP networks using standard network protocols.
  • SmartCoder board also supports SmartCodecTM and SmartNetworkTM expansion boards (not shown) having the same form factor as the SmartCoder.
  • Analog Component Video, or CCIR656 Digital video, is compressed in real time by an MPEG systems encoder 506 (NEC uPD61051).
  • the MPEG Encoder outputs an MPEG1 or MPEG2 Video Program and MPEG Layer 2 Audio Program encapsulated in an MPEG Transport stream at either variable or constant bit rates of up to 16 Mbits per second.
  • the MPEG Encoder 506 has a dedicated MPEG transport output 507 interface and is connected directly to one of the three transport inputs of a commercial multi-function STB/PVR Controller 508 (Teralogic TL811 PVR Set-top controller).
  • the controller 508 provides multiple functions: multiple transport demux, descramblers, conditional access, PCI bridge, CPU local bus, I2C, and SmartCard interfaces, IDE interface, UART and GPIOs.
  • the STB/PVR Controller 508 is the central nervous system of the SmartCoder design. All system buses originate from the TL811.
  • System memory is controlled by the TL811.
  • SmartCoder utilizes 64Mbytes of PC 100 SDRAM 510 for system memory.
  • the STB/PVR Controller 508 contains three function specific RISC processors, MPEG2 transport stream processors 548 (2 each) and IOP (I/O Processor) 550.
  • the MPEG2 transport stream processors 548 are capable of indexing, de-multiplexing, and routing transport packets at data rates of up to 80Mbits/sec.
  • the IOP 550 offloads I/O processing from MIPS CPU.
  • the IOP controls UDMA66 Scatter/Gather DMA 552, Network DMA Engine 554, Smartcard 556, I2C devices 558, Local Bus DMA engine 560 and UARTs 562.
  • the SmartCoder board's video processing starts with a video decoder 564 (Philips Semiconductors SAA7118).
  • the video decoder 564 is connected to the STB/PVR Controller 508 by an I2C control bus 566.
  • Analog video received by the SAA7118 is converted to CCIR656 digital video 568.
  • the CCIR656 video is then routed by a digital cross bar switch 570 to either the MPEG2 Video encoder 506 or to an expansion bus 572 (connectors 572A and 572B).
  • the MPEG ENCODER 506 performs all of the on-board video compression processing. It receives CCIR656 video input 584 from the digital cross bar switch 570, which receives input from one of three possible sources: source one is the SAA7118 video decoder, source two is CCIR656 dedicated input video connector 588, and source three is the expansion bus connector 572.
  • the CCIR656 video 584 is compressed under program control using either MPEG1 or MPEG2 coding at constant or variable bit rates from 1 Mbps to 14 Mbps. Two channels of PCM audio are also compressed using MPEG Audio Layer II coding. The compressed audio/video stream is output as an MPEG2 Transport stream.
  • the STB/PVR Controller 508 receives the compressed audio/video stream from the MPEG encoder's transport stream output port 528. Using one of the transport stream processors 548, it then generates a frame-accurate time index from the compressed audio/video stream. The frame index and the compressed audio/video stream are then either stored on local SmartCam unit hard drives (not shown) or encapsulated in an IP network streaming protocol and sent by 100BaseT Ethernet to a decode client, e.g., a SmartViewer workstation.
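The frame-accurate time index described above can be illustrated with a minimal sketch. The patent does not give the index record layout, so the fields (a UTC time and a byte offset per GOP) and the fixed GOP duration are assumptions chosen only to show the seek mechanism.

```python
def build_index(gop_offsets, start_utc, gop_duration_s=0.5):
    """Map each GOP's byte offset in the stored stream to a wall-clock time."""
    return [{"utc": start_utc + i * gop_duration_s, "offset": off}
            for i, off in enumerate(gop_offsets)]

def seek(index, utc):
    """Return the byte offset of the last GOP starting at or before utc."""
    best = index[0]["offset"]
    for entry in index:
        if entry["utc"] <= utc:
            best = entry["offset"]
    return best
```

With such an index, a playback request for a given moment resolves directly to a byte offset in the stored transport stream, which is what makes frame-accurate retrieval from the File Store fast.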
  • Y + C inputs 580: There are eight analog Y + C inputs 580 (or four analog component inputs 582) with embedded or separate sync. There are also two CCIR656 video inputs 584 and one CCIR656 video output 586. One CCIR656 input is on dedicated connector 588 and the other is on expansion bus connector 572. Two camera connectors 590 are available; each camera connector utilizes the same Y + C input channel. The remaining analog video inputs are on D4 video breakout connector 594. Break-out cables 596 then provide a combination of Y+C and/or component video inputs.
  • CCIR656 Video 584, 586: Three P15C16214 12-bit 3-to-1 Mux/Demux devices 598 are arranged as cross bar switch 570 for the CCIR656 Video 584, 586.
  • CCIR656 video can be routed to/from the MPEG Encoder, Expansion Connector or the Video decoder. Analog video inputs are switched via the video decoder 564.
  • a brief summary of some expansion boards optionally included in a SmartCam unit in accordance with the present invention are listed below:
  • High Definition MPEG Decoder for conferencing applications.
  • a Video DSP for machine vision and real-time imaging processing applications.
  • Ethernet port EP-1 PC card: this is the controller; UDP data and TCP/IP data streams come out of the Ethernet port simultaneously.
  • RS 232/RS 422 port connects to sensors or access ports; that is serial port MB1-P4.
  • [MB1-P4] connects to five additional video inputs. For example, it connects to a fingerprint camera, an iris camera for reading retinal patterns, etc.
  • Each Camera Unit can optionally run two cameras at once. There are two encoders in the camera so with two cameras selected their video streams can be encoded at the same time, with their output streams delivered through the Ethernet port EP-1.
  • the camera can be encoding, transmitting, and storing video and events from two cameras at the same time through the single camera controller board.
  • In FIGs. 6A-6D there is illustrated a preferred client/server video playback dataflow architecture in accordance with the present SCSS security system invention.
  • the SCSS design philosophy and architecture enable uniquely efficient trick mode and off speed playback of MPEG data.
  • the Futuretel SmartCam MPEG2 based digital video surveillance system is a client/server based live video streaming system with multi-channel DVR (Digital Video Recording) and playback. In order to support video surveillance applications, it provides the following playback features:
  • Figure 6A depicts one view ofthe SmartCam camera, SmartServer server and SmartViewer client architecture for describing trick mode, off speed and reverse playback of MPEG digital Video and audio captured by the SmartCam.
  • the SmartCam video and audio is digitized and compressed in the MPEG encoder, then buffered in the encoder streamer buffer and sent to the network, which has its own network driver buffer.
  • the network read has some internal buffering, followed by a write buffer and a dynamic I/O server scheduler to optimize disk I/O.
  • the File Store Hard drives I/O performance is improved when large blocks of data can be written and read as a single contiguous unit. If the disk heads need to seek to access many small data chunks, performance is significantly reduced.
  • Disk write and read buffers of about 2MByte per channel are found to work best on better-performing, currently available IDE RAID0 and RAID5 configurations.
  • 2MByte of buffering at an average data rate of 2.2Mbps gives about 7 seconds of buffering.
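The 7-second figure quoted above follows from simple arithmetic: buffer bits divided by stream bit rate. The short calculation below reproduces it; the exact interpretation of "2MByte" (decimal here) is an assumption.

```python
def buffer_seconds(buf_bytes, rate_bps):
    """Seconds of stream a buffer can hold: bits stored / bits per second."""
    return buf_bytes * 8 / rate_bps

# 2 MByte buffer at the nominal 2.2 Mbps channel rate: about 7 seconds.
secs = buffer_seconds(2 * 1000 * 1000, 2.2e6)
```

The same function shows why buffer length must scale with channel rate: at the 14 Mbps upper rate the same 2 MByte holds barely over a second of stream.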
  • the dynamic server scheduler provides another incremental gain for SCSS performance over previous systems.
  • I/O requests issued to the OS file system are treated without regard to the criticality of response time from the point of view of the application submitting the data.
  • the OS generally has its own strategy about the order in which it will schedule an I/O request it receives.
  • the OS scheduler in the file system may divert some applications' I/O requests into a temporary memory buffer while it sends a later-arriving I/O request to the disk, because a built-in disk efficiency strategy orders the sequence in which I/O requests are scheduled to the disk.
  • On the SmartViewer client, there is a network read buffer, a decoder input buffer, and decoded video and audio frame output buffers.
  • One of the most powerful capabilities of the SCSS system is the ability to pause live video and quickly play back video from the SmartServer storage to within 10 seconds of real time. Since the data path from camera through the buffering, hard disk and network is about 30 seconds, it is necessary to provide a disk I/O pipeline bypass buffer in addition to a normal disk write buffer.
  • a disk write buffer will be sized to allow accumulation of a chunk of input data without overflow while the previously read chunk of data is written to disk.
  • This buffer allows the SmartViewer Client to request data from the server that has not yet been written to the File Store hard disk, or is not yet available for read back from the hard disk. It is a memory buffer that accumulates the already-written contents of the disk write buffer for a period of time long enough for the normal disk path to take its place, usually about 30 seconds.
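The bypass buffer just described can be sketched as a time-bounded deque of recently captured chunks. This is a simplified illustration, not the patent's implementation: class and method names are invented, and the 30-second depth stands in for the configured pipeline latency.

```python
from collections import deque

class BypassBuffer:
    """Holds recently captured chunks until the normal disk path catches up."""
    def __init__(self, depth_seconds=30):
        self.depth = depth_seconds
        self.chunks = deque()          # (capture_timestamp, data) pairs

    def push(self, ts, data):
        """Retain a chunk as it enters the disk write path."""
        self.chunks.append((ts, data))
        # Chunks older than the pipeline depth are now readable from disk,
        # so the memory copy can be dropped.
        while self.chunks and ts - self.chunks[0][0] > self.depth:
            self.chunks.popleft()

    def read(self, ts):
        """Serve a near-real-time read the disk cannot satisfy yet."""
        for t, data in self.chunks:
            if t >= ts:
                return data
        return None                    # fall back to the normal disk path
```

A client pausing live video and replaying the last few seconds would hit this buffer first, and only reads older than its depth would go to the File Store.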
  • the Camera unit can run SmartCam and SmartServer; the Server Unit can run SmartServer and SmartViewer; the Client Unit can run SmartViewer.
  • the combination of the pipeline bypass (write buffer extension) buffer invention and the smart scheduler/manager invention enables improved multi-channel streaming performance and enhanced near-real-time display performance.
  • the SCSS system can be scaled up to many Camera Units each running their own SmartCam module, a stand-alone Server Unit running SmartServer and many Client Units running their own SmartViewer modules, communicating with each other over the network.
  • the SmartCam DSS is interposed between the Linux OS and the Linux I/O scheduler to intercept all SmartCam application disk access commands (i.e., I/O requests), place them in a queue while they are pending, and schedule and manage them.
  • Any SmartCam disk I/O command, of which there are about 40 in Linux (e.g., reading/writing, directory, change attributes, change ownership), that is, any command that accesses the disk, is supported by DSS instead of the normal Linux I/O process.
  • DSS intercepts SmartCam application disk access commands (i.e., I/O requests) instead of the Linux I/O scheduler: particularly, e.g., write data (for recording video camera MPEG transport stream data) or read data (to recover stored video frame data for the SmartViewer to display). DSS requires three additional parameters with an SCSS I/O request: Priority Class, Initial Priority Level and a Pointer to a Priority Level priority routine. If these are not supplied by an I/O request, then DSS will assign default values (pre-determined by the System Administrator or equivalent).
  • Priority Class i. A fixed priority for each class of I/O request.
  • Priority Class value for an I/O request is fixed throughout the life ofthe I/O request, until it is removed from the DSS Queue.
  • Priority Level i.
  • the calling application assigns an initial Priority Level to the I/O request when passed to the DSS queue manager.
  • Priority Level is a dynamic priority value evaluated by a Priority Level priority routine.
  • the Priority Level priority routine is a routine identified by a Priority Level priority routine parameter passed to the DSS as part of the I/O request.
  • the calling application assigns the Priority Level priority parameter when the I/O request is passed to the DSS.
  • Different Priority Level priority routines can be defined by an application, so that the Priority Level for a particular I/O request can be evaluated by a different Priority Level priority routine while it is in the DSS queue. It is an option to have different Priority Level priority routines for each class and different Priority Level priority routines for different I/O requests within a class.
  • the DSS has default values for Priority Class, Initial Priority Level and a default Priority Level priority routine for I/O requests issued without a Priority Class, Initial Priority Level, or Priority Level priority routine.
  • the DSS queue manager periodically evaluates the Priority Level for each I/O request according to the specified (or default) Priority Level priority routine for that I/O request.
  • the Priority Level evaluation for each I/O request in the DSS queue is done whenever there is an I/O return from the OS file manager system.
  • the I/O manager in the DSS will then issue the I/O request with the highest priority, in the highest Priority Class, when the OS file system manager is ready to accept another I/O request.
  • the DSS by itself can be implemented in a general OS; either as a module of an application (or applications) dedicated to acting on just that (those) applications; or, as a driver within the OS, intercepting all OS I/O calls.
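The DSS queue discipline described above (fixed Priority Class, dynamic Priority Level re-evaluated on each I/O return, per-request priority routine, administrator defaults) can be sketched as follows. This is a minimal model, not the patent's code: the default aging routine and all names are assumptions.

```python
DEFAULT_CLASS = 1
DEFAULT_LEVEL = 0

def default_routine(req):
    # Assumed default: simple aging, so long-pending requests gain priority.
    return req["level"] + 1

class DSSQueue:
    def __init__(self):
        self.pending = []

    def submit(self, op, cls=DEFAULT_CLASS, level=DEFAULT_LEVEL,
               routine=default_routine):
        """Queue an I/O request with its class, initial level and routine."""
        self.pending.append({"op": op, "cls": cls, "level": level,
                             "routine": routine})

    def on_io_return(self):
        """On each I/O return, re-evaluate every Priority Level, then issue
        the highest-priority request in the highest Priority Class."""
        for req in self.pending:
            req["level"] = req["routine"](req)
        if not self.pending:
            return None
        best = max(self.pending, key=lambda r: (r["cls"], r["level"]))
        self.pending.remove(best)
        return best["op"]
```

The key property is that Priority Class never changes while a request is queued, but Priority Level can, so a viewer read can overtake an older recorder write when its routine says its deadline is closer.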
  • a preferred embodiment of the SmartCam security system includes the Dynamic I/O System Buffer Scheduler/Manager (DSSM) for optimizing disk I/O request scheduling and management, and the write buffer extension pipeline buffer for enhancing near real-time read optimization.
  • the preferred embodiment ofthe SmartCam security system operates in a Linux OS environment.
  • the SmartServer receives multiple input data streams from the remote cameras and sensors. These data streams have different data rates that fluctuate from moment to moment depending on the particular scene viewed by each camera. The range and magnitude of data rates and the large number of data streams place extraordinarily high demands on the capability of a server to properly service each channel's I/O requests without significant risk of losing important data.
  • the competition for disk I/O is not limited to the competing SmartCam remote video recorders and the requests for stored data by the SmartViewer clients. Disk I/O is also demanded by other applications that may be running on the system and by the processes of the Linux OS itself. This is compounded by the fact that standard versions of most general-purpose operating systems treat all I/O requests alike:
  • I/O requests are received in the OS by a conventional I/O scheduler module and are placed in a pipeline sequence while the OS performs housekeeping routines, such as grouping requests for efficient disk utilization, etc.
  • pending I/O requests are then serviced in first-in, first-out order with no particular priority for individual I/O requests.
  • the data associated with pending Linux I/O write requests may be buffered by temporary storage in the high-speed RAM memory of the machine running the SmartCam application and the (Linux) OS while the I/O request waits for its turn to actually write the data to the disk.
  • the present SmartCam security system invention with a dynamically optimized I/O scheduler embodiment provides the capability to execute a dynamic disk write and near real-time data buffer sizing strategy.
  • This buffer resizing strategy is preferably based on the results of periodically evaluating the number of active channels and their data rates, then assigning new buffer lengths as a result of each evaluation.
  • the natural evaluation events to choose are synonymous with the I/O resource availability event used to evaluate channel I/O priority.
  • the present SmartCam security system invention implementation has 32 recorder channels with data rates ranging from about 1.5 up to about 5.5 megabit/sec, although lower and higher data rates can be accommodated, e.g., up to 8Mbps, with corresponding adjustment to disk write and near-real-time data buffer length. Selecting the number of channels, channel data rates and buffer lengths for particular implementations of the SCSS can be done readily by persons having knowledge in computer architecture and programming.
  • the disk data write buffer size is determined by the disk write latencies and by the minimum efficient data write block selected.
  • Optimal buffer size is a function of write block size, allowable disk seeks/sec, disk write latency, disk seek latency, system I/O pipeline latency, number of expected (or allowed) pending system I/O requests, disk size, system memory capacity, etc., and is well understood by knowledgeable computer architects and programmers.
  • A small write block size means fast write latency, but more seeks/sec and thus more seek latency.
  • total disk I/O latency is the combination of the disk access latency and the OS file system I/O latency.
  • Disk access latency is a combination of seek plus track plus write delay and is a function of the block write size. If the Minimum Efficient Write Block Size is chosen, then total I/O latency includes the OS file system service delays (Linux I/O scheduler latency) in addition to the disk access latency.
  • Linux I/O scheduler latency is a function of the total number of pending (active) Linux I/O requests (read plus write).
  • Disk write buffer size should be at least twice the minimum disk write data block size, because the buffer must be able to receive incoming data at high data rates while the previous disk data write block is being transferred. At higher data rates the concern is the disk data write block transfer time: if the delay while a data write block is transferred to the OS I/O handler is long relative to the incoming data rate and the data write block length, incoming data may overrun the allowed buffer space.
  • the disk write buffer length should be at least twice the minimum disk data write block size, and preferably three times that, to allow an extra safety margin. If the disk's write buffer length is too long, then at low data rates it may take a very long time to fill the data write buffer to the point where the next write I/O request will occur. This affects the proximity of near real-time read access to current time, as explained below.
  • the disk write data block size is selected to optimize disk I/O efficiency within the constraints of maximum disk seek rates, e.g., 100 seeks per second in the preferred embodiment of the present invention.
  • the near real-time data buffer (write data buffer pipeline extension) is typically significantly longer than the disk data write buffer. Its length is selected based on the actual disk write delay for the selected disk write data block length and the hidden, variable delay introduced by the Linux file system buffer when the disk write block is held in memory due to the presence of other applications and system processes competing for I/O resources.
  • FIG. 6B shows a different detail view of the SCSS architecture to more readily explain how trick mode and off speed playback are achieved in the present client/server architecture invention.
  • Video and audio data is provided to the SmartViewer Client in separate PES (packetized elementary stream) format and is provided in integral units of GOPs (Groups of Pictures) for video and frames for audio.
  • Video and audio are sent separately for two reasons: 1) it is easier to send integral units of GOPs and audio frames when they are separated than when they are multiplexed together; 2) audio can be sent on a higher-priority channel than video, which is useful since data loss in audio is far more noticeable and unpleasant than data loss in video.
  • Requests for data from the server are in the form of XML structured messages.
  • the message format specifies the start and end time of the requested data segment as well as the stream id and an additional Mode parameter used for play speeds faster than 1.5X real time.
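An XML request of the kind described above might look like the sketch below. The patent does not give the actual element names of the SmartServer protocol, so `PlayRequest`, `StreamId`, `Start`, `End` and `Mode` are all invented for illustration.

```python
import xml.etree.ElementTree as ET

def build_request(stream_id, start, end, mode="normal"):
    """Build an illustrative XML playback request (element names assumed)."""
    req = ET.Element("PlayRequest")
    ET.SubElement(req, "StreamId").text = str(stream_id)
    ET.SubElement(req, "Start").text = start
    ET.SubElement(req, "End").text = end
    ET.SubElement(req, "Mode").text = mode   # used for speeds above 1.5x
    return ET.tostring(req, encoding="unicode")
```

Structuring the request this way keeps the control channel (TCP) self-describing while the bulk media flows separately, matching the protocol split described earlier.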
  • Playback speed optimization can also be performed by filtering out pictures that will not be displayed, on the server side.
  • In an IBP MPEG stream, B pictures are discarded first: they rely on I and P pictures for decoding, but no other pictures depend on them.
  • the high speed playback, server side filter is configured to discard unneeded picture frames according to a speed related discard algorithm or table in the SmartServer.
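The server-side discard filter can be sketched as a simple function of playback speed. The speed thresholds below are illustrative assumptions; the patent says only that the discard algorithm or table is speed-related and that B pictures go first.

```python
def filter_frames(frames, speed):
    """Drop pictures not needed at the requested playback speed.

    frames: list of picture types ('I', 'P', 'B') in stream order.
    """
    if speed <= 1.5:
        return list(frames)                     # normal range: send everything
    if speed <= 4:
        return [f for f in frames if f != "B"]  # drop B pictures first
    return [f for f in frames if f == "I"]      # very high speed: I pictures only
```

Filtering on the server side reduces network load as well as decode load, since discarded pictures never leave the SmartServer.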
  • Trick mode playback, which includes slow and fast smooth-motion playback in forward or reverse direction at any of the specified speeds, and single-frame jog-shuttle, is provided by a double-buffered decode system.
  • the SCSS double buffer decode system includes two picture frame GOP buffers and associated picture frame index buffers: one GOP buffer and a related index buffer for currently decoding frames, and another GOP buffer and related index buffer for previously decoded frames.
  • the UTC time code is maintained for each decoded frame (video and audio). Double buffering allows the mpeg decoder to decode an MPEG video GOP in the normal forward direction (the only way it can be done) while the display takes video frames from the previously decoded buffer in reverse order for reverse play. Maintaining decoded frames in memory also makes single stepping in either direction relatively simple.
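The double-buffering idea above reduces to this: decode each GOP forward (the only order MPEG permits), then emit its frames backward. A minimal sketch, with a stand-in decoder in place of the real MPEG core:

```python
def decode_gop(gop):
    # Stand-in for the MPEG decoder: forward decode yields the frames
    # of one GOP in display order.
    return list(gop)

def reverse_play(gops):
    """Yield the frames of a GOP sequence in reverse display order."""
    out = []
    for gop in reversed(gops):         # step backward GOP by GOP...
        decoded = decode_gop(gop)      # ...but decode each GOP forward
        out.extend(reversed(decoded))  # then emit its frames backward
    return out
```

In the real system the "previous" buffer holds one decoded GOP while the decoder fills the "current" buffer, so display and decode overlap instead of running sequentially as in this sketch.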
  • the Display Manager/Synchronizer manages display of the decoded frames of video and audio based on the direction of play, the ratio of real time to playback speed, the current time and the time code of the decoded frame.
  • the SmartViewer Client playback controller manages the whole process and keeps the mpeg decoder input buffer refreshed based on the direction and speed of play. It may also pause playback and signal a pause to the user through the GUI if data is not yet available from the SmartServer.
  • FIG. 6C and FIG. 6D depict a dataflow diagram for the Client/Server MPEG Playback Architecture 600 of a preferred embodiment of the SCSS invention of Fig. 1.
  • the Playback Architecture 600 is divided into two major sections: the Server Unit section 602 (representing SmartServer 102 disk file store 280, 282, 286), and the Client Unit section 604 with video display window(s) indicated as 606.
  • MPEG2 transport stream data store 608 represents a segmented sequential portion of the video, audio, and events data retained in disk file store 280, 282, 286.
  • the data sequence 608 was previously received as a continuous MPEG-2 transport stream from one of the SmartCam Camera Unit cameras 126 in Fig. 1 and Fig. 2.
  • Each one-hour segment 610 through 620 is identified by a unique identifier, e.g., 20020427.m2t, 20020427-01.m2t, and so on. These identifiers are readily translated: the segment belongs to the year 2002, April 27th, with 00 being the first hour, 01 the second hour and so forth.
  • Each segment 610-620 is a sequential string of MPEG GOPs from the camera unit 126. With this code each segment is uniquely identified by its filename.
  • An index file store 622 on the server hard disk stores index-pointer addresses 622[J]. The prefix of each index pointer address 622[J] is the same as the prefix of the corresponding GOP file segment identifier. Each index pointer address points to the location of its related one-hour GOP file segment 610 through 620.
  • File store manager 280-m generates the addresses 622 during data sequence storage in the server 102. This helps to make subsequent data retrieval faster.
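The naming scheme above encodes date and hour directly in the segment filename, so retrieval can locate a segment without reading the stream itself. A hedged parsing sketch (the hour-suffix convention is inferred from the example identifiers in the text):

```python
def parse_segment_name(name):
    """Decode a segment filename like '20020427-01.m2t' into date and hour.

    A name without an hour suffix (e.g. '20020427.m2t') is taken as hour 00.
    """
    stem = name.split(".")[0]            # drop the .m2t extension
    date, _, hour = stem.partition("-")  # split YYYYMMDD from the hour field
    return {"year": int(date[0:4]), "month": int(date[4:6]),
            "day": int(date[6:8]), "hour": int(hour or 0)}
```

Combined with the index-pointer addresses 622[J], a playback request for a given date and time maps straight to a filename and an offset, which is what makes subsequent retrieval fast.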
  • the one-hour GOP file segments do not necessarily start and end exactly on GOP boundaries. For example, segments 612 and 614 divide GOP 624 between them. This would pose insurmountable difficulties for previous MPEG playback systems.
  • the SCSS invention uses the known Linux NFS function by standard methods to manage reading of partial GOP segments such as 624 in correct order from the storage server 602, using the index pointer addresses 622[J] in the server index file 622 to select the desired GOP file segments in the required order.
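A minimal sketch of the reassembly idea (assuming byte-offset index pointers; the actual NFS-based mechanism is not spelled out in the text): treat the hour segments as one logical stream, and read a GOP that straddles a segment boundary by pulling the partial pieces from each file in chronological order.

```python
def read_span(segments, start, length):
    """Read `length` bytes at logical offset `start` from a chronological
    list of (filename, data) segments, crossing file boundaries as needed --
    the way a GOP split between two hour segments must be reassembled."""
    out = bytearray()
    base = 0
    for _name, data in segments:
        lo = max(start, base)               # clip the request to this segment
        hi = min(start + length, base + len(data))
        if lo < hi:
            out += data[lo - base:hi - base]
        base += len(data)
    return bytes(out)

# Hypothetical segment contents: "GOP2" is split across the hour boundary.
segments = [("20020427-00.m2t", b"GOP0GOP1GO"),
            ("20020427-01.m2t", b"P2GOP3GOP4")]
print(read_span(segments, 8, 4))  # b'GOP2'
```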
  • Forward, backward, still, and slow-motion playing of the stored MPEG video is performed in the architecture 604 by the combination of five primary functions: local cache control 626, transport stream local cache 628, VLC decoder core 630, decoder manager 632, and display buffers 634.
  • Decoded frames 636 are sent to display 606 by render control 638.
  • SmartCam GUI 640 provides the interface for the operator ofthe system to request desired information.
  • the local cache control 626 uses the Linux NFS function to manage reading partial GOP file segments when they occur in a requested video sequence.
  • Control 626 uses the index marker points (the .IDX GOP file index addresses) to recalculate file addresses for the local cache index 642.
  • the decoder manager 632 manages local cache control, the VLC decoder core, and render control to ensure that data is available for decoding in either the forward or reverse playing direction and that decoding is synchronized with the GUI time base.
  • Fig. 6C and 6D are primarily data flow diagrams that describe how the SCSS invention plays MPEG forwards and backwards. It is inherently difficult to play MPEG backwards because it is based on motion prediction. Basically, MPEG video takes one complete picture of a scene as a video frame (an I-frame picture) and then takes 14 more pictures (frames) of the changes from that I-frame. Because those changes are based on forward changes in time, playing backwards has previously been very tricky: one cannot go at random to any particular picture (frame) of that scene and expect to decode it in isolation in either direction.
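The dependency described above can be illustrated with a toy model (not the patent's code): each GOP starts with a complete I-frame, every later frame is stored only as a change from what came before, so showing any single frame requires decoding forward from the I-frame.

```python
def decode_frame(gop, k):
    """Toy model of MPEG prediction: gop[0] is the I-frame (a complete
    'picture', here just a number) and gop[1:] are predicted frames stored
    as deltas. Recovering frame k requires applying every delta from the
    I-frame forward -- which is why isolated or backward decoding is hard."""
    state = gop[0]
    for delta in gop[1:k + 1]:
        state += delta  # apply the predicted changes in capture order
    return state

# Frame 2 of this hypothetical GOP is only recoverable by walking forward.
print(decode_frame([10, 1, 1, 1], 2))  # 12
```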
  • This data flow mechanism helps to explain how the SCSS system plays video backwards and, of course, forwards.
  • Security cameras are never stopped; they are screening and recording continuously, 24 hours a day, 7 days a week, every week of the year, for years.
  • a motion JPEG system could not and would not store and reproduce an essentially infinitely long data file.
  • Fileservers can't handle infinitely long data files.
  • One important and interesting feature of the SCSS system invention is the way file transfer from the server to the client is done.
  • data files are separated into long chunks, e.g., one-hour sections of video. That ends up being about 2 or 3 GB, which is a reasonable size file.
  • An hour section segment is a good-sized length of video file data for current technology.
  • Each camera generates a continuous MPEG data stream of contiguous sequential GOPs.
  • the continuous sequence of GOPs is stored in sequential order (GOPs and GOP frames with incrementing addresses).
  • Synchronously and exactly every hour a one-hour GOP file section is stored in the data store 602 on the server.
  • the server stores and records the one-hour GOP file sections in the MPEG-2 transport-stream data store, with the segments identified by sequential ID numbers, i.e., 20020427-00.m2t, 20020427-01.m2t, 20020427-02.m2t, 20020427-03.m2t, 20020427-04.m2t, and so on. This is the date and the hour of the day.
  • GOPs are groups of pictures, a decodable entity; GOPs consist of standard MPEG data: basically an I-frame followed by a group of predicted frames, P and B.
  • the most likely condition for the segmented data file in data store 608 is to have many fragmented GOPs like GOP (N+2), where part of GOP (N+2) is in file section 612, as GOP (N+2b), and part is in section 614, as GOP (N+2e).
  • On the client side 604 we want to avoid making the decoder client, in this case the VLC decoder core 630, deal with fragmented GOPs, e.g., GOP (N+2) having arbitrary boundaries between two hour-long GOP file sections.
  • In order to do this in forward play, when the client requests a particular time period (video data sequence) to be played, the server first copies the client's requested data sequence from the data store 608 into the MPEG-2 transport stream local cache 628 in the forward direction. It appends successive GOPs from the server 102 (GOP n+7, indicated by the tail of arrow 640) to the end of the local cache 628 (GOP n+7, indicated by the point of arrow 642).
  • the MPEG-2 TS local cache 628 could be a fairly large file (maybe a 10 or 15 GB data file) that represents several hours of video on the local hard drive but is not too large for practical storage devices.
  • This is a big, contiguous, linear file, suitable for the VLC decoder 630, which demuxes and decodes the MPEG-2 video and audio data stream to YUV frame buffers and audio frames (display buffers 634) for subsequent presentation by Render Control 638.
  • Render Control 638 is a conventional function that receives decoded frames from buffer 634 in the conventional continuous stream for displaying images on a SmartCam display (window or windows) 606 through a standard Decoded Frame Write-To-Buffer path indicated by arrow 636.
  • the cache 628 is read by the VLC decoder core 630 in the time-forward direction indicated by arrow 650, i.e., incrementing GOP addresses. This is consistent with forward play, when GOPs are appended to the file end. Time is indicated as increasing in the direction of the arrow 650. Thus the file 628 grows as data is added or appended to its end at 642.
  • the decoder 630 reads from the file 628 a little bit behind in time (indicated by the tail of arrow 652 at GOP n+5, which was appended before GOP n+7 at 642).
  • the appended data is shown as the incomplete GOP n+7, and the data being read by the decoder is indicated by the "forward play read pointer" arrow from GOP n+5.
  • Reading from the server 102 to the cache 628 is much faster than reading from the cache 628 to the decoder 630, so the appended data 642 is always available to the decoder. In other words, the decoder cannot catch up with the data being appended to the file, because the data from the server is coming over a gigabit network 112. Thus, the decoder core 630 never runs out of data.
  • Playing forward is not too difficult: we read this data into the cache 628 in the direction it was first recorded and stored in the server, the VLC 630 decodes in the forward direction, and the data is presented to the decoder 630 in the forward direction from the local cache 628 (arrow 652).
  • Another system function, typically known in server systems, will close the playback down and then restart the process if at some point the local cache file 628 gets too big. That may only happen every 10 hours or so, so it is unlikely to be noticeable. If it is, the user will only see a slight delay, perhaps half a second, while the local cache 628 is closed and the decode process begins again.
  • Reverse play is trickier.
  • the system must read files forward, the direction in which they are written, from the nature of disk operation. So the SCSS system provides a mechanism for delivering the GOP file sequence to the decoder with the GOP files in reverse sequential order but with each GOP's coded frame data in its original chronological order. This enables the decoder to chronologically decode each GOP into its constituent set of completely decoded, chronologically ordered picture frames, each individually ready to present for display in isolation, although not yet ready to be displayed in reverse chronological order, since each GOP's frame set remains in forward order relative to the reversed GOP sequence order.
  • Process 660 reads GOP N (and its coded frame data) into a pre-allocated local cache file 628, by a process indicated by arrow 660.
  • Process 660 is called "reverse play, write into pre-allocated file" and writes at the point indicated by the arrowhead of 660.
  • This is a known Linux file system process (NFS) and is critical to this operation working. In Linux one can pre-allocate the file and start writing at the beginning of the file location rather than the end.
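The pre-allocation idea can be sketched with ordinary Python file operations (a simplification; the patent relies on NFS-served Linux files, and these helper names and the toy GOP bytes are assumptions): create the file at full size up front, then write reverse-play GOPs starting at the front while readers still see a forward-readable file.

```python
import os
import tempfile

def preallocate(path, size):
    """Create a file of `size` bytes up front (sparse on most Unix
    filesystems), so later writes can land at any offset -- including
    the beginning, which reverse-play writing depends on."""
    with open(path, "wb") as f:
        f.truncate(size)

def write_at(path, offset, data):
    """Write `data` at `offset` without disturbing the rest of the file."""
    with open(path, "r+b") as f:
        f.seek(offset)
        f.write(data)

path = os.path.join(tempfile.mkdtemp(), "cache.m2t")
preallocate(path, 12)
write_at(path, 0, b"GOP3")  # newest GOP written at the front of the file
write_at(path, 4, b"GOP2")  # older GOP follows it; the tail stays zeroed
print(open(path, "rb").read())  # b'GOP3GOP2\x00\x00\x00\x00'
```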
  • control 626 is constantly filling the pre-allocated file 628 with data from the server, and the client (decoder 630), again, is reading slightly behind (in time) the arrowhead 660 insert point.
  • the insert point is the head of the "reverse play" arrow, at the incomplete GOP n.
  • Consider the sequence of the GOPs stored in the local cache in reverse play vs. forward play: rather than being placed in time-sequential order in both cases, they are placed in forward time sequence in forward play and in reverse time sequence in reverse play.
  • the diagram shows the GOP sequence in reverse time order and the decoded frames in forward time order. If the display buffers are read in forward time, then for reverse play the decoded frames must also be in reverse time sequence, in addition to the reverse time sequence of the GOPs. In other words, in reverse play GOP n+3 should come before GOP n+2, and within each GOP decoded frame B14 should precede decoded frame B13, ..., decoded frame B1, decoded frame I0.
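The required ordering can be sketched as a pure reordering step (illustrative only, not the SmartViewer code): reverse the GOP sequence, and within each GOP reverse the fully decoded frames before display.

```python
def reverse_display_order(gops):
    """gops: a chronological list of GOPs, each a chronological list of
    fully decoded frames. For reverse play the GOP order is reversed
    (n+3 before n+2), and the frames inside each GOP are then reversed
    too (B14 before B13 ... before I0)."""
    return [list(reversed(frames)) for frames in reversed(gops)]

# Two toy GOPs of three frames each, numbered in capture order.
print(reverse_display_order([[0, 1, 2], [3, 4, 5]]))  # [[5, 4, 3], [2, 1, 0]]
```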
  • the local cache control uses the NFS file system management to read partial file chunks from the storage server based on Index Marker Points in a Server Index File.
  • the Index Files Store contains the beginning and ending index addresses for each GOP in the MPEG-2 Transport-Stream Data Store.
  • the Index File Store contains index files (n-n.idx) corresponding to each hour-long segment (n-n.m2t).
  • the index files contain the starting address and ending address (GOP-begin and GOP-end) for each of the GOPs in the segment. These starting and ending addresses are used by the forward play and reverse play pointers in reading from the server data and writing to the transport-stream local cache.
  • one hour-long file segment has time stamp 20020427-00.m2t.
  • the corresponding index file has index file number 20020427-00.IDX and contains the starting and ending addresses GOP-begin and GOP-end for each of the GOPs in that segment.
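As a sketch with an assumed plain-text layout (the patent does not specify the .idx byte format, and these offsets are made up), each index line could carry one GOP's begin and end addresses:

```python
def parse_idx(text):
    """Parse assumed 'GOP-begin GOP-end' lines into (begin, end) pairs --
    the addresses the forward- and reverse-play read pointers would use
    to pull whole GOPs out of an hour-long segment."""
    pairs = []
    for line in text.splitlines():
        if line.strip():  # skip blank lines
            begin, end = line.split()
            pairs.append((int(begin), int(end)))
    return pairs

idx = "0 18811\n18812 37623\n37624 56435\n"  # hypothetical byte offsets
print(parse_idx(idx)[1])  # (18812, 37623)
```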
  • the Index File Cache Recall computes and stores index addresses into the Local Cache Index in the client.
  • the Decode Manager in the client manages Local Cache Control, VLC, and Render Control to ensure that data is available for decode by the VLC Decoder Core in either the forward or reverse direction, and that the decode is synchronized with the Graphic User Interface time base.
  • the Decode Manager has communications links to the Graphic User Interface, the Local Cache Control, the Local Cache Index, the Render Control, and the VLC Decoder Core.
  • the VLC Decoder Core in the client demuxes and decodes MPEG-2 video and audio (from the GOP file provided by the forward play or reverse play read pointer, e.g., GOP N+2 in reverse play and GOP N+5 in forward play) to YUV frame buffers and audio frames, e.g., Decoding Frame B2 (selected by the Decoded Frame Write-to-Buffer pointer) in the Display Buffers.
  • the display buffers are an instance of the double buffering that is fairly common in display applications. We decode 15 frames from GOP n+2 into one buffer (indicated by the decoded-frame-write-to-buffer arrow pointing to the decoded frames in the GOP n+2 buffer) while we display GOP n+3's decoded frames from the other buffer (the GOP n+3 display buffer). The fully decoded frames are played or displayed from the two buffers alternately: at the end of the frames in one buffer we switch to the other buffer and play through those frames. So we alternately store and display two full GOPs of video by using the double buffer.
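A minimal ping-pong buffer sketch (illustrative, not the SmartViewer code; class and method names are assumed): the decoder fills one buffer while render control reads the other, and a swap exchanges the roles at each GOP boundary.

```python
class DoubleBuffer:
    """Two display buffers used alternately: the decoder fills one while
    the other is on display, and swap() exchanges the roles at the end
    of each GOP -- the ping-pong scheme described in the text."""
    def __init__(self):
        self._buf = [[], []]
        self._fill = 0  # index of the buffer the decoder writes into

    def fill(self, frames):
        self._buf[self._fill] = list(frames)

    def swap(self):
        self._fill ^= 1

    def display(self):
        return self._buf[self._fill ^ 1]  # the buffer not being filled

db = DoubleBuffer()
db.fill(["n+2 frames"])  # decode GOP n+2 into buffer 0
db.swap()                # GOP n+2 goes on display...
db.fill(["n+3 frames"])  # ...while GOP n+3 decodes into buffer 1
print(db.display())      # ['n+2 frames']
```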
  • the key points of the SCSS invention are: dividing the contiguous captured video into separated sections of moderate length by having a storage mechanism, e.g. the server, segment the video into manageable chunks, e.g. one-hour sections; then recovering and re-storing a selected sequence of the sections (for display in either forward or reverse play) and joining the sections in the original sequential order in another storage medium, e.g. the local cache, to eliminate problems with GOPs divided at section boundaries and to enable storage and retrieval of essentially indefinitely long video files.
  • Another key point is using inverse incrementing of recovered GOPs before decode in reverse play, in order to meet the requirement that disks read forward, since they have to be read forward to feed the decoder MPEG GOP data in proper frame order.
  • Another key point is to have the data rates between reading from the server and writing to the local cache sufficiently different that the server read always leads the cache writes.
  • Yet another key point in reverse play is reordering the decoded GOP frames so they are also in reverse chronological order, in line with the reverse chronological order of the GOPs.
  • Still another key point is the double buffering between decode and display, to allow continuous reads of the decoded frames from one GOP buffer while the frames of the next GOP to be displayed are being decoded and stored in the other GOP buffer.
  • Double buffers are known, for example in TiVo™ video recorders, but those only generate a fairly short single file of definite length for an entire TV program that they record, say two hours or four hours long. We are dealing with an essentially indefinite-length video stream.
  • the double-buffered, inverse-incrementing, segmented file/server organization is a way to deal with that.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Signal Processing (AREA)
  • Human Computer Interaction (AREA)
  • Library & Information Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Closed-Circuit Television Systems (AREA)

Abstract

The technical field of this invention generally concerns networked security systems with numerous autonomous, remote surveillance units and multiple monitoring stations allowing interactive access to system alarms, events, and stored video data (Figure 1). In particular, a video/event data file server (102) comprises a random-access data storage/archiving subsystem (A/V sub-block) and an event-data storage/archiving subsystem (event-data sub-block) for storing time-stamped, motion-compensated compressed video data and time-stamped events (e.g. alarms, accesses). In response to commands from the monitoring stations (110), the video file server (102) transmits compressed audio/video and event data to the monitoring stations (110) over a network, or receives time-stamped, motion-compensated compressed video data and time-stamped events from the surveillance units.
PCT/US2003/003076 2003-01-20 2003-01-31 Systeme de securite pour video numerique animee adaptative mpsg (scss) WO2004068855A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/US2003/003076 WO2004068855A1 (fr) 2003-01-20 2003-01-31 Systeme de securite pour video numerique animee adaptative mpsg (scss)
AU2003210799A AU2003210799A1 (en) 2003-01-20 2003-01-31 Mpeg adaptive motion digital video (scss) security system

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
USFTEL001 2003-01-20
PCT/US2003/003076 WO2004068855A1 (fr) 2003-01-20 2003-01-31 Systeme de securite pour video numerique animee adaptative mpsg (scss)

Publications (1)

Publication Number Publication Date
WO2004068855A1 true WO2004068855A1 (fr) 2004-08-12

Family

ID=32823173

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2003/003076 WO2004068855A1 (fr) 2003-01-20 2003-01-31 Systeme de securite pour video numerique animee adaptative mpsg (scss)

Country Status (2)

Country Link
AU (1) AU2003210799A1 (fr)
WO (1) WO2004068855A1 (fr)


Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6166729A (en) * 1997-05-07 2000-12-26 Broadcloud Communications, Inc. Remote digital image viewing system and method


Cited By (35)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2006021434A1 (fr) * 2004-08-24 2006-03-02 Macrosystem Digital Video Ag Dispositif de surveillance de securite par imagerie
WO2006068463A1 (fr) * 2004-12-24 2006-06-29 Ultrawaves Design Holding B.V. Traitement d'image reparti intelligent
WO2006114353A1 (fr) * 2005-04-25 2006-11-02 Robert Bosch Gmbh Procede et systeme de traitement de donnees
US9278283B2 (en) 2005-06-24 2016-03-08 At&T Intellectual Property I, L.P. Networked television and method thereof
US20070188608A1 (en) * 2006-02-10 2007-08-16 Georgero Konno Imaging apparatus and control method therefor
WO2008071482A1 (fr) * 2006-12-14 2008-06-19 Robert Bosch Gmbh Afficheur pour représenter une évolution
CN101563923B (zh) * 2006-12-14 2012-01-11 罗伯特·博世有限公司 用于显示变化曲线的显示
US8929712B2 (en) 2006-12-14 2015-01-06 Robert Bosch Gmbh Display for displaying progress
EP2186326A1 (fr) * 2007-09-10 2010-05-19 Thomson Licensing Lecture vidéo
US11854578B2 (en) 2008-04-06 2023-12-26 Axon Enterprise, Inc. Shift hub dock for incident recording systems and methods
US10269384B2 (en) 2008-04-06 2019-04-23 Taser International, Inc. Systems and methods for a recorder user interface
US11386929B2 (en) 2008-04-06 2022-07-12 Axon Enterprise, Inc. Systems and methods for incident recording
US10872636B2 (en) 2008-04-06 2020-12-22 Axon Enterprise, Inc. Systems and methods for incident recording
US10446183B2 (en) 2008-04-06 2019-10-15 Taser International, Inc. Systems and methods for a recorder user interface
US10354689B2 (en) 2008-04-06 2019-07-16 Taser International, Inc. Systems and methods for event recorder logging
EP2333735A1 (fr) * 2009-12-09 2011-06-15 Honeywell International Inc. Filtrage d'événements vidéo dans une zone sécurisée utilisant un couplage desserré dans un système de sécurité
EP2341710A3 (fr) * 2009-12-29 2012-09-05 Funkwerk plettac electronic GmbH Procédé de transmission sécurisée de signaux vidéo de plusieurs sources vidéo par le biais d'un réseau vers plusieurs moniteurs
CN106463030A (zh) * 2014-02-28 2017-02-22 泰科消防及安全有限公司 应急摄像机系统
US10268485B2 (en) 2014-02-28 2019-04-23 Tyco Fire & Security Gmbh Constrained device and supporting operating system
US10289426B2 (en) 2014-02-28 2019-05-14 Tyco Fire & Security Gmbh Constrained device and supporting operating system
US10297128B2 (en) 2014-02-28 2019-05-21 Tyco Fire & Security Gmbh Wireless sensor network
WO2015130903A1 (fr) * 2014-02-28 2015-09-03 Hall Stewart E Système de caméra vidéo d'urgence
US10379873B2 (en) 2014-02-28 2019-08-13 Tyco Fire & Security Gmbh Distributed processing system
US11747430B2 (en) 2014-02-28 2023-09-05 Tyco Fire & Security Gmbh Correlation of sensory inputs to identify unauthorized persons
US10854059B2 (en) 2014-02-28 2020-12-01 Tyco Fire & Security Gmbh Wireless sensor network
US9851982B2 (en) 2014-02-28 2017-12-26 Tyco Fire & Security Gmbh Emergency video camera system
US10878323B2 (en) 2014-02-28 2020-12-29 Tyco Fire & Security Gmbh Rules engine combined with message routing
US10402221B2 (en) 2014-12-30 2019-09-03 Tyco Fire & Security Gmbh Preemptive operating system without context switching
US9910701B2 (en) 2014-12-30 2018-03-06 Tyco Fire & Security Gmbh Preemptive operating system without context switching
US10631032B2 (en) 2015-10-15 2020-04-21 At&T Mobility Ii Llc Dynamic video image synthesis using multiple cameras and remote control
US11025978B2 (en) 2015-10-15 2021-06-01 At&T Mobility Ii Llc Dynamic video image synthesis using multiple cameras and remote control
US10129579B2 (en) 2015-10-15 2018-11-13 At&T Mobility Ii Llc Dynamic video image synthesis using multiple cameras and remote control
US11164601B2 (en) 2016-01-20 2021-11-02 Vivint, Inc. Adaptive video playback
WO2019035391A1 (fr) * 2017-08-17 2019-02-21 Sony Corporation Serveur, procédé, support lisible par ordinateur non transitoire et système
US11689697B2 (en) 2018-04-27 2023-06-27 Shanghai Truthvision Information Technology Co., Ltd. System and method for traffic surveillance

Also Published As

Publication number Publication date
AU2003210799A1 (en) 2004-08-23

Similar Documents

Publication Publication Date Title
WO2004068855A1 (fr) Systeme de securite pour video numerique animee adaptative mpsg (scss)
US10497234B2 (en) Monitoring smart devices on a wireless mesh communication network
JP6753902B2 (ja) ビデオソースデバイスからストリーム配信されるデータの格納管理
EP1855482A2 (fr) Surveillance vidéo avec accès de communication satellite
US8842179B2 (en) Video surveillance sharing system and method
US8427552B2 (en) Extending the operational lifetime of a hard-disk drive used in video data storage applications
US20060171453A1 (en) Video surveillance system
US7859571B1 (en) System and method for digital video management
CA2656826C (fr) Dispositif integre de capture multimedia
US8160129B2 (en) Image pickup apparatus and image distributing method
US20100097464A1 (en) Network video surveillance system and recorder
JP4426780B2 (ja) 映像の記録再生システムおよび記録再生方法
US7821533B2 (en) Wireless video surveillance system and method with two-way locking of input capture devices
US20060070109A1 (en) Wireless video surveillance system & method with rapid installation
WO2006046234A2 (fr) Systeme et appareil de surveillance multi media
US20060070108A1 (en) Wireless video surveillance system & method with digital input recorder interface and setup
US7508418B2 (en) Wireless video surveillance system and method with DVR-based querying
US20060072757A1 (en) Wireless video surveillance system and method with emergency video access
KR20040106964A (ko) 실시간 동영상 원격 감시 시스템
KR101211229B1 (ko) 사용자 요청에 의한 cctv 카메라의 선택적 원격 제어, 동적 그룹화 모니터링 시스템 및 그 방법
US10440310B1 (en) Systems and methods for increasing the persistence of forensically relevant video information on space limited storage media
JP2003244683A (ja) 遠隔監視システム及びプログラム
JP2002262272A (ja) ディジタル監視カメラシステムおよび制御装置
Solutions Video
JP4755710B2 (ja) 映像監視システム

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NO NZ OM PH PL PT RO RU SC SD SE SG SK SL TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LU MC NL PT SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established
122 Ep: pct application non-entry in european phase
NENP Non-entry into the national phase

Ref country code: JP

WWW Wipo information: withdrawn in national office

Country of ref document: JP