GB2408166A - Monitoring system - Google Patents

Monitoring system

Info

Publication number
GB2408166A
GB2408166A (application GB0425267A)
Authority
GB
United Kingdom
Prior art keywords
controller
processor
video
input
media
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
GB0425267A
Other versions
GB0425267D0 (en)
Inventor
Frank Roche
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
HOMENET COMM Ltd
Original Assignee
HOMENET COMM Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by HOMENET COMM Ltd filed Critical HOMENET COMM Ltd
Publication of GB0425267D0 publication Critical patent/GB0425267D0/en
Publication of GB2408166A publication Critical patent/GB2408166A/en
Withdrawn legal-status Critical Current

Classifications

    • GPHYSICS
    • G08SIGNALLING
    • G08BSIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B13/00Burglar, theft or intruder alarms
    • G08B13/18Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
    • G08B13/189Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
    • G08B13/194Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
    • G08B13/196Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
    • G08B13/19654Details concerning communication with a camera
    • G08B13/19656Network used to communicate with a camera, e.g. WAN, LAN, Internet
    • GPHYSICS
    • G08SIGNALLING
    • G08BSIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B13/00Burglar, theft or intruder alarms
    • G08B13/18Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
    • G08B13/189Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
    • G08B13/194Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
    • G08B13/196Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
    • G08B13/19639Details of the system layout
    • G08B13/19645Multiple cameras, each having view on one of a plurality of scenes, e.g. multiple cameras for multi-room surveillance or for tracking an object by view hand-over
    • GPHYSICS
    • G08SIGNALLING
    • G08BSIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B13/00Burglar, theft or intruder alarms
    • G08B13/18Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
    • G08B13/189Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
    • G08B13/194Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
    • G08B13/196Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
    • G08B13/19654Details concerning communication with a camera
    • G08B13/19658Telephone systems used to communicate with a camera, e.g. PSTN, GSM, POTS
    • GPHYSICS
    • G08SIGNALLING
    • G08BSIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B13/00Burglar, theft or intruder alarms
    • G08B13/18Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
    • G08B13/189Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
    • G08B13/194Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
    • G08B13/196Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
    • G08B13/19665Details related to the storage of video surveillance data
    • G08B13/19671Addition of non-video data, i.e. metadata, to video stream
    • G08B13/19673Addition of time stamp, i.e. time metadata, to video stream
    • GPHYSICS
    • G08SIGNALLING
    • G08BSIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B13/00Burglar, theft or intruder alarms
    • G08B13/18Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
    • G08B13/189Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
    • G08B13/194Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
    • G08B13/196Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
    • G08B13/19695Arrangements wherein non-video detectors start video recording or forwarding but do not generate an alarm themselves
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/18Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N7/181Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a plurality of remote sources

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Library & Information Science (AREA)
  • Signal Processing (AREA)
  • Telephonic Communication Services (AREA)

Abstract

A media controller 1 is connected to up to eight cameras 2, microphones 3, and sensors 4. The controller 1 communicates wirelessly via GSM with remote computers 5 and mobile devices 6 via a mobile network 7. The controller 1 has very low power consumption because it spends most of its time in an idle state. It is activated into an active state by the sensors 4, which are of the type used in the alarm industry to detect proximity, such as IR and magnetic sensors. Inputs from the sensors 4 cause the controller 1 to raise events which cause the controller to activate. An example is an IR sensor sensing a person at an illegal dump site. Upon being activated, the controller 1 starts recording an image stream from at least one camera and transmits some frames to a subscribed PC via a wireless network.

Description

"A Monitoring System"
INTRODUCTION
The invention relates to audio and visual monitoring of activities for purposes such as remote monitoring, alarm activated event monitoring, remote surveillance, general security, and audio/visual information services delivered to PCs and audio/video enabled mobile devices.
It is known to use monitoring cameras in closed circuit TV systems for such purposes. However, there is a requirement for a person to watch the camera images and take the appropriate actions.
The invention is therefore directed towards providing a monitoring system which allows more versatility and less human time input.
SUMMARY OF THE INVENTION
According to the invention, there is provided a media controller comprising a media interface to at least one camera, a network interface for communication with remote systems, and a processing means for uploading media to remote systems via a network.
In one embodiment, the network interface transmits media wirelessly via a mobile network.
In another embodiment, the processing means maintains an idle state with low power consumption until a trigger is received, and captures media upon receipt of a trigger.
In a further embodiment, the processing means directs upload of a short media stream via the network interface and captures a longer media stream to a storage device.
In one embodiment, the media is video, and the short stream comprises a number of frames of video.
In another embodiment, the controller further comprises a sensor interface, and the processing means treats detection of a configured activity by a sensor as an event to trigger an upload.
In a further embodiment, the sensor interface receives inputs from a proximity sensor, and the processing means treats such an input as an event.
In one embodiment, the processing means treats an instruction from a remote system via the network interface as an event.
In another embodiment, the media interface comprises a DSP media stream input processor.
In a further embodiment, an input processor is connected to a content stream interface.
In one embodiment, the input processor dynamically monitors the input rate from the content stream interface and the transfer rate to the network interface, and increases activity if the transfer rate becomes lower.

In another embodiment, the input processor performs video and audio compression and buffering.
In a further embodiment, the network interface comprises an output processor, the input and output processors access a shared RAM, and the input processor determines the input rate from interrupts to the RAM, and determines the transfer rate from accesses by the output processor to the RAM.
In one embodiment, the output processor dynamically monitors activity of the network interface, and transfers network interface operations to the input processor if congestion occurs in the network interface.
In another embodiment, the output processor monitors network conditions before streaming a next session, and the input processor dynamically adjusts audio/video encoding accordingly.
In a further embodiment, the processing means comprises an encoder and a session control processor, in which the session control processor comprises means for dynamically choosing a frame frequency suited to available bandwidth of each session.
In one embodiment, the session control processor comprises means for dividing I-frame frequency to choose frame frequency.
In another embodiment, the session control processor comprises means for including or omitting P frames to choose frame frequency.
In a further embodiment, the session control processor maintains in memory a table associated with each of a plurality of session bandwidth levels, and dynamically writes frames for each session to a relevant table.
In one embodiment, the session control processor maintains a record associated with each session in each table, each record being indexed on a session address.
In another embodiment, the session address is an IP address.
In a further embodiment, the processing means further comprises a video stream logic circuit for performing pre-processing of a video stream.
In one embodiment, the video stream logic circuit writes pre-processed data to a random access memory shared with a host processor.
In another embodiment, the video stream logic circuit and the host processor are both connected to the RAM by shared data and control buses.
In a further embodiment, the video stream logic circuit attempts RAM access only after generating a pre-set level of pre-processed data.
In one embodiment, said preset level is 32 pixels.
In another embodiment, the video stream logic circuit continues access to the RAM until all data is transferred.
In a further embodiment, the video stream logic circuit performs a RAM write within twelve cycles.
In one embodiment, the pre-processing comprises allocating the incoming data into segments in odd and even streams, and interleaving said streams.
In another embodiment, the video stream logic circuit comprises means for the conversion of video 4:2:2 formatted data into 4:2:0 formatted data.
In a further embodiment, the video stream logic circuit converts interleaved raster scan into progressive raster scan data.
In one embodiment, an event monitor framework of the host processor sends a notification using email, SMS or pager in response to detection of an event.
In another embodiment, the event monitor framework allows association between a recipient and one or more input zones.
In a further embodiment, the event monitor framework allows association between an input zone and one or more cameras.
In one embodiment, the event monitor framework allows association between an input zone and one or more audio channels.
In another embodiment, the processing means operates in an idle power-saving mode, and activates to an operation mode in response to a detected event.
DETAILED DESCRIPTION OF THE INVENTION
Brief Description of the Drawings
The invention will be more clearly understood from the following description of some embodiments thereof, given by way of example only with reference to the accompanying drawings in which:

Fig. 1 is a schematic diagram illustrating a monitoring system of the invention;
Fig. 2 is a block diagram of the circuit of a controller of the system;

Figs. 3 to 6 are more detailed diagrams of the circuit.
Description of the Embodiments
Referring to Fig. 1, a media controller 1 is connected to up to eight cameras 2, microphones 3, and sensors 4. The controller 1 communicates wirelessly via GSM with remote computers 5 and mobile devices 6 via a mobile network 7.
The controller 1 has very low power consumption as it is most of the time in an idle state. It is activated to enter an active state by the sensors 4. The sensors 4 are of the type used in the alarm industry to detect proximity, such as IR and magnetic sensors.
Inputs from the sensors 4 cause the controller 1 to raise events which cause the controller to activate. An example is an IR sensor sensing a person at an illegal dump site. Upon being activated, the controller 1 starts recording an image stream from at least one camera, the camera(s) being selected according to the sensor which detects the event. The controller 1 automatically transmits, via the mobile network, the first three video frames, and possibly also a sound clip, depending on the configuration. The media "clip" is transmitted to a configured recipient as an alert for decision-making. The controller 1 proceeds to capture a prolonged media stream of, say, ten minutes duration. The parameters for both the "clip" which is uploaded wirelessly and the stored stream are configurable.
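The trigger flow just described can be sketched as follows. This is a minimal illustrative sketch, not the patent's implementation: the function and parameter names (`on_sensor_event`, `grab_frames`, and so on) and the sensor-to-camera mapping are assumptions introduced for the example.

```python
# Hypothetical sketch of the event-triggered capture flow described above;
# all names are illustrative, not from the patent.

CLIP_FRAMES = 3            # frames uploaded immediately as the alert "clip"
RECORD_SECONDS = 600       # prolonged capture, e.g. ten minutes

# Configurable mapping: which cameras each sensor activates (assumed layout)
SENSOR_CAMERAS = {"ir_gate": [0, 1], "magnetic_door": [2]}

def on_sensor_event(sensor_id, grab_frames, send_alert, record):
    """Wake from idle, alert a recipient, then record a longer stream.

    grab_frames(cam, n) returns n frames from camera cam;
    send_alert(frames) uploads the short clip over the mobile network;
    record(cam, seconds) captures the prolonged stream to storage.
    """
    selected = SENSOR_CAMERAS.get(sensor_id, [])
    clip = []
    for cam in selected:
        clip.extend(grab_frames(cam, CLIP_FRAMES))
    if clip:
        send_alert(clip)                 # first frames sent as the alert
    for cam in selected:
        record(cam, RECORD_SECONDS)      # full stream to local storage
    return selected
```

Both the clip length and the recording duration are configurable in the described system; the constants above merely stand in for that configuration.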
In one embodiment, the controller performs image processing to identify a target object, and only continues with recording the media stream if the target object is detected. The target objects are stored as references in an object library, the references being defined by colour and/or pattern.
The recorded media streams may be later uploaded via a broadband link or physically removed in a storage cartridge.
The controller 1 may upload short media clips according to different criteria such as at timed intervals of a full recording. Thus, the triggers to both "wake up" and for transmitting short media clips may be inputs from the sensors 4, or messages received from the wireless interface, and/or triggers driven by a timer.
The controller 1 is contained in a weatherproof case which may be easily hidden.
For maintenance or insertion/removal of storage cartridges the case may be opened.
Because the controller is most of the time in the idle state, power consumption is low and it may be battery-powered for long periods.
Time stamps for captured video images are obtained over the Internet from a select number of public time servers using the NTP protocol. This achieves verifiable time stamping.
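The NTP timestamps used for this kind of time stamping count seconds from 1 January 1900 in a 64-bit fixed-point format. The sketch below shows how such a timestamp could be parsed from a standard 48-byte NTP response; it is an offline illustration (the UDP exchange with a time server on port 123 is omitted), and the function names are not from the patent.

```python
# Sketch of NTP timestamp handling for verifiable time stamping, as
# described above. Offline only: no network query is performed here.

NTP_UNIX_DELTA = 2208988800  # seconds between the NTP epoch (1900-01-01)
                             # and the Unix epoch (1970-01-01)

def ntp_to_unix(seconds, fraction):
    """Convert a 64-bit NTP timestamp (32-bit seconds + 32-bit fraction)
    to Unix time as a float."""
    return (seconds - NTP_UNIX_DELTA) + fraction / 2**32

def parse_transmit_timestamp(packet):
    """Extract the transmit timestamp from a 48-byte NTP response packet
    (bytes 40-47, big-endian), returned as Unix time."""
    secs = int.from_bytes(packet[40:44], "big")
    frac = int.from_bytes(packet[44:48], "big")
    return ntp_to_unix(secs, frac)
```

Querying several public servers and cross-checking the results, as the description suggests, would make the recorded time stamps independently verifiable.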
The cameras may be placed for applications such as remote monitoring or covert surveillance, automatic number plate recognition, alarm-activated remote surveillance, or traffic monitoring.
The users do not need to continuously watch images from all cameras, and instead view images selected according to the trigger criteria. Notification of these trigger events can additionally be through a pager device or via SMS. This monitoring can be from any location with Public Switched Telephone Network access, GSM/GPRS/HSCSD/EDGE mobile networks and Internet/Intranet access.
Referring to Fig. 2, a controller control circuit 10 comprises a digital signal processor (DSP) 12, which receives an input content stream from a content stream interface 13.
The DSP 12 is connected to a random access memory (RAM) 14 for use during content processing. The circuit 10 also comprises a host processor 15, which also performs content processing. The host processor ("host") 15 is connected to a read only memory (ROM) 16, to a random access memory (RAM) 17, to a wide area network (WAN) interface 18, and to a local area network (LAN) interface 19. The interfaces 18 and 19 are used for output of content via the relevant network to the destination address, and either or both of the interfaces 18 and 19 may be activated according to the application.
An alarm control panel 20 is used to connect to the sensors 4 for the purpose of generating events. These events drive video and/or audio recording and notifications through email, SMS, or pager.
In more detail, referring to Fig. 3 the host 15 comprises a hardware layer 21, a software platform of layers 22 to 26, and an applications layer of components 27 to 36.
In the software platform, hardware device drivers 22 are separated from upper layers by a hardware abstraction layer 23. This allows modularity of hardware drivers and hence of the underlying hardware. A kernel 24 is in this embodiment that of the Linux operating system, and choice of kernel is allowed by a software abstraction layer 25. A network layer 26 comprises TCP/UDP/IP communication protocols.
At the applications level the host 15 comprises a media streaming server component 27, and the following other components: audio encode 28, video encode 29, data encode 30, accounting 31, IKE/IPSEC 32, load balance 33, event monitoring framework 34, media application framework 36, and various communication protocol components 35, namely RTSP, SNMP, SMTP, HTTP, DHCP, and FTP components.
The DSP 12 is illustrated in more detail in Fig. 4. The components are, in order upwardly: hardware 50, device drivers 51, DSP BIOS 52, processes 53, MP4x processes 54, DSP API 55, and a DSP multimedia application program 56.
It will be appreciated that the circuit 10 is an embedded real time audio and video encoder and media streaming server having real time embedded operating system and content processing capability. In this embodiment the content handled is audio and video content and the circuit 10 may be deployed in various contexts including: delivering recording of event triggered audio/video content using email attachments to PCs, mail servers, or mobile terminals, delivering video and audio content in real time via the Internet/Intranet, and delivering media services over wireless using GPRS/EDGE or UMTS mobile networks.
For real time streaming, audio and video content is received at the interface 13 and is pre-processed by the DSP 12. This pre-processing includes encoding and the DSP RAM 14 is used for caching data during processing and for executing applications that are downloaded to this RAM during an initialization period. The encoded content is routed to the host 15, which performs final processing and controls output via one of the interfaces 18 or 19.
Preparation of an output content message is performed at the TCP or UDP and IP levels by the host 15 using the host RAM 17, and at the data link level by the relevant interface 18 or 19. In more detail, the media server component 27 of the host media application framework 36 writes all messages to the host RAM 17. The host 15 writes one message at a time to the host RAM 17 for a particular item of content to be delivered to multiple IP addresses. Instead of generating a fresh packet stream for all destinations, the host 15 simply changes the destination field and the checksum at the IP level in all of the output messages. The relevant interface 18 or 19 then converts down to the data link level and transmits the packet streams to the relevant destinations.
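The destination-rewriting step can be illustrated as follows. For an IPv4 packet, changing the destination address invalidates the header checksum, so it must be recomputed (the RFC 1071 one's-complement sum); a full recompute is shown here for clarity, though an incremental update would also work. This is an illustrative sketch, not the patent's code.

```python
# Sketch of rewriting only the IPv4 destination field and checksum when
# fanning one encoded stream out to multiple recipients, as described above.

import struct

def ipv4_checksum(header):
    """RFC 1071 one's-complement checksum over a header whose checksum
    field is zeroed. Returns 0 when run over an already-valid header."""
    total = 0
    for i in range(0, len(header), 2):
        total += (header[i] << 8) | header[i + 1]
    while total >> 16:                      # fold carries back in
        total = (total & 0xFFFF) + (total >> 16)
    return ~total & 0xFFFF

def retarget(header, new_dst):
    """Return a copy of a 20-byte IPv4 header addressed to new_dst
    (4 bytes), with the checksum field (bytes 10-11) recomputed."""
    h = bytearray(header)
    h[16:20] = new_dst                      # destination address field
    h[10:12] = b"\x00\x00"                  # zero checksum before computing
    h[10:12] = struct.pack("!H", ipv4_checksum(h))
    return bytes(h)
```

Because only two fields change per recipient, this is far cheaper than re-encoding or re-packetising the stream for each destination address in the table.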
The DSP 12, the content stream interface 13, and the RAM 14 are operated as an audio/video sub-system ("DSP system") operating in tandem with the host 15. The DSP 12 dynamically monitors the camera input content rate by monitoring interrupts from the interface 13. The DSP 12 also monitors the output rate from the DSP system by monitoring messages from the host 15 indicating that encoded data has been taken out of the RAM 14. If the output rate becomes significantly less than the input rate, the DSP 12 dynamically increases its activity by increasing the number of P pictures and/or increasing the quantisation factor and/or changing the frames-per-second rate. In this embodiment the level of audio and video compression and buffering by the DSP 12 is increased or decreased as appropriate.
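The rate-monitoring behaviour can be sketched as a small control step. The thresholds, step sizes, and the ordering of the three adjustments (more P pictures, coarser quantisation, then lower frame rate) are illustrative assumptions, not figures from the patent.

```python
# Hypothetical sketch of the DSP's rate adaptation described above: when
# the drain (output) rate falls significantly below the camera input rate,
# compression is increased. All thresholds are illustrative.

def adapt_encoder(settings, input_rate, output_rate, slack=0.9):
    """Return updated encoder settings given measured rates (bytes/s).

    settings holds 'p_per_i' (P pictures per I picture), 'quant'
    (quantisation factor), and 'fps' (frames per second).
    """
    s = dict(settings)
    if output_rate < slack * input_rate:
        # Drain is lagging: compress harder, in order of visual cost.
        if s["p_per_i"] < 24:
            s["p_per_i"] += 1          # more P pictures between I pictures
        elif s["quant"] < 31:
            s["quant"] += 1            # coarser quantisation
        elif s["fps"] > 5:
            s["fps"] -= 1              # finally, reduce the frame rate
    return s
```

Run once per monitoring interval, this keeps the encoded bit rate tracking whatever the host can actually take out of the shared RAM.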
In addition, the host 15 monitors the output rates of the WAN and LAN interfaces 18 and 19 respectively. This is achieved by determining the number of active sessions and the bit rate supported for each session, determining the number of lost packet reports from the network, and determining the rate at which the media streaming socket buffers are being filled and emptied.
The media server 27 resides on the host 15, as do the kernel 24 and the Web server applications 34. Therefore, the host 15 has a priori knowledge to adjust the DSP operations to provide an optimum encoded data stream for the prevailing network conditions before a user session is started. The host can do this because it knows from the web server what content has been accessed and is being requested, it knows from the kernel the number of supported sessions and it knows from the network protocols the present traffic and the undelivered or errored packets on the network.
An advantageous function of the DSP 12 is to encode the incoming video content stream to generate I frames and P frames. The choice of frequency of I frames has heretofore been dependent on network quality. For example, a good network may only require one in 10 frames to be an I frame.

If there is not much motion in the video content then one frame in 25 may be an I frame; if there is a lot of motion in the video content then one frame in 5 may be an I frame.
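The example figures above (one I frame in 25, 10, or 5 frames) suggest a simple interval selection, sketched below. The motion metric and its thresholds are assumptions for illustration only.

```python
# Sketch of choosing an I-frame interval from a motion estimate, using the
# example figures in the text above. Thresholds are illustrative.

def i_frame_interval(motion, low=0.2, high=0.6):
    """Map a normalised motion estimate in [0, 1] to an I-frame interval."""
    if motion < low:
        return 25   # little motion: sparse I frames suffice
    if motion < high:
        return 10   # typical case on a good network
    return 5        # high motion: frequent I frames

def frame_types(n_frames, interval):
    """Label a stream of n_frames as 'I' or 'P' for the chosen interval."""
    return ["I" if i % interval == 0 else "P" for i in range(n_frames)]
```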
The DSP 12 encodes the incoming audio and video content according to one of the following criteria: fixed frame rate, fixed bit rate, or fixed quality level. The frames are transferred to the host 15. Referring to Fig. 5, the host 15 establishes a table in the RAM 17 associated with each of a number of session bandwidth levels.
In one embodiment, a Table 1 is associated with a bandwidth level to support 28 Kbps, a Table 2 with a bandwidth level to support 56 Kbps, and a Table 3 with a bandwidth level to support 128 Kbps.
All tables have a maximum capacity of 100 session rows, each row being capable of storing frames for a time period of 1 second.
As the encoded frames from the DSP are received at the host 15 they are linked to a particular session. The relevant table for the session is determined, and the relevant number of frames are written to the associated row in the table. In this embodiment, no P frames and only every fourth I frame are written if the session is in Table 1. If in Table 2, all I frames but no P frames are written. If in Table 3, all frames are written.
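The per-table selection rules described above can be sketched as a filter over the encoded frame stream. The frame representation as `(type, payload)` tuples is an assumption for illustration.

```python
# Sketch of the per-bandwidth frame filtering described above: Table 1
# (28 Kbps) keeps only every fourth I frame and no P frames, Table 2
# (56 Kbps) keeps all I frames but no P frames, Table 3 (128 Kbps) keeps
# everything. Frames are ('I'|'P', payload) tuples.

def filter_for_table(frames, table):
    """Select which encoded frames are written to a session's table row."""
    out, i_count = [], 0
    for kind, payload in frames:
        if kind == "I":
            if table == 1:
                if i_count % 4 == 0:          # only every fourth I frame
                    out.append((kind, payload))
                i_count += 1
            else:                             # Tables 2 and 3: all I frames
                out.append((kind, payload))
        elif kind == "P" and table == 3:      # P frames only at 128 Kbps
            out.append((kind, payload))
    return out
```

One encoded stream thus serves sessions at all three bandwidth levels, each receiving only the frames its link can carry.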
Delivery of output frames from the tables is performed as follows. The host transfers the packet of information to the WAN/LAN communication device, and while the packet of information is in the communication device it will be forwarded as per the IP address designations held in the relevant table.
It will be appreciated that the host 15 achieves optimum use of the diverse available bandwidths for the different sessions.
A single processor may, in another embodiment, perform both encoding and output session control.
Referring to Fig. 6, the hardware of the circuit 10 is shown in more detail. In the input interface 13 a video decoder is connected to a video stream logic (VSL) circuit and an audio encoder/decoder is connected to the DSP 12. The VSL is in turn connected to a DSP data bus and to a DSP control bus providing access to the DSP RAM. The DSP 12 also uses the buses to access its RAM.
Connection of the DSP 12 to the host (CPU) 15 is illustrated. The host 15 performs downstream processing for onward routing of real time audio and video data to the remote devices.
The VSL of the input interface 13 is programmed to monitor the incoming 4:2:2 formatted video data stream and allocate the monochrome (Y) and colour (Cb/r) data into separate contiguous segments from odd and even fields of incoming video data.
The Y luminance data is supplied into a FIFO and when more than 32 pixels of data is in the FIFO, during an active line of video, the VSL will request the DSP RAM from the DSP 12, and transfer data when access to the DSP RAM is granted. At the end of an active line of video the VSL will request the DSP RAM from the DSP 12 to empty the Y FIFO before the next line of active video starts.
The incoming Cr and Cb colour data each have separate FIFOs and when more than 32 pixels of data is in the FIFO, the VSL will request the DSP RAM from the DSP 12, and transfer data when access to the DSP RAM is granted. At the end of an active line of video the VSL will request the DSP RAM from the DSP 12 to empty the Cr/Cb FIFOs before the next line of video starts.
The VSL monitors the number of lines of video in the incoming video stream and if the number of lines of video is equal to or less than 288, the VSL will support 50 fields per second. If the number of lines of video is greater than 288 lines of video the VSL will support 25 frames per second.
The VSL decimates the incoming 4:2:2 formatted video data into 4:2:0 formatted data for onward processing. The VSL interleaves the odd and even fields of video data when the number of lines of video is greater than 288, and does not interleave the odd and even fields when the number is equal to or less than 288 lines, and writes the data to the DSP RAM. This action converts interlaced video formatted data into progressive video formatted data. The VSL is configured to perform the thirty-two pixel data write in twelve clock cycles, the clock frequency being up to 100 MHz. Thus, the DSP 12 is locked out for only a relatively short duration, being typically 3% of the time on average.
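Two of these VSL pre-processing steps can be shown in miniature: 4:2:2 carries chroma on every line while 4:2:0 carries it on every second line, so the decimation drops alternate chroma lines; and weaving the two fields line-by-line yields a progressive frame. This is a software sketch of hardware behaviour, with frames modelled as lists of rows; the function names are not from the patent.

```python
# Sketch of two VSL pre-processing steps described above. Frames are
# modelled as lists of pixel rows for illustration.

def decimate_422_to_420(cb_rows, cr_rows):
    """4:2:2 has chroma on every line; 4:2:0 keeps every second line."""
    return cb_rows[::2], cr_rows[::2]

def weave_fields(even_field, odd_field):
    """Interleave the two fields of an interlaced frame line-by-line,
    producing a progressive frame (even field holds lines 0, 2, 4, ...)."""
    frame = []
    for even_row, odd_row in zip(even_field, odd_field):
        frame.append(even_row)
        frame.append(odd_row)
    return frame
```

Doing both conversions before the data reaches the DSP is what frees the compression algorithms from format handling, as the following passages note.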
Advantageous aspects of operation of the DSP 12 are as follows. The DSP is locked out typically 3% of the time as opposed to 15% in traditional implementations. The incoming interlaced video data stream is converted into a progressive video data stream. This allows more time for the DSP compression algorithms to run and not have to do the interlaced to progressive video formatting.
The incoming interlaced 4:2:2 video data stream is converted into progressive 4:2:0 video data for the DSP algorithm. This allows more time for the DSP compression algorithms to run and not have to do the 4:2:2 to 4:2:0 video formatting.
The Incoming Interlaced video data stream is converted into separate contiguous Y/Cr/Cb blocks of memory. This allows more time for the DSP compression algorithms to run and not have to convert the video data into 4:2:0 contiguous memory blocks.
The VSL supplies three separate Y/Cr/Cb blocks in memory so that while one block is being filled by the VSL, another block can be emptied by the DSP. This allows more time for the DSP compression algorithms to run and not have to wait for the VSL to fill a block of memory.
Referring again to Figs. 2 and 3, the alarm control panel 20 consists of 11 input zones and 4 outputs. Inputs are programmable for level and edge detection operation.
The event monitor framework 34 handles events detected on the input zones. The framework 34 is capable of generating a number of consequential actions. These actions include, but are not limited to, the following.
A recording of audio and video associated with the event is initiated.

An email notification identifying the event is sent to one or more pre-defined recipients. An email with an attached multimedia file is sent to one or more pre-defined recipients.

An SMS message identifying the event is sent to one or more pre-defined recipients.

A pager notification identifying the event is sent to one or more pre-defined recipients.
The framework 34 allows configuration of all zones and outputs. Specifically, each input zone can be associated with 0 to 8 cameras and/or 0 to 8 audio channels. Each zone can be set for normally closed or normally open input events. Additionally, each zone has an associated activation sensitivity tolerance to prevent spurious activation.
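A zone configuration of this shape can be sketched as follows. Interpreting the "activation sensitivity tolerance" as a debounce count (a number of consecutive active samples before an event fires) is an assumption for illustration, as are all names below.

```python
# Hypothetical sketch of the zone configuration described above: each
# input zone maps to 0-8 cameras and 0-8 audio channels, has a normally
# open/closed setting, and a sensitivity tolerance (modelled here as a
# debounce count).

def make_zone(cameras=(), audio=(), normally_open=True, tolerance=2):
    assert len(cameras) <= 8 and len(audio) <= 8
    return {"cameras": list(cameras), "audio": list(audio),
            "normally_open": normally_open, "tolerance": tolerance,
            "active_count": 0}

def sample_zone(zone, level_high):
    """Feed one input sample; return True when a debounced event fires."""
    triggered = level_high if zone["normally_open"] else not level_high
    zone["active_count"] = zone["active_count"] + 1 if triggered else 0
    if zone["active_count"] >= zone["tolerance"]:
        zone["active_count"] = 0
        return True
    return False
```

A single glitch on the input is absorbed by the tolerance, which is the spurious-activation protection the passage refers to.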
In addition, the framework 34 provides triggering of events from audio inputs.
Specifically the event can be generated when the audio level exceeds a pre-defined threshold.
In addition, the framework 34 allows configuration of multiple recipients for alarm events. Each recipient can register for event notification from one or more of the input zones. This notification can be via email, SMS, or pager.
In another embodiment the framework 34 can use the zone inputs to interface with external equipment. In this instance external equipment will activate selected input zones.
In addition, the event monitor uses a power management scheme to preserve power when operating from a battery. All peripheral components are turned off in the absence of an event, and the main processors are placed in a suspended state. On the occurrence of an event the main processors are removed from their suspended state and power is made available to peripheral components. An advantageous aspect of this operation is that the event monitor 34 can operate in remote locations for long periods of time without human intervention.
In addition, the event monitor 34 can periodically indicate its operational status through text or video-based status notifications. This notification can be sent at a user-configurable interval. In a further embodiment the event monitor allows configuration so that events at certain times are ignored.
Advantageous aspects of this operation are that the event monitor 34 can conserve power by operating only when required to do so.
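The power-saving behaviour above, combined with the configurable ignore periods, amounts to a simple wake decision: an event inside an ignore window is discarded and the unit stays suspended. A sketch at hour granularity (the granularity and function name are illustrative assumptions):

```python
def should_wake(event_hour, ignore_windows):
    """Decide whether an event should wake the suspended processors.
    `ignore_windows` is a list of (start_hour, end_hour) pairs during
    which events are configured to be ignored."""
    return not any(start <= event_hour < end for start, end in ignore_windows)
```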
The invention is not limited to the embodiments described but may be varied in construction and detail. -

Claims (37)

  1. A media controller comprising a media interface to at least one camera, a network interface for communication with remote systems, and a processing means for uploading media to remote systems via a network.
  2. A controller as claimed in claim 1, wherein the network interface transmits media wirelessly via a mobile network.
  3. A controller as claimed in claim 1 or 2, wherein the processing means maintains an idle state with low power consumption until a trigger is received, and captures media upon receipt of a trigger.
  4. A controller as claimed in claim 3, wherein the processing means directs upload of a short media stream via the network interface and captures a longer media stream to a storage device.
  5. A controller as claimed in claim 4, wherein the media is video, and the short stream comprises a number of frames of video.
  6. A controller as claimed in any of claims 3 to 5, further comprising a sensor interface, and the processing means treats detection of a configured activity by a sensor as an event to trigger an upload.
  7. A controller as claimed in claim 6, wherein the sensor interface receives inputs from a proximity sensor, and the processing means treats such an input as an event.
  8. A controller as claimed in any of claims 3 to 7, wherein the processing means treats an instruction from a remote system via the network interface as an event.
  9. A controller as claimed in any preceding claim, wherein the media interface comprises a DSP media stream input processor.
  10. A controller as claimed in claim 9, wherein the input processor is connected to a content stream interface.
  11. A controller as claimed in claim 10, wherein the input processor dynamically monitors the input rate from the content stream interface and the transfer rate to the network interface, and increases activity if the transfer rate becomes lower.
  12. A controller as claimed in claim 11, wherein the input processor performs video and audio compression and buffering.
  13. A controller as claimed in claim 12, wherein the network interface comprises an output processor, the input and output processors access a shared RAM, and the input processor determines the input rate from interrupts to the RAM, and determines the transfer rate from accesses by the output processor to the RAM.
  14. A controller as claimed in claim 13, wherein the output processor dynamically monitors activity of the network interface, and transfers network interface operations to the input processor if congestion occurs in the network interface.
  15. A controller as claimed in claim 13 or 14, wherein the output processor monitors network conditions before streaming a next session, and the input processor dynamically adjusts audio/video encoding accordingly.
  16. A controller as claimed in any preceding claim, wherein the processing means comprises an encoder and a session control processor, in which the session control processor comprises means for dynamically choosing a frame frequency suited to the available bandwidth of each session.
  17. A controller as claimed in claim 16, wherein the session control processor comprises means for dividing the I frame frequency to choose the frame frequency.
  18. A controller as claimed in claim 16 or 17, wherein the session control processor comprises means for including or omitting P frames to choose the frame frequency.
  19. A controller as claimed in any of claims 16 to 18, wherein the session control processor maintains in memory a table associated with each of a plurality of session bandwidth levels, and dynamically writes frames for each session to a relevant table.
  20. A controller as claimed in claim 19, wherein the session control processor maintains a record associated with each session in each table, each record being indexed on a session address.
  21. A controller as claimed in claim 20, wherein the session address is an IP address.
  22. A controller as claimed in any preceding claim, wherein the processing means further comprises a video stream logic circuit for performing pre-processing of a video stream.
  23. A controller as claimed in claim 22, wherein the video stream logic circuit writes pre-processed data to a random access memory shared with a host processor.
  24. A controller as claimed in claim 23, wherein the video stream logic circuit and the host processor are both connected to the RAM by shared data and control buses.
  25. A controller as claimed in any of claims 22 to 24, wherein the video stream logic circuit attempts RAM access only after generating a pre-set level of pre-processed data.
  26. A controller as claimed in claim 25, wherein said pre-set level is 32 pixels.
  27. A controller as claimed in any of claims 22 to 26, wherein the video stream logic circuit continues access to the RAM until all data is transferred.
  28. A controller as claimed in any of claims 22 to 27, wherein the video stream logic circuit performs a RAM write within twelve cycles.
  29. A controller as claimed in any of claims 22 to 28, wherein the pre-processing comprises allocating the incoming data into segments in odd and even streams, and interleaving said streams.
  30. A controller as claimed in any of claims 22 to 29, wherein the video stream logic circuit comprises means for the conversion of video 4:2:2 formatted data into 4:2:0 formatted data.
  31. A controller as claimed in any of claims 22 to 30, wherein the video stream logic circuit converts interleaved raster scan data into progressive raster scan data.
  32. A controller as claimed in any preceding claim, wherein an event monitor framework of the host processor sends a notification using email, SMS or pager in response to detection of an event.
  33. A controller as claimed in claim 32, wherein the event monitor framework allows association between a recipient and one or more input zones.
  34. A controller as claimed in claim 32 or 33, wherein the event monitor framework allows association between an input zone and one or more cameras.
  35. A controller as claimed in any of claims 32 to 34, wherein the event monitor framework allows association between an input zone and one or more audio channels.
  36. A controller as claimed in any of claims 6 to 35, wherein the processing means operates in an idle power-saving mode, and activates to an operation mode in response to a detected event.
  37. A controller substantially as described with reference to the drawings.
GB0425267A 2003-11-17 2004-11-17 Monitoring system Withdrawn GB2408166A (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
IE20030857 2003-11-17

Publications (2)

Publication Number Publication Date
GB0425267D0 GB0425267D0 (en) 2004-12-15
GB2408166A true GB2408166A (en) 2005-05-18

Family

ID=33524004

Family Applications (1)

Application Number Title Priority Date Filing Date
GB0425267A Withdrawn GB2408166A (en) 2003-11-17 2004-11-17 Monitoring system

Country Status (1)

Country Link
GB (1) GB2408166A (en)


Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2064189A (en) * 1979-11-09 1981-06-10 Ascotts Ltd Surveillance System
GB2256771A (en) * 1991-06-11 1992-12-16 Cable Srl Slow-scanning remote surveillance system using mobile cellular radio communications
CA2228679A1 (en) * 1998-02-04 1999-08-04 Gridzero Technologies Inc. Surveillance systems
DE19913841A1 (en) * 1999-03-27 2000-09-28 Hansjoerg Klein System to protect boats, holiday homes, kiosks, churches, museums or galleries; has hidden unit with movement sensor and camera and remote control receiver with monitor to display images
US6271752B1 (en) * 1998-10-02 2001-08-07 Lucent Technologies, Inc. Intelligent multi-access system
US6385772B1 (en) * 1998-04-30 2002-05-07 Texas Instruments Incorporated Monitoring system having wireless remote viewing and control
US20020057342A1 (en) * 2000-11-13 2002-05-16 Takashi Yoshiyama Surveillance system
WO2002065420A1 (en) * 2001-02-12 2002-08-22 The Johns Hopkins University Commandable covert surveillance system
GB2377068A (en) * 2001-06-26 2002-12-31 Intamac Systems Ltd Security system with remote communication
US20030102967A1 (en) * 2001-11-30 2003-06-05 Eric Kao Wireless solar energy burglar-proof device


Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR2902080A1 (en) * 2006-06-13 2007-12-14 Airbus France Sas Aircraft monitoring device for use in aeronautic field, has management module passing device from waiting to working modes by controlling supply of acquisition module by supply source when signal passes from monitoring to waiting state
US8866909B2 (en) 2006-06-13 2014-10-21 Airbus Operations Sas Device and method for the surveillance of an aircraft
GB2442049A (en) * 2006-07-26 2008-03-26 Joseph Farrell-Dillon Proximity sensor activated camera
US8441350B2 (en) 2009-09-10 2013-05-14 Sony Corporation Apparatus and method for operation of a display device to provide a home security alarm
EP2779134A3 (en) * 2013-03-15 2016-08-10 MivaLife Mobile Technology, Inc. Security system power management
WO2015000085A3 (en) * 2013-07-02 2015-02-26 Unsworth Andrew Baillie A system for activating a recording device to capture the reaction of a recipient, subject, and/or sender during a media or gift exchange
WO2020091949A1 (en) * 2018-10-30 2020-05-07 Carrier Corporation Performance-guaranteed channel access control for security alarm and image sensors

Also Published As

Publication number Publication date
IE20040765A1 (en) 2005-06-29
GB0425267D0 (en) 2004-12-15

Similar Documents

Publication Publication Date Title
US6185737B1 (en) Method and apparatus for providing multi media network interface
US8259816B2 (en) System and method for streaming video to a mobile device
US20110119716A1 (en) System and Method for Video Distribution Management with Mobile Services
EP1825672B1 (en) Method and apparatus for controlling a video surveillance display
US7075567B2 (en) Method and apparatus for controlling a plurality of image capture devices in a surveillance system
US7636340B2 (en) System and method for communicating over an 802.15.4 network
US20070024705A1 (en) Systems and methods for video stream selection
CN110121114B (en) Method for transmitting stream data and data transmitting apparatus
US20050099492A1 (en) Activity controlled multimedia conferencing
US20190261007A1 (en) Adaptive encoding in security camera applications
US10547376B2 (en) System and method for communicating over an 802.15.4 network
US20070185989A1 (en) Integrated video surveillance system and associated method of use
US20110126250A1 (en) System and method for account-based storage and playback of remotely recorded video data
CN103141089A (en) System and method for controllably viewing digital video streams captured by surveillance cameras
US20140328578A1 (en) Camera assembly, system, and method for intelligent video capture and streaming
US20080313683A1 (en) Moving image communication device, moving image communication system and semiconductor integrated circuit used for communication of moving image
CN109089131B (en) Screen recording live broadcast method, device, equipment and storage medium based on IOS system
GB2408166A (en) Monitoring system
IE84148B1 (en) A monitoring system
EP1624695A1 (en) Controlling the distribution of a video stream at different frame rates to various recipients
WO2006119576A1 (en) Method and system for transmitting video to a mobile terminal
WO2018084991A1 (en) Image quality management
KR20140072668A (en) Network camera server and method for processing video stream thereof
Arun et al. Innovative solution for a telemedicine application
JP4261229B2 (en) Method of distributing monitoring data in network type monitoring device

Legal Events

Date Code Title Description
WAP Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1)