US20060210110A1 - Monitoring device - Google Patents

Monitoring device

Info

Publication number
US20060210110A1
Authority
US
United States
Prior art keywords
image
monitoring device
camera
areas
camera device
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US10/549,227
Other languages
English (en)
Inventor
Ralf Hinkel
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Mobotix AG
Original Assignee
Mobotix AG
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Mobotix AG filed Critical Mobotix AG
Assigned to MOBOTIX AG reassignment MOBOTIX AG ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: BORCHERS, KLAUS, HINKEL, RALF
Publication of US20060210110A1 publication Critical patent/US20060210110A1/en
Priority to US12/286,129 priority Critical patent/US7801331B2/en
Abandoned legal-status Critical Current

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00 Details of television systems
    • H04N5/222 Studio circuitry; Studio devices; Studio equipment
    • H04N5/262 Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects; Cameras specially adapted for the electronic generation of special effects
    • H04N5/2624 Studio circuits for obtaining an image which is composed of whole input images, e.g. splitscreen
    • H04N7/00 Television systems
    • H04N7/18 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N7/181 Closed-circuit television [CCTV] systems for receiving images from a plurality of remote sources
    • G PHYSICS
    • G08 SIGNALLING
    • G08B SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B13/00 Burglar, theft or intruder alarms
    • G08B13/18 Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
    • G08B13/189 Actuation by interference with heat, light, or radiation of shorter wavelength using passive radiation detection systems
    • G08B13/194 Actuation using passive radiation detection systems using image scanning and comparing systems
    • G08B13/196 Actuation using passive radiation detection systems using image scanning and comparing systems using television cameras
    • G08B13/19602 Image analysis to detect motion of the intruder, e.g. by frame subtraction
    • G08B13/19606 Discriminating between target movement or movement in an area of interest and other non-significative movements, e.g. target movements induced by camera shake or movements of pets, falling leaves, rotating fan
    • G08B13/19639 Details of the system layout
    • G08B13/19641 Multiple cameras having overlapping views on a single scene
    • G08B13/19665 Details related to the storage of video surveillance data
    • G08B13/19667 Details related to data compression, encryption or encoding, e.g. resolution modes for reducing data volume to lower transmission bandwidth or memory requirements
    • G08B13/19678 User interface
    • G08B13/19691 Signalling events for better perception by user, e.g. indicating alarms by making display brighter, adding text, creating a sound

Definitions

  • The present invention relates to the subject matter of the preamble of the claim and thus concerns monitoring devices, in particular cameras.
  • Another known approach places a mirror system in the optical path in order to obtain a panoramic view and then corrects the image distorted by the mirror.
  • Despite this correction, the image resolution remains poor, particularly in the strongly distorted border regions, so that observation at higher resolution is no longer possible or useful.
  • The object of the present invention is to provide something novel for commercial use.
  • In a first main aspect, the present invention therefore proposes a monitoring device with a multi-camera device and an object tracking device for the high-resolution observation of moving objects, wherein the object tracking device contains an image integration device for creating a total image from the individual images of the multi-camera device and a cut-out defining device for defining, independently of the borders of the individual images, a cut-out that is to be observed at high resolution.
  • A first essential aspect of the present invention is therefore that, by arranging a multitude of individual cameras whose partial images are combined into a single total image, together with an additional device that defines which areas are observed at high resolution independently of the borders of the individual images, a fully electronic high-resolution object tracking is achieved; this tracking can, moreover, follow an object much faster than is possible with the pan-and-tilt unit of a conventional fixed camera or with a movable camera in a dome housing. Furthermore, because of the multitude of individual cameras in the multi-camera device, sensors of high but not the very highest resolution can be used, which makes a very economical design of the system possible given the present state of sensor technology.
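To make these two core elements concrete, here is a minimal, illustrative sketch (not the patent's implementation) of an image integration step that combines the individual images of a 3×3 camera array into one total image, and of a cut-out definition that ignores the borders of the individual images. The grid layout, tile size, and crop coordinates are assumptions; overlap handling is deliberately omitted here.

```python
import numpy as np

TILE_H, TILE_W = 480, 640      # assumed per-sensor resolution
GRID_ROWS, GRID_COLS = 3, 3    # camera array layout as in the example described below

def integrate(tiles):
    """Place the individual images side by side to form the total image
    (overlap correction omitted; see the calibration sketch further down)."""
    return np.vstack([np.hstack(row) for row in tiles])

def cut_out(total, x, y, w, h):
    """Return a high-resolution cut-out; x, y, w, h may cross tile borders freely."""
    return total[y:y + h, x:x + w]

if __name__ == "__main__":
    tiles = [[np.random.randint(0, 256, (TILE_H, TILE_W), dtype=np.uint8)
              for _ in range(GRID_COLS)] for _ in range(GRID_ROWS)]
    total = integrate(tiles)                           # 1440 x 1920 total image
    roi = cut_out(total, x=600, y=400, w=200, h=200)   # spans several tiles
    print(total.shape, roi.shape)
```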
  • The multi-camera device will typically comprise digital individual cameras, for example CCD and/or CMOS arrays or the like, which deliver their images in digital form.
  • With the present state of the art, even a small number of sensors, for example four sensors with 1,280 horizontal pixels each, already yields a combined resolution that can no longer be meaningfully displayed on a single monitor. For this reason it is provided that the image information of the overview image requested from a sensor, or from a group of sensors, can be transmitted by the multi-camera device at reduced resolution. This lowers the network load of the image transmission, for example with typical LAN- and/or ISDN-capable camera systems.
  • The multi-camera device can comprise data compression means for reducing the data transmission rate.
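A short, hedged sketch of this reduced-resolution transmission idea: the camera side sends a decimated overview stream and full-resolution pixels only for the cut-outs that were actually requested, which keeps the load on a LAN or ISDN link low. The decimation factor and the region format are illustrative assumptions.

```python
import numpy as np

def downsample(img, factor=4):
    """Naive decimation for the overview stream (a real camera would low-pass filter first)."""
    return img[::factor, ::factor]

def frame_for_transmission(img, requested_regions, factor=4):
    """Pack a low-resolution overview together with full-resolution cut-outs.

    requested_regions is an iterable of (x, y, w, h) rectangles in total-image
    coordinates; only these are sent at full resolution.
    """
    overview = downsample(img, factor)
    details = {(x, y, w, h): img[y:y + h, x:x + w] for (x, y, w, h) in requested_regions}
    return overview, details
```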
  • The image areas of the individual cameras will overlap at least at their borders. Otherwise parts of the observed space, even if only very narrow bands, could remain unobserved.
  • The overlap can vary from multi-camera device to multi-camera device.
  • The individual cameras can be installed without high precision, for example by gluing them in place; it then only has to be ensured that the creation of the total image takes the overlap into account correctly for the individual case.
  • Relevant calibration information can, for example, be stored in the multi-camera device in order to make precise and exact image reproduction possible, independently of the respective control station and of the information available there.
  • Self-learning software on the multi-camera device, or on a control station connected to it, can be used for this purpose.
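As an illustration of how such self-learning calibration could work, the following hedged sketch estimates the overlap between two horizontally adjacent sensors by comparing the right border strip of one image with the left part of its neighbour and choosing the shift with the smallest difference. It is a stand-in under simple assumptions (purely horizontal overlap, equal exposure), not the patent's unspecified procedure.

```python
import numpy as np

def estimate_overlap(left_img, right_img, max_overlap=120):
    """Return the estimated overlap width in pixels between two adjacent tiles."""
    best_overlap, best_score = 0, np.inf
    for ov in range(1, max_overlap):
        # Compare the rightmost `ov` columns of the left tile with the
        # leftmost `ov` columns of the right tile.
        diff = left_img[:, -ov:].astype(float) - right_img[:, :ov].astype(float)
        score = np.mean(diff ** 2)
        if score < best_score:
            best_overlap, best_score = ov, score
    return best_overlap
```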
  • Each spatial area may be observed by at least two cameras, at least in critical areas such as doors, bank counters, and the like. In this way, greater reliability of the overall system in case of a camera failure is achieved.
  • The individual cameras may differ from one another in the way they capture an image.
  • In particular, each sensor can operate with an automatic exposure appropriate to the lighting conditions of the spatial area that it observes.
  • For example, a first sensor can be aimed at a very dark, unilluminated spatial area and accordingly be adjusted to a higher sensitivity, while another sensor observes a brightly illuminated area of the same space.
  • The image integration device is preferably designed to correct such differences, preferably automatically, insofar as this does not already take place in the multi-camera device. If required, status information on the current illumination and the like can be transmitted together with the image for this correction.
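A minimal sketch of such a correction, assuming each camera reports its current gain or exposure as status information alongside its image (the field names and the simple linear model are assumptions):

```python
import numpy as np

def normalize_exposure(img, reported_gain, reference_gain=1.0):
    """Rescale pixel values so tiles captured with different gains match up
    before they are integrated into the total image."""
    corrected = img.astype(np.float32) * (reference_gain / reported_gain)
    return np.clip(corrected, 0, 255).astype(np.uint8)
```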
  • The monitoring device will typically be coupled to a total-image display whose resolution is lower than the maximum combined resolution of the individual cameras.
  • Detail cut-out display means are then preferably provided that show detailed information of certain selected areas. This detail cut-out display can be shown as an image cut-out on one and the same monitor and/or assigned to separate monitors.
  • Motion detection is possible in which pronounced image changes are detected in the cameras themselves in order to define image cut-outs of interest.
  • It is also possible for a user to focus on inherently interesting areas, that is, image cut-outs that are to be displayed as a matter of course and/or continuously and/or in greater detail, for example the area close to a bank counter or the like.
  • A corresponding signal can be passed from the camera to which the first sensor belongs to the camera to which the second sensor belongs, so that continuous observation of the interesting object at higher resolution becomes possible and/or it is avoided that objects already identified as uninteresting are followed further.
  • Sound localization can also take place: directional microphones can track the object and/or the signals of several microphones can be evaluated.
  • FIG. 1 shows a representation of a monitoring device according to the invention.
  • The monitoring device, generally denoted by 1, comprises a multi-camera device 2 and an object tracking device 3 for the high-resolution observation of moving objects, wherein the object tracking device 3 contains an image integration device 5 for creating a total image 6 from the individual images 6a, 6b, etc., of the multi-camera device, and a cut-out defining device 7 for defining, independently of the borders of the individual images, cut-outs 4a, 4b that are to be observed at high resolution.
  • In the present example, the monitoring device 1 serves to observe the public area of a banking hall and comprises two areas connected by a conventional computer network cable (LAN) 8: on the one hand the banking hall, in which the multi-camera device 2 is arranged, and on the other hand a control station area where the banking hall is observed and in which the object tracking device 3 is arranged.
  • In the present example, the multi-camera device 2 comprises an array of 3×3 individual cameras, each with a CCD array 2a, projection optics 2b, and an evaluation circuit 2c.
  • The evaluation circuits 2c are connected to one another via a bus system 8a inside the multi-camera device and to the control station through the cable 8 via corresponding plug-and-socket connections 8b.
  • A communication control unit 9 is connected to the bus 8a for controlling the communication between the individual cameras and for communicating with the control station via the cable 8.
  • The various CCD sensors 2a1, 2a2, etc. are identical to one another and represent an economical solution, as such sensors are readily available and inexpensive.
  • The sensors and optics 2a, 2b are oriented with respect to one another in such a way that the spatial areas captured by the individual cameras overlap. In the depicted layout this is illustrated by the overlap of the fields of view originating from the optics 2b1, 2b2, 2b3, shown with dash-dot, dashed, and dotted lines.
  • The optics 2b are fixed and economical.
  • As is apparent from the figure, the overlapping areas of these optics are not identical; that is, the overlap width between the fields of view belonging to 2a1 and 2a2 differs from that between 2a2 and 2a3.
  • The sensors 2a1, 2a2, 2a3, etc. can be read out with different sensitivities by their evaluation circuits 2c1, 2c2, 2c3, etc., so that a brightness difference relative to an adjacent field of view can be responded to without difficulty and an adequate sensitivity can be chosen at any time.
  • The evaluation circuit 2c is designed to condition the image data received from the sensor 2a and to reduce them as required for transmission. For disclosure purposes, reference is made to the data reduction possibilities according to the aforementioned applications.
  • The evaluation circuit 2c is moreover designed, in particular, to send image data of some parts of its sensor area with higher compression and data of other parts of the sensor area without compression over the internal bus 8a and the interface 8b onto the cable 8.
  • Each unit 2c is designed to capture motions of objects 10 within its assigned sensor field of view, to define a motion cut-out, and to determine the direction of motion. Furthermore, it is designed to report motions towards the sensor borders to the unit 9 and to receive corresponding information from it. Such motions of objects 10a, 10b are depicted in the figure by arrows 11a, 11b.
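The following is an illustrative frame-differencing sketch of what each evaluation circuit could do: detect a motion cut-out and flag when the moving object is heading for a sensor border, so that the communication control unit can be informed. The change threshold and the border margin are assumptions, not values from the patent.

```python
import numpy as np

def motion_cutout(prev, curr, thresh=25):
    """Return the bounding box (x, y, w, h) of changed pixels, or None if nothing moved."""
    changed = np.abs(curr.astype(int) - prev.astype(int)) > thresh
    ys, xs = np.nonzero(changed)
    if xs.size == 0:
        return None
    x0, x1, y0, y1 = xs.min(), xs.max(), ys.min(), ys.max()
    return int(x0), int(y0), int(x1 - x0 + 1), int(y1 - y0 + 1)

def heading_for_border(box, img_shape, margin=20):
    """True if the motion cut-out touches the margin near any sensor border,
    i.e. the object is about to leave this camera's field of view."""
    x, y, w, h = box
    height, width = img_shape
    return x < margin or y < margin or x + w > width - margin or y + h > height - margin
```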
  • The object tracking device 3 in the monitoring control station comprises a display that can, on the one hand, show a total image composed of the individual images 6 of the cameras of the multi-camera device 2, the display being connected to an image integration device 5 for receiving the total images, and that can, on the other hand, show at high resolution the cut-outs 4a, 4b, which are likewise received from the image integration device 5 and defined with the cut-out definition device 7.
  • The display device can be a conventional, high-resolution computer monitor.
  • In the total image, the individual images 6a, 6b are not displayed separately and/or isolated from one another.
  • The dash-dot line 12 serves only for visualization in the explanation of the invention.
  • The overlapping areas are represented only once, i.e. they do not appear twice, even though they are captured by several cameras.
  • The object tracking device 3 is accordingly designed in such a way that a continuous, overlap-free image results.
  • For this purpose, a stage can be provided in which the overlap area is cropped.
  • In the present example, the cut-outs 4a, 4b are placed around two moving persons 10a, 10b and, as can be seen, are displayed at higher resolution on a part of the monitor screen.
  • Display on a separate monitor is also possible.
  • In a multi-monitor system, the monitors can be connected to one and the same computer.
  • With a certain amount of local intelligence, it is also possible to route the signal for the cut-outs that are to be observed locally directly to a dedicated display.
  • In the present case, the image integration device 5 consists of a PC that is fed, via the LAN connection 8, the individual camera images at reduced resolution and with reduced data.
  • An individual camera image here means an image or, as the case may be, an image sequence.
  • This way of feeding the image integration device 5 is moreover advantageous in order to keep the data transmission rate on the network 8 low; it thus makes possible, in particular, precise remote observation via ADSL, ISDN, or the like.
  • The total image 6 is composed, as mentioned before, of the individual images 6a, 6b, etc.
  • As the cut-out definition device 7, a touch-sensitive area of the display surface initially serves, on which the total image is displayed and on which a user can mark a momentarily relevant area with a finger or a suitable object. In the depicted example, two such areas 4a and 4b are marked.
  • Alternatively or additionally, an automatic evaluation can take place in the image integration device 5 or the communication control unit 9, for example by taking into consideration image areas of individual cameras or areas of the total image that have recently changed considerably, and/or by taking into account particularly interesting or less interesting areas, such as decoration areas 6a.
  • In the case of automatic selection and/or object tracking, information concerning the areas currently to be provided at particularly high resolution can be made available on the cable 8 by the communication control unit 9.
  • The transmission cables 8 are conventional transmission cables which, however, cannot and/or should not be loaded with the data rate of the fully resolved individual images.
  • Via them, image data and/or other data such as control information, audio signals, etc., are transmitted.
  • In particular, control signals from the image integration device 5 can be transmitted concerning those areas that a user wants to display at high resolution according to a cut-out definition made with the cut-out definition device 7.
  • The communication control unit 9 is designed to evaluate the individual image data obtained from the evaluation circuits 2c, in particular with respect to the object motion detected there, in order to automatically transmit a message, when an object to be represented at high resolution crosses an image boundary, to each neighboring individual camera of the multi-camera device 2 that will capture the moving object next, given the observed motion, as represented by arrows 11a and 11b.
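A hedged sketch of this hand-over step: when a tracked object leaves one camera's field of view in a given direction, the neighbouring camera in the 3×3 array is told to expect it and to deliver its data at high resolution. The grid indexing and message format are illustrative assumptions, not the patent's protocol.

```python
GRID_ROWS, GRID_COLS = 3, 3

def neighbour(cam_row, cam_col, direction):
    """Return the grid position of the camera the object is moving towards, or None at the array edge."""
    dr, dc = {"up": (-1, 0), "down": (1, 0), "left": (0, -1), "right": (0, 1)}[direction]
    r, c = cam_row + dr, cam_col + dc
    return (r, c) if 0 <= r < GRID_ROWS and 0 <= c < GRID_COLS else None

def hand_over(cam_pos, direction, send):
    """Notify the neighbouring camera so it starts high-resolution capture of the incoming object."""
    target = neighbour(cam_pos[0], cam_pos[1], direction)
    if target is not None:
        send(target, {"event": "expect_object", "from": cam_pos, "direction": direction})

# Example: the centre camera reports an object leaving to the right.
hand_over((1, 1), "right", lambda target, msg: print(target, msg))
```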
  • The multi-camera device 2 is assembled from a multitude of identical cameras without any expensive alignment taking place during assembly. The multi-camera device 2 is then mounted in the area to be observed, in such a way that its individual cameras capture the entire spatial area of interest. After this mounting and a one-time alignment, the multi-camera device 2 is fixed so that subsequently all cameras continuously observe a fixed area. The multi-camera device 2 is now connected to the cable 8, and a learning phase is started first, in which it is ensured, through evaluation of the image signals, that objects occurring in the overlapping image areas are represented only once in the total image. This can take place in a self-learning manner with pattern recognition and/or through a calibration phase with known patterns; if necessary, calibration before mounting is also possible during the manufacturing process.
  • An observer in the control station can then make a corresponding marking on the area of the touch-sensitive monitor 7 that displays the total image 6 and thereby request high-resolution images of the persons.
  • The areas to be displayed at high resolution are requested via the cable 8 independently of the borders of the individual images, and the communication control unit 9 assigns them to the individual cameras by means of the borders of the image area that is to be transmitted at high resolution.
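To illustrate this assignment step, the following sketch maps a cut-out marked on the total image back to the individual cameras that have to deliver it at full resolution, assuming an already-calibrated regular grid without overlap (the tile sizes are the same hypothetical values as in the earlier sketch):

```python
TILE_H, TILE_W = 480, 640   # assumed per-camera image size in total-image coordinates

def cameras_for_cutout(x, y, w, h):
    """Yield (row, col) of every individual camera whose image area intersects the
    requested cut-out, together with the sub-rectangle it has to transmit."""
    for row in range(y // TILE_H, (y + h - 1) // TILE_H + 1):
        for col in range(x // TILE_W, (x + w - 1) // TILE_W + 1):
            tx0, ty0 = col * TILE_W, row * TILE_H
            rx0 = max(x, tx0) - tx0
            ry0 = max(y, ty0) - ty0
            rx1 = min(x + w, tx0 + TILE_W) - tx0
            ry1 = min(y + h, ty0 + TILE_H) - ty0
            yield (row, col), (rx0, ry0, rx1 - rx0, ry1 - ry0)

# Example: a 200x200 cut-out starting at (600, 400) touches four cameras.
for cam, rect in cameras_for_cutout(600, 400, 200, 200):
    print(cam, rect)
```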
  • A cut-out choice can be made by the user and transmitted to the multi-camera device 2, and/or an automatic choice can take place in the multi-camera device 2, in particular in a processor circuit that, if possible, implements the communication control unit 9 and the image integration stage 5 in an integrated manner.
  • An image compression with different resolution stages, dependent on position and motion, can be produced, whereby, for example, image areas identified as uninteresting in the multi-camera device 2 can be displayed blurred during magnification of the total image and/or cut-out selection, which can in particular take place on the receiver side.
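As a final illustration, here is a hedged sketch of such position-dependent resolution stages: areas marked as uninteresting are reduced to a coarse block representation before transmission and therefore appear blurred when magnified, while interesting regions stay sharp. The block size and region format are assumptions.

```python
import numpy as np

def coarse_blocks(img, block=16):
    """Replace each block by its mean value, mimicking a strongly compressed area."""
    out = img.astype(np.float32).copy()
    for y in range(0, img.shape[0], block):
        for x in range(0, img.shape[1], block):
            out[y:y + block, x:x + block] = img[y:y + block, x:x + block].mean()
    return out.astype(np.uint8)

def mixed_resolution(img, interesting_regions, block=16):
    """Keep the interesting regions sharp and coarsen everything else before transmission."""
    out = coarse_blocks(img, block)
    for (x, y, w, h) in interesting_regions:
        out[y:y + h, x:x + w] = img[y:y + h, x:x + w]
    return out
```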

Landscapes

  • Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Human Computer Interaction (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Closed-Circuit Television Systems (AREA)
  • Studio Devices (AREA)
  • Noodles (AREA)
  • Devices For Checking Fares Or Tickets At Control Points (AREA)
  • Burglar Alarm Systems (AREA)
  • Alarm Systems (AREA)
US10/549,227 2003-03-10 2004-03-10 Monitoring device Abandoned US20060210110A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US12/286,129 US7801331B2 (en) 2003-03-10 2008-09-26 Monitoring device

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
DE10310636.7 2003-03-10
DE10310636A DE10310636A1 (de) 2003-03-10 2003-03-10 Überwachungsvorrichtung
PCT/DE2004/000471 WO2004081895A1 (de) 2003-03-10 2004-03-10 Überwachungsvorrichtung

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US12/286,129 Continuation US7801331B2 (en) 2003-03-10 2008-09-26 Monitoring device

Publications (1)

Publication Number Publication Date
US20060210110A1 true US20060210110A1 (en) 2006-09-21

Family

ID=32920733

Family Applications (2)

Application Number Title Priority Date Filing Date
US10/549,227 Abandoned US20060210110A1 (en) 2003-03-10 2004-03-10 Monitoring device
US12/286,129 Expired - Fee Related US7801331B2 (en) 2003-03-10 2008-09-26 Monitoring device

Family Applications After (1)

Application Number Title Priority Date Filing Date
US12/286,129 Expired - Fee Related US7801331B2 (en) 2003-03-10 2008-09-26 Monitoring device

Country Status (5)

Country Link
US (2) US20060210110A1 (de)
EP (1) EP1614080B1 (de)
AT (1) ATE385013T1 (de)
DE (3) DE10310636A1 (de)
WO (1) WO2004081895A1 (de)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2091253A1 (de) * 2006-11-07 2009-08-19 Sony Corporation Sendeeinrichtung, videosignalsendeverfahren in einer sendeeinrichtung, empfangseinrichtung und videosignalempfangsverfahren in einer empfangseinrichtung
US20110187536A1 (en) * 2010-02-02 2011-08-04 Michael Blair Hopper Tracking Method and System
CN102176246A (zh) * 2011-01-30 2011-09-07 西安理工大学 一种多相机目标接力跟踪系统的相机接力关系确定方法
EP2659433A2 (de) * 2010-12-30 2013-11-06 Pelco, Inc. Inferenzmaschine für auf videoanalysemetadaten basierende ereigniserkennung und forensische suche

Families Citing this family (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CA2505831C (en) 2002-11-12 2014-06-10 Intellivid Corporation Method and system for tracking and behavioral monitoring of multiple objects moving through multiple fields-of-view
DE102004052980A1 (de) * 2004-10-29 2006-05-04 Divis Gmbh Verfahren zum Überwachen eines Objekts
JP2008527806A (ja) * 2005-01-03 2008-07-24 ブミー インコーポレイテッド 夜間監視のシステムおよび方法
DE602006020422D1 (de) 2005-03-25 2011-04-14 Sensormatic Electronics Llc Intelligente kameraauswahl und objektverfolgung
GB0507869D0 (en) * 2005-04-19 2005-05-25 Wqs Ltd Automated surveillance system
CN100525395C (zh) * 2005-09-29 2009-08-05 中国科学院自动化研究所 多摄像机下基于主轴匹配的行人跟踪方法
DE102006033133B3 (de) * 2006-07-18 2007-09-06 Abb Patent Gmbh Einrichtung zur Erfassung von Personen mittels Kamera, insbesondere Videotürsprechanlage
EP1953699A1 (de) * 2007-02-01 2008-08-06 Sunvision Scientific Inc. System und Verfahren zum Speichern von Bildern mit variabler Auflösung
DE102007031538A1 (de) * 2007-07-05 2009-01-08 Neomer Gmbh Verfahren und Vorrichtung zur Datenübertragung
JP4959535B2 (ja) 2007-12-13 2012-06-27 株式会社日立製作所 撮像装置
DE102008038701B4 (de) * 2008-08-12 2010-09-02 Divis Gmbh Verfahren zum Nachverfolgen eines Objektes
US9998697B2 (en) 2009-03-02 2018-06-12 Flir Systems, Inc. Systems and methods for monitoring vehicle occupants
US20130314536A1 (en) * 2009-03-02 2013-11-28 Flir Systems, Inc. Systems and methods for monitoring vehicle occupants
DE102009021974A1 (de) 2009-05-19 2011-03-03 Mobotix Ag Digitale Videokamera
FR2953088B1 (fr) * 2009-11-26 2012-03-02 Defiboat Technology Procede de transmission de donnees d'image et installation correspondante
DE102011103378B3 (de) 2011-06-03 2012-08-23 Dallmeier Electronic Gmbh & Co. Kg Überwachungseinrichtung
EP2662827B1 (de) 2012-05-08 2016-01-13 Axis AB Videoanalyse
US9124778B1 (en) * 2012-08-29 2015-09-01 Nomi Corporation Apparatuses and methods for disparity-based tracking and analysis of objects in a region of interest
CN103700106A (zh) * 2013-12-26 2014-04-02 南京理工大学 一种基于分布式摄像机的多视角运动目标统计与定位方法
KR20160094655A (ko) * 2015-02-02 2016-08-10 주식회사 일리시스 복수의 고해상도 카메라들을 이용한 파노라마 영상 감시 시스템 및 그 방법
DE102022202620A1 (de) 2022-03-17 2023-09-21 Robert Bosch Gesellschaft mit beschränkter Haftung Überwachungsanordnung zur Darstellung von bewegten Objekten Überwachungsbereich, Verfahren zur Darstellung von einem bewegten Objekt in einem Überwachungsbereich, Computerprogramm und Speichermedium


Family Cites Families (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE3640129A1 (de) * 1986-11-25 1988-06-01 Philips Patentverwaltung Schaltungsanordnung zur simultanen wiedergabe aus einer anzahl von quellen uebertragener bildsignale
US6327381B1 (en) * 1994-12-29 2001-12-04 Worldscape, Llc Image transformation and synthesis methods
DE19639728C2 (de) * 1996-09-26 1998-12-24 Siemens Ag Video-Überwachungseinrichtung
US6069655A (en) * 1997-08-01 2000-05-30 Wells Fargo Alarm Services, Inc. Advanced video security system
US7131136B2 (en) * 2002-07-10 2006-10-31 E-Watch, Inc. Comprehensive multi-media surveillance and response system for aircraft, operations centers, airports and other commercial transports, centers and terminals
US7023913B1 (en) * 2000-06-14 2006-04-04 Monroe David A Digital security multimedia sensor
DE19956266A1 (de) * 1999-11-23 2001-06-21 Schuetz Dich Entwicklungs & Ve Überwachungsanlage
US6704545B1 (en) 2000-07-19 2004-03-09 Adc Telecommunications, Inc. Point-to-multipoint digital radio frequency transport
DE10042935B4 (de) * 2000-08-31 2005-07-21 Industrie Technik Ips Gmbh Verfahren zum Überwachen eines vorbestimmten Bereichs und entsprechendes System
DE10049366A1 (de) * 2000-10-05 2002-04-25 Ind Technik Ips Gmbh Verfahren zum Überwachen eines Sicherheitsbereichs und entsprechendes System
AU2002307545A1 (en) * 2001-04-20 2002-11-05 Corp. Kewazinga Navigable camera array and viewer therefore
WO2002093916A2 (en) * 2001-05-14 2002-11-21 Elder James H Attentive panoramic visual sensor
DE10261501A1 (de) 2002-12-23 2004-07-15 Mobotix Ag Verfahren zur Datenreduktion
US20050146606A1 (en) * 2003-11-07 2005-07-07 Yaakov Karsenty Remote video queuing and display system

Patent Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5200818A (en) * 1991-03-22 1993-04-06 Inbal Neta Video imaging system with interactive windowing capability
US6011901A (en) * 1995-05-18 2000-01-04 Timepres Corporation Compressed digital video record and playback system
US6072903A (en) * 1997-01-07 2000-06-06 Kabushiki Kaisha Toshiba Image processing apparatus and image processing method
US6147709A (en) * 1997-04-07 2000-11-14 Interactive Pictures Corporation Method and apparatus for inserting a high resolution image into a low resolution interactive image to produce a realistic immersive experience
US6215519B1 (en) * 1998-03-04 2001-04-10 The Trustees Of Columbia University In The City Of New York Combined wide angle and narrow angle imaging system and method for surveillance and monitoring
US20030081813A1 (en) * 1999-05-26 2003-05-01 Brian Astle Motion tracking using image-texture templates
US6810152B2 (en) * 2001-01-11 2004-10-26 Canon Kabushiki Kaisha Image processing apparatus, method of processing images, and storage medium
US20030053658A1 (en) * 2001-06-29 2003-03-20 Honeywell International Inc. Surveillance system and methods regarding same
US7139410B2 (en) * 2001-10-30 2006-11-21 Denso Corporation Apparatus for protecting occupant in vehicle
US7319776B2 (en) * 2001-11-26 2008-01-15 Fujitsu Limited Image processing method and image processing program
US7257236B2 (en) * 2002-05-22 2007-08-14 A4Vision Methods and systems for detecting and recognizing objects in a controlled wide area
US20040183679A1 (en) * 2003-03-17 2004-09-23 Paximadis John Matthew Thermal signature intensity alarmer

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2091253A1 (de) * 2006-11-07 2009-08-19 Sony Corporation Sendeeinrichtung, videosignalsendeverfahren in einer sendeeinrichtung, empfangseinrichtung und videosignalempfangsverfahren in einer empfangseinrichtung
US20100269137A1 (en) * 2006-11-07 2010-10-21 Sony Corporation Transmission device, video signal transmission method for transmission device, reception device, and video signal reception method for reception device
EP2091253A4 (de) * 2006-11-07 2010-11-17 Sony Corp Sendeeinrichtung, videosignalsendeverfahren in einer sendeeinrichtung, empfangseinrichtung und videosignalempfangsverfahren in einer empfangseinrichtung
US9143637B2 (en) 2006-11-07 2015-09-22 Sony Corporation Transmission device, video signal transmission method for transmission device, reception device, and video signal reception method for reception device
US20110187536A1 (en) * 2010-02-02 2011-08-04 Michael Blair Hopper Tracking Method and System
EP2659433A2 (de) * 2010-12-30 2013-11-06 Pelco, Inc. Inferenzmaschine für auf videoanalysemetadaten basierende ereigniserkennung und forensische suche
EP2659433A4 (de) * 2010-12-30 2014-04-02 Pelco Inc Inferenzmaschine für auf videoanalysemetadaten basierende ereigniserkennung und forensische suche
US9226037B2 (en) 2010-12-30 2015-12-29 Pelco, Inc. Inference engine for video analytics metadata-based event detection and forensic search
CN102176246A (zh) * 2011-01-30 2011-09-07 西安理工大学 一种多相机目标接力跟踪系统的相机接力关系确定方法

Also Published As

Publication number Publication date
US7801331B2 (en) 2010-09-21
US20090067674A1 (en) 2009-03-12
DE10310636A1 (de) 2004-09-30
DE502004006047D1 (de) 2008-03-13
EP1614080A1 (de) 2006-01-11
JP2006520144A (ja) 2006-08-31
EP1614080B1 (de) 2008-01-23
WO2004081895A1 (de) 2004-09-23
DE112004000009D2 (de) 2005-02-03
JP5230933B2 (ja) 2013-07-10
ATE385013T1 (de) 2008-02-15

Similar Documents

Publication Publication Date Title
US7801331B2 (en) Monitoring device
KR101002066B1 (ko) 추적감시용 카메라 장치 및 이를 채용하는 원격 감시 시스템
US6850282B1 (en) Remote control of image sensing apparatus
EP1585332A1 (de) Fernvideoanzeigeverfahren, videoerfassungseinrichtung, verfahren dafür und programm dafür
US20080079809A1 (en) Surveillance camera and surveillance camera system with laser positioning function
WO2007015631A1 (en) Smart video monitoring system and method communicating with auto-tracking radar system
KR100822017B1 (ko) Cctv를 이용한 지능형 감시시스템 및 지능형 감시방법
KR101120131B1 (ko) 지능형 광역 감시 카메라, 그 제어회로 및 제어방법, 이를 이용한 영상 감시 시스템
KR101502448B1 (ko) 좌우 360도 상하 360도의 화각을 가지는 영상감시 시스템 및 감시방법
KR20070041492A (ko) 비디오 플래시 라이트를 수행하기 위한 방법 및 시스템
JP2006333133A (ja) 撮像装置、撮像方法、プログラム、プログラム記録媒体並びに撮像システム
KR20130130544A (ko) 감시 영상 표시 방법 및 시스템
US7239344B1 (en) Camera and device for switching optical filters
JP4865587B2 (ja) 設置型撮像装置
JPH0822586A (ja) 多地点監視システム
US6982749B2 (en) Remote control camera system and image transmission method
WO1997039580A1 (en) Imaging system
US20030071896A1 (en) Multiple camera arrangement
EP1595392B1 (de) Kamera mit panorama- und/oder neigungsfunktionalität
KR20030063810A (ko) 인터넷을 이용한 감시 및 관리 시스템용 중계장치
US20130155232A1 (en) Surveillance camera, surveillance system and method for configuring an or the surveillance camera
WO2008037127A1 (en) Monitoring camera and monitoring system having function of laser positioning
JP5230933B6 (ja) 監視装置
JPH0715646A (ja) ネットワーク対応カメラ
KR20100008881U (ko) 이미지센서 및 투광기를 이용하여 차량의 앞유리 상부의 햇빛 차단막을 스크린으로 하여 측후 방의 영상을 보여주는영상전달장치의 구조 및 동작 방식

Legal Events

Date Code Title Description
AS Assignment

Owner name: MOBOTIX AG, GERMANY

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:HINKEL, RALF;BORCHERS, KLAUS;REEL/FRAME:016664/0314

Effective date: 20050927

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION