EP0749619A1 - System and method for image evaluation - Google Patents
- Publication number
- EP0749619A1 (application EP95911286A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- image
- image data
- objects
- data
- predetermined
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Classifications
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B13/00—Burglar, theft or intruder alarms
- G08B13/18—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
- G08B13/189—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
- G08B13/194—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
- G08B13/196—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
- G08B13/19602—Image analysis to detect motion of the intruder, e.g. by frame subtraction
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
- G06T7/246—Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B13/00—Burglar, theft or intruder alarms
- G08B13/18—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
- G08B13/189—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
- G08B13/194—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
- G08B13/196—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
- G08B13/19602—Image analysis to detect motion of the intruder, e.g. by frame subtraction
- G08B13/19604—Image analysis to detect motion of the intruder, e.g. by frame subtraction involving reference image or background adaptation with time to compensate for changing conditions, e.g. reference image update on detection of light level change
Definitions
- The invention relates to a system and a method for image evaluation according to the preambles of claims 1 and 20, respectively.
- A method for image evaluation is known in which a video signal supplied by a surveillance television camera is digitized by means of an A/D converter, stored in a correspondingly dimensioned image memory and compared with the digitized video signal of a subsequent image. From the difference between the two items of image information, even changes of the smallest extent can be determined and localized very precisely, the achievable resolution being determined by the camera used.
- However, the high information content of the image information obtained in this way results in a low reaction speed to changes, owing to the limited processing speed of the processors normally used.
- DE-PS 36 28 816 discloses a system for image evaluation of the type mentioned in the introduction, in which a predetermined number of sample values belonging to a fixed subfield are combined to form a total value which is stored in a reference memory. For each new full image, the difference between the current and the stored total values of corresponding subfields is formed and compared with a threshold value. If the threshold is exceeded, an alarm is triggered for the corresponding subfield.
- Since, beyond the threshold comparison, no further information about the change in the image content is evaluated in the known motion detectors, the problem arises that image changes due to changing lighting conditions (sun, clouds, reflections on reflecting surfaces, etc.), oscillations of the mounting masts of the surveillance camera or other irrelevant movements (snow, birds, etc.) lead to frequent false alarms.
- The invention is therefore based on the object of providing a system and a method for image evaluation by means of which an alarm triggering in the event of irrelevant image changes is reliably avoided.
- FIG. 1 is a block diagram of a system for image evaluation according to a first embodiment
- FIG. 3 shows an object list updated after correlation of the current objects with the stored objects and supplemented after object tracking
- FIG. 4 is a schematic representation of the camera assembly and a vertical building in the field of view
- FIG. 5 shows an image generated by means of the camera shown in FIG. 4 to show perspective size changes
- FIG. 6 shows a block diagram of a system for image evaluation according to a second exemplary embodiment
- FIG. 9 is a block diagram of a system for image evaluation according to a third exemplary embodiment.
- The block diagram shown in FIG. 1 shows a first exemplary embodiment of a system for image evaluation.
- The video signal supplied by a camera 1 is digitized by means of an analog/digital converter 2.
- The camera 1 can be a conventional video surveillance camera, but also an infrared or thermal imaging camera, for generating the video signal.
- The digitized pixel data can optionally be combined by a reduction stage 3 to reduce the amount of data, as described for example in DE-PS 36 28 816, by adding individual pixel data in groups to form new pixel data, or can be stored directly in a first image memory 4 (LIVE memory).
- The reduced image, or the digitized image in full resolution, is periodically stored in a second image memory 5 (REFERENCE memory).
- A difference image generating device 6, which can be implemented by a signal processor or a hard-wired arithmetic circuit, forms for each new image the difference between corresponding, possibly reduced, pixel data of the "LIVE image" and the "REFERENCE image".
- The difference image generating device 6 comprises a subtraction part 60, an absolute-value formation part 61 and a threshold comparator part 62. After the absolute value has been formed, the difference between the pixel data of a given pixel is compared with a threshold value, which represents the decision threshold for a pixel change. This "sensitivity threshold" eliminates changes caused by signal noise.
- A binary value "1" is written into a binary image memory 7 if the threshold value is exceeded.
- The pixels with the binary value "1" thus represent marked pixels in which an image change was found.
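The thresholded difference stage (parts 60 to 62) can be sketched in a few lines of Python; the grid values and the threshold below are invented for illustration:

```python
# Sketch of the difference-image stage: subtract REFERENCE from LIVE,
# take the absolute value, and mark pixels whose change exceeds the
# sensitivity threshold with binary "1".
def binary_difference(live, reference, threshold):
    """Return a binary image: 1 where |live - ref| > threshold, else 0."""
    return [
        [1 if abs(l - r) > threshold else 0 for l, r in zip(lrow, rrow)]
        for lrow, rrow in zip(live, reference)
    ]

live = [[10, 10, 200], [10, 10, 10]]
ref  = [[10, 12, 10],  [10, 10, 10]]
print(binary_difference(live, ref, threshold=5))
# Small differences (signal noise, |10 - 12| <= 5) are suppressed;
# only the strongly changed pixel is marked.
```

Raising the threshold suppresses more noise but also small genuine changes, which is exactly the trade-off the later auto-parameterization addresses.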
- An object extractor 8, which can be implemented, for example, by a microprocessor, examines the binary image for connected marked pixels, all connected pixels being assigned to a so-called object. Accordingly, an object corresponds to a connected image area that has changed within a certain time period, which depends on the storage cycle of the second image memory.
- Object data of the extracted objects are stored in an object list.
- The objects are defined, for example, as a rectangle or the like circumscribing the maximum horizontal and vertical extent of the marked pixel area.
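A minimal sketch of the object extraction described above, assuming 4-connectivity and plain Python grids; the field names mirror the object list of FIG. 2, but the data are invented:

```python
def extract_objects(binary):
    """Group 4-connected marked pixels into objects; return bounding data."""
    h, w = len(binary), len(binary[0])
    seen = [[False] * w for _ in range(h)]
    objects = []
    for y in range(h):
        for x in range(w):
            if binary[y][x] == 1 and not seen[y][x]:
                stack, pixels = [(x, y)], []
                seen[y][x] = True
                while stack:                      # flood fill one object
                    cx, cy = stack.pop()
                    pixels.append((cx, cy))
                    for nx, ny in ((cx+1,cy),(cx-1,cy),(cx,cy+1),(cx,cy-1)):
                        if 0 <= nx < w and 0 <= ny < h \
                           and binary[ny][nx] == 1 and not seen[ny][nx]:
                            seen[ny][nx] = True
                            stack.append((nx, ny))
                xs = [p[0] for p in pixels]
                ys = [p[1] for p in pixels]
                objects.append({
                    "M":  ((min(xs)+max(xs))/2, (min(ys)+max(ys))/2),  # center
                    "B":  max(xs) - min(xs) + 1,  # width of bounding rectangle
                    "H":  max(ys) - min(ys) + 1,  # height of bounding rectangle
                    "Px": len(pixels),            # number of marked pixels
                })
    return objects

print(extract_objects([[1, 1, 0, 0], [0, 0, 0, 1]]))
```

The two separated pixel groups come out as two objects, each with its circumscribing rectangle, matching the two-object situation of FIG. 2.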
- FIG. 2 shows a memory section of the binary image memory 7 in the form of a coordinate system in which the marked pixels are designated by "X".
- Two marked pixel areas that have changed with respect to the reference image have been extracted as objects 1 and 2.
- The extracted objects are rectangular, their heights H1, H2 and widths B1, B2 depending on the extent of the marked pixel regions.
- FIG. 2 also shows an object list in which the coordinates x, y of the object center M, the object height H, the object width B and the number Px of the binary marked pixels are stored for each extracted object.
- The current object list is compared with a stored object list of the previous image and updated by means of an object correlator 9, which can likewise be implemented by a microprocessor.
- The objects extracted from the current binary image are assigned to the objects found in the previous image by means of a plausibility check, such as a check for a minimum distance; objects to which no current object has been assigned for a certain period of time are deleted again.
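The correlation by minimum-distance plausibility check could be sketched as follows; the maximum assignment distance is an assumed parameter, not a value from the patent:

```python
import math

def correlate(current, stored, max_dist=5.0):
    """Assign each current object to the nearest stored object of the
    previous image, provided its center lies within max_dist; objects
    with no partner (None) are treated as newly appeared."""
    assignments = {}
    for i, cur in enumerate(current):
        best, best_d = None, max_dist
        for j, old in enumerate(stored):
            d = math.hypot(cur["M"][0] - old["M"][0],
                           cur["M"][1] - old["M"][1])
            if d < best_d:
                best, best_d = j, d
        assignments[i] = best
    return assignments

# One object moved slightly; one appeared far from any stored object.
print(correlate([{"M": (2, 0)}, {"M": (20, 20)}],
                [{"M": (3.5, -1.5)}]))
```

A production correlator would additionally compare shape features (height, width, pixel count) before accepting an assignment, as the plausibility check in the text suggests.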
- An object tracking unit 11 calculates a vector which results from the difference between the detection point, namely the center of a new object, and the stored center point M(x; y) of the associated correlated object of the previous image. From the calculated vector, a distance covered s is determined as the magnitude of the vector, along with horizontal and vertical direction components R_H and R_V, and an average speed v using the previous lifetime T of the object in question.
- FIG. 3 shows an object list updated after the correlation, which was supplemented by the data calculated by means of the object tracking unit.
- The current detection center of an object is represented by the coordinates x_n, y_n, and the last stored center of the object by the coordinates x_{n-1}, y_{n-1}. The values H_{n-1}, B_{n-1} and Px_{n-1} indicate the last stored height, width and number of marked pixels of the object.
- The updated object list is supplemented by the determined values for the magnitude of the movement vector s, the average speed v, the previous lifetime T and the movement direction components R_H and R_V.
- Object 1 has, for example, a current detection center (2; 0).
- The last stored center has the coordinates (3.5; -1.5). According to the Pythagorean theorem, this results in a distance of s = √((2 - 3.5)² + (0 - (-1.5))²) = √4.5 ≈ 2.12 pixel units.
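The tracking computation, including the worked Pythagorean example with centers (2; 0) and (3.5; -1.5), can be reproduced as:

```python
import math

def track(current_center, previous_center, T):
    """Motion data from the current center (x_n, y_n) and the stored
    center (x_{n-1}, y_{n-1}); T is the object's previous lifetime."""
    dx = current_center[0] - previous_center[0]   # horizontal component R_H
    dy = current_center[1] - previous_center[1]   # vertical component R_V
    s = math.hypot(dx, dy)                        # distance covered
    return {"s": s, "v": s / T, "RH": dx, "RV": dy}

m = track((2, 0), (3.5, -1.5), T=1)
print(round(m["s"], 2))  # ≈ 2.12, as in the worked example
```

With T measured in frame periods, v comes out in pixels per frame; scaling to real-world units would require the camera geometry of FIG. 4.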
- A subsequent feature extraction unit 12, which can again be implemented by means of a microprocessor, reads the image data in the area of the alarm-relevant object rectangles from the first image memory 4 and extracts image content features for the object in this image section according to known image processing methods.
- This feature extraction only occurs for alarm-relevant objects, i.e. for objects that have a predetermined direction, size, speed, etc.
- The size of the extracted rectangle and the number of marked pixels found within the rectangle are used to determine the object size.
- In an alarm object checking unit 13, all features of the extracted and tracked objects are compared with a predetermined list of required feature criteria; an alarm is only triggered, and the video signal switched to a monitor 14, when all the criteria are met, the alarm objects being faded in with the associated vectors.
- The differentiation of alarm-relevant changes from non-relevant ones is carried out, among other things, with the aid of the object size.
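The alarm check of unit 13 amounts to a conjunction of feature criteria; the criterion names and thresholds below are invented placeholders, not values from the patent:

```python
# Each criterion maps a feature of a tracked object to a pass/fail check.
CRITERIA = {
    "min_height": lambda o: o["H"] >= 8,    # object height in pixels
    "min_speed":  lambda o: o["v"] >= 0.5,  # average speed
    "direction":  lambda o: o["RH"] > 0,    # e.g. only left-to-right motion
}

def is_alarm_relevant(obj):
    """Trigger an alarm only if every required criterion is met."""
    return all(check(obj) for check in CRITERIA.values())

print(is_alarm_relevant({"H": 12, "v": 1.0, "RH": 3}))  # True
print(is_alarm_relevant({"H": 12, "v": 0.1, "RH": 3}))  # False: too slow
```

Because every criterion must hold, a single implausible feature (wrong direction, too slow, too small) is enough to suppress the alarm, which is how the irrelevant changes from the introduction are filtered out.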
- The real height and width of an object at a certain distance from the camera can be calculated from the geometry of the camera mounting (see FIG. 4).
- FIG. 5 shows the video image resulting from the mounting of the surveillance camera K shown in FIG. 4.
- From the camera geometry, the expected size in pixels of an object of given real dimensions, e.g. 40 × 10 pixels, can be calculated for each y-coordinate.
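One plausible way to model this perspective size change is a linear scale between a horizon row and the bottom image row; this is an assumption for illustration, not the patent's stated formula, and all numeric values are invented:

```python
def expected_pixel_size(y, y_horizon, y_bottom, size_at_bottom):
    """Linear perspective model (assumption): an object's apparent size
    shrinks linearly from the bottom image row toward the horizon row."""
    scale = (y - y_horizon) / (y_bottom - y_horizon)
    w, h = size_at_bottom
    return (w * scale, h * scale)

# An object that appears 40 x 10 pixels at the bottom row (y = 200)
# is expected to appear half that size halfway up to the horizon (y = 40).
print(expected_pixel_size(y=120, y_horizon=40, y_bottom=200,
                          size_at_bottom=(40, 10)))
```

An extracted object whose rectangle deviates strongly from the expected size at its y-coordinate can then be discarded as not alarm-relevant.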
- FIG. 6 shows a second exemplary embodiment of the system for image evaluation, in which the above-mentioned problem can be avoided.
- The evaluation device 10 evaluates the difference image directly with regard to predetermined features that determine the alarm relevance, such as the object size, and generates an alarm signal in the event of an alarm-relevant change, whereupon the video signal is switched to the monitor 14.
- In the present motion detector, the image parts in which object sizes do not decrease in perspective (see dashed areas in FIG. 5) are marked by a one-time adjustment process.
- The adjustment process is carried out by the operator by means of an input device 21, such as a keyboard, a mouse or a light pen, the video image being switched to the monitor.
- The marking takes place by moving, enlarging or reducing the dashed areas, which are brought into line with the desired image sections.
- In the marked areas, the object size is determined without taking into account the perspective reduction.
- For this purpose, the coordinates of the marked difference image points are compared with the coordinates of the marked image sections.
- Such a correction of the object size determination can of course also be used in the system according to the first exemplary embodiment shown in FIG. 1, the correction of the perspective size change in this case taking place in the feature extraction unit 12.
- Such a development of the system is shown in the block diagram of FIG. 7.
- Data of striking, unchangeable image contents, such as contours that are independent of the lighting conditions, are stored in a read-only memory 17 and compared with the current image information in a comparison device 16, either immediately after switch-on or continuously at specific time intervals.
- If a deviation is found, a warning signal is forwarded via a signal line 20 to a display device on the monitor 14 for displaying the change in the camera position.
- The contours are marked, for example, after the surveillance camera 1 has been installed.
- A contour image is generated by means of a contour circuit 15, to which the digitized data of the memory 4 are supplied, all contours exhibiting a high contrast jump being displayed in color.
- The contour image prepared in this way is fed to the monitor 14 via a signal line 19.
- The marking is preferably done by input via an input device 18, such as a keyboard, a mouse or a light pen, since the system cannot independently recognize which contours are unchangeable. For example, the contours of a parked vehicle must not be marked as reference contours.
- FIG 8 shows an example of a contour image generated in this way, in which marked reference contour sections K 1 and K 2 are shown in dashed lines.
- Two contour sections arranged at right angles to one another are marked in order to ensure reliable detection of every change in position.
- FIG. 9 shows a block diagram of a third exemplary embodiment of the system for image evaluation, in which the false alarm frequency can be reduced by a so-called auto-parameterization.
- The programming of the feature criteria of the evaluation device 10, or of the threshold value of the threshold comparator part 62 of the difference image generating device 6, is adapted to the current environmental influences.
- In this way, the detection sensitivity is matched to the current environmental influences such as snow, rain, fog, passing clouds or the like.
- Under calm conditions, for example, the threshold would be reduced, thereby increasing the detection sensitivity.
- Characteristic features that are detected by the pre-evaluation device 22 are, for example, many small objects that move in a preferred direction in the case of snowfall, or a movement that occurs only in a limited area in the case of shadows from trees, etc.
- The auto-parameterization therefore makes it possible to adapt the sensitivity and the object evaluation criteria of the motion detector to the prevailing environmental conditions in such a way that the criteria for alarm relevance are set higher in the case of poor visibility or a strongly moving image background.
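A toy sketch of such auto-parameterization; the busy-scene heuristic and all numeric values are invented for illustration:

```python
def adapt_threshold(base_threshold, n_small_moving_objects, busy_limit=50):
    """Adapt the pixel-change threshold to the scene activity reported
    by a pre-evaluation stage (e.g. many small objects during snowfall)."""
    if n_small_moving_objects > busy_limit:
        # Busy background: demand larger changes to suppress false alarms.
        return base_threshold * 2
    # Calm scene: lower the threshold, i.e. increase detection sensitivity.
    return max(base_threshold // 2, 1)

print(adapt_threshold(10, 120))  # busy scene -> 20
print(adapt_threshold(10, 3))    # calm scene -> 5
```

The same idea applies to the object criteria: the minimum object size or speed required for alarm relevance can be raised when the pre-evaluation reports a strongly moving background.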
- A motion detector and a motion detection method are disclosed, according to which a difference image, obtained from a digitized video signal of a surveillance camera 1 and in which changing pixels are marked, is examined for connected pixels (objects) by means of an object extractor 8.
- The extracted objects are assigned to stored objects of a previous image, and the object data obtained are then checked for alarm relevance with regard to predetermined object features. An alarm is triggered when the object features are fulfilled.
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE4407528A DE4407528C2 (de) | 1994-03-07 | 1994-03-07 | Bewegungsmelder und Verfahren zur Bewegungsmeldung |
DE4407528 | 1994-03-07 | ||
PCT/EP1995/000783 WO1995024702A1 (de) | 1994-03-07 | 1995-03-03 | System und verfahren zur bildauswertung |
Publications (1)
Publication Number | Publication Date |
---|---|
EP0749619A1 true EP0749619A1 (de) | 1996-12-27 |
Family
ID=6512061
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP95911286A Withdrawn EP0749619A1 (de) | 1994-03-07 | 1995-03-03 | System und verfahren zur bildauswertung |
Country Status (4)
Country | Link |
---|---|
EP (1) | EP0749619A1 (de) |
FI (1) | FI963495A (de) |
NO (1) | NO963737L (de) |
WO (1) | WO1995024702A1 (de) |
Families Citing this family (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE19603766A1 (de) * | 1996-02-02 | 1997-08-07 | Christian Gieselmann | Verfahren zum Erkennen gerichteter Bewegungen sowie Alarmanlage zur Durchführung des Verfahrens |
DE19603935A1 (de) * | 1996-02-03 | 1997-08-07 | Bosch Gmbh Robert | Verfahren zum Erkennen von Änderungen in einem Überwachungsbereich |
EP0967584B1 (de) * | 1998-04-30 | 2004-10-20 | Texas Instruments Incorporated | Automatische Videoüberwachungsanlage |
DK1079350T3 (da) * | 1999-07-17 | 2004-02-02 | Siemens Building Tech Ag | Indretning til rumovervågning |
NZ571839A (en) | 2006-03-29 | 2010-05-28 | Univ Curtin Tech | Testing surveillance camera installations |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
FR2538653B1 (fr) * | 1982-05-28 | 1989-03-31 | Thomson Csf | Procede d'estimation des translations subies par des objets representes dans une sequence d'images et dispositif mettant en oeuvre ce procede |
DE3628816C1 (en) * | 1986-08-25 | 1987-11-19 | Ind Technik Ips Gmbh | Method and device for automatically checking a video signal |
JPH0335399A (ja) * | 1989-06-30 | 1991-02-15 | Toshiba Corp | 変化領域統合装置 |
-
1995
- 1995-03-03 WO PCT/EP1995/000783 patent/WO1995024702A1/de not_active Application Discontinuation
- 1995-03-03 EP EP95911286A patent/EP0749619A1/de not_active Withdrawn
-
1996
- 1996-09-06 NO NO963737A patent/NO963737L/no not_active Application Discontinuation
- 1996-09-06 FI FI963495A patent/FI963495A/fi not_active Application Discontinuation
Non-Patent Citations (1)
Title |
---|
See references of WO9524702A1 * |
Also Published As
Publication number | Publication date |
---|---|
NO963737D0 (no) | 1996-09-06 |
WO1995024702A1 (de) | 1995-09-14 |
FI963495A (fi) | 1996-10-10 |
NO963737L (no) | 1996-09-06 |
FI963495A0 (fi) | 1996-09-06 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
DE4407528C2 (de) | Bewegungsmelder und Verfahren zur Bewegungsmeldung | |
AT506928B1 (de) | Verfahren zur videoanalyse | |
DE69815977T2 (de) | Gegen globale veränderungen unempfindlicher videobewegungsdetektor | |
DE3634628C2 (de) | ||
DE69635980T2 (de) | Verfahren und vorrichtung zur detektierung von objektbewegung in einer bilderfolge | |
AT502551B1 (de) | Verfahren und bildauswertungseinheit zur szenenanalyse | |
EP0815539B1 (de) | Verfahren zur erkennung bewegter objekte in zeitlich aufeinander folgenden bildern | |
DE10301469A1 (de) | Infrarotbildverarbeitungsvorrichtung | |
DE4332753C2 (de) | Verfahren zur Erkennung bewegter Objekte | |
EP1346330B1 (de) | Video-rauchdetektionssystem | |
DE10042935B4 (de) | Verfahren zum Überwachen eines vorbestimmten Bereichs und entsprechendes System | |
DE3214254A1 (de) | Verfahren zum erkennen von bewegungen in video-kamera-bildern | |
EP0543148B1 (de) | Verfahren und Vorrichtung zum Erkennen von Veränderungen im Bildinhalt eines Videobildes | |
EP0777864B1 (de) | System und verfahren zur bildauswertung | |
EP0749619A1 (de) | System und verfahren zur bildauswertung | |
DE10049366A1 (de) | Verfahren zum Überwachen eines Sicherheitsbereichs und entsprechendes System | |
DE69233637T2 (de) | Bildanalysator | |
EP0710927B1 (de) | Verfahren zur objektorientierten Erkennung bewegter Objekte | |
DE19641000C2 (de) | Verfahren und Anordnung zur automatischen Erkennung der Anzahl von Personen in einer Personenschleuse | |
JPH05300516A (ja) | 動画処理装置 | |
DE19749136C2 (de) | Verfahren und Vorrichtung zum Erfassen von Bewegungen | |
EP3453001A1 (de) | Detektionsvorrichtung, verfahren zur detektion eines ereignisses und computerprogramm | |
EP0515890B1 (de) | Verfahren zur Überwachung von Gelände | |
DE102004020467A1 (de) | Videoüberwachungs- und Alarmsystem |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase |
Free format text: ORIGINAL CODE: 0009012 |
|
AK | Designated contracting states |
Kind code of ref document: A1 Designated state(s): AT BE CH DE DK ES FR GB GR IE IT LI LU MC NL PT SE |
|
17P | Request for examination filed |
Effective date: 19961126 |
|
17Q | First examination report despatched |
Effective date: 19990222 |
|
18D | Application deemed to be withdrawn |
Effective date: 19990907 |
|
D18D | Application deemed to be withdrawn (deleted) | ||
19U | Interruption of proceedings before grant |
Effective date: 19991123 |
|
19W | Proceedings resumed before grant after interruption of proceedings |
Effective date: 20001201 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: EXAMINATION IS IN PROGRESS |
|
RAP1 | Party data changed (applicant data changed or rights of an application transferred) |
Owner name: SEISMA AG |
|
18D | Application deemed to be withdrawn |
Effective date: 20031001 |