WO2010057732A1 - Device for generating and/or processing an object signature, monitoring device, method and computer program product - Google Patents
- Publication number
- WO2010057732A1 (PCT/EP2009/063666)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- feature
- feature space
- object signature
- signature
- identification
- Prior art date
Links
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/52—Surveillance or monitoring of activities, e.g. for recognising suspicious objects
Definitions
- the invention relates to a device for generating and/or processing an object signature of an object from a scene, the object signature being designed for describing and/or identifying the object. The device comprises a feature extraction device which is adapted to extract at least one feature value of the object from the scene, the feature value being representable in a feature space of the feature, and a coding device which is adapted to encode the feature value into an identification datum, the identification datum forming part of the object signature.
- the invention further relates to a monitoring device, a method and a related computer program.
- Video surveillance systems in known embodiments have a plurality of surveillance cameras that monitor even complex surveillance areas.
- the image data streams from the monitoring areas are combined in some embodiments and evaluated centrally.
- automated monitoring using image-processing algorithms has meanwhile become established.
- a common procedure here is to separate moving objects from the (essentially static) scene background, to track them over time, and to detect relevant events.
- document DE 10 2005 053 148 A1, which is probably the closest prior art, discloses a method for handling content information in a video surveillance system. It proposes providing content information on individual objects in a network and distributing it over the network. The content information is organized in different hierarchy levels, which can be read independently of one another. It is additionally proposed to encode the content information so that content information from lower hierarchy levels is condensed into overview information at higher hierarchy levels; this overview information of the higher hierarchy levels can then be used advantageously when searching for relevant events, in order to restrict the review of video archives to short relevant periods.
- the invention relates to a device which is suitable and/or designed for generating and/or processing, in particular for comparing, an object signature of an object from a scene.
- the object is preferably formed as a moving object such as a person, a car or the like.
- a scene is understood to be a section of a surveillance area which is observed, or observable, with sensors.
- the sensors are preferably designed as cameras, alternatively or additionally, other sensors, such as temperature sensors, odor sensors, microphones, etc. may be used.
- the object signature is a data collection which is designed for the description and / or identification of the object, in particular in the context of a recognition or a comparison.
- the device comprises a feature extraction device which can extract at least one feature value of the object from the scene; preferably, several feature values are extracted.
- the feature value can be represented in a feature space of the feature, wherein the feature space is formed by a plurality of possible feature values for the feature. Each feature may thus assume a plurality of such feature values in the feature space.
- for example, the feature space may be formed as a color histogram, or be spanned by structural features, SIFT features, sounds, smells, etc.
- the device further comprises a coding device designed to encode the feature value into an identification datum, in particular a dimensionless identification datum, the identification datum forming part of the object signature.
- the identification datum refers to a subregion of the feature space of the feature.
- the identification datum is thus not immediately descriptive of the feature, but represents a transformed or mapped value.
- the feature value itself is not integrated in the object signature. Instead, only a reference to the subregion that includes the feature value is stored in the object signature.
- This implementation has the advantage that a very efficient and compact description of objects is ensured. Instead of the sometimes very complex feature values, only the references to the subregions containing the feature values are passed on, comparable to pointers in C.
- the identification data are each formed as a multi-bit word.
- depending on the complexity of the feature space and the number of subregions to be considered in it, an 8-bit, 16-bit or 32-bit word is used as the identification datum, which limits the memory or transfer requirement for the associated feature in the object signature to this word length.
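As an illustration (not part of the patent disclosure), the choice of word length for the identification datum follows directly from the number of subregions; a minimal Python sketch with hypothetical names:

```python
def id_word_length(num_subregions: int) -> int:
    """Pick the smallest of the 8-, 16- or 32-bit word sizes that can
    enumerate all subregions of a feature space."""
    for bits in (8, 16, 32):
        if num_subregions <= 1 << bits:
            return bits
    raise ValueError("more subregions than a 32-bit word can index")

# A feature space with 200 clusters fits in one byte per feature;
# 70 000 clusters would need a 32-bit identification datum.
print(id_word_length(200))     # 8
print(id_word_length(70_000))  # 32
```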
- the subregion is preferably selected such that it combines several feature values. In particular, subregions of a feature space may include different numbers of feature values.
- the object signature preferably comprises not only an identification datum of a single feature, but a plurality of identification data assigned to different features.
- the nature of the features is not limited in this case: they may be features describing optical properties of the object, such as color, brightness or texture, its movement, such as speed or acceleration, acoustic features, such as footsteps, or odors, etc.
- it is also possible to use electromagnetic radiation, such as emissions from mobile phones and the like, as features in the object signature and to refer to them via the identification data.
- the device comprises a subregion generator, which is designed to generate the feature space and/or the subregions in the feature space.
- the subregions may completely cover the feature space, but it is preferred that they be spaced apart or arranged disjointly in the feature space.
- the generation of the feature space and/or the subregions may take place in a training phase, for example offline in advance, or online during the runtime of the device. It is preferably ensured that the feature space associated with each feature is usefully divided into subregions or clusters.
- a meaningful clustering can be achieved, for example, by recording many test data (objects observed in a scene) and using them to build and cluster the feature space.
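The clustering method itself is left open by the text; as a hedged illustration only, a naive one-dimensional k-means over recorded test data could build such subregions (a toy sketch, not the disclosed method):

```python
def kmeans_1d(values, k, iters=20):
    """Naive 1-D k-means: cluster scalar feature values (e.g. from test
    recordings) into k subregions, returning the cluster centres."""
    # Spread the initial centres over the sorted test data.
    centres = sorted(values)[:: max(1, len(values) // k)][:k]
    for _ in range(iters):
        buckets = [[] for _ in centres]
        for v in values:
            i = min(range(len(centres)), key=lambda i: abs(v - centres[i]))
            buckets[i].append(v)
        # Move each centre to the mean of its bucket (keep empty ones).
        centres = [sum(b) / len(b) if b else c
                   for b, c in zip(buckets, centres)]
    return sorted(centres)

# Test data from two well-separated groups yields two cluster centres.
data = [0.9, 1.0, 1.1, 4.9, 5.0, 5.1]
print(kmeans_1d(data, k=2))
```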
- the coding device is designed such that an object receives the identification datum of the subregion that lies closest to the feature value in the feature space.
- the feature space can also be extended during the runtime of the device by defining new subregions when a feature value lies too far from the existing subregions.
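A minimal sketch of how such a coding device might assign cluster IDs and open new subregions at runtime; the scalar distance and the threshold parameter are hypothetical simplifications:

```python
def assign_cluster(centres, value, max_dist=None):
    """Coding-device sketch: map a feature value to the ID (index) of the
    nearest subregion centre.  If the value lies farther than max_dist
    from every centre, open a new subregion at runtime, as the text
    suggests, and return its fresh ID."""
    best = min(range(len(centres)), key=lambda i: abs(value - centres[i]))
    if max_dist is not None and abs(value - centres[best]) > max_dist:
        centres.append(value)          # extend the feature space
        return len(centres) - 1        # ID of the new cluster
    return best

centres = [1.0, 5.0]
print(assign_cluster(centres, 1.2))              # nearest existing cluster
print(assign_cluster(centres, 9.0, max_dist=2))  # opens a new cluster
```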
- the subregion generator is configured to generate a similarity measure between the subregions of a single feature space.
- the similarity measure is an evaluation of the similarity between two subregions.
- The advantage of the similarity measure lies in the fact that the result of a comparison between a current object and a reference or comparison object depends not only on the identity of the object signatures, but also on their similarity.
- the similarity measure is formed as a similarity graph in which the subregions are connected as nodes or clusters via edges.
- the similarity of two subregions is determined, for example, by the number of intermediate nodes between the two subregions and/or by the length of the connecting edges, in each case preferably along the shortest path.
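The hop-count variant of this similarity measure can be sketched with a breadth-first search over the similarity graph (illustrative only; function and variable names are assumptions):

```python
from collections import deque

def graph_distance(edges, a, b):
    """Similarity-graph sketch: subregions are nodes, similar subregions
    are joined by edges; the dissimilarity of two cluster IDs is the hop
    count of the shortest path (fewer hops = more similar)."""
    adj = {}
    for u, v in edges:
        adj.setdefault(u, set()).add(v)
        adj.setdefault(v, set()).add(u)
    seen, queue = {a}, deque([(a, 0)])
    while queue:
        node, d = queue.popleft()
        if node == b:
            return d
        for nxt in adj.get(node, ()):
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, d + 1))
    return None  # disconnected: no similarity statement possible

# Chain 1-2-3-4: clusters 1 and 4 are three hops apart, 1 and 2 only one.
edges = [(1, 2), (2, 3), (3, 4)]
print(graph_distance(edges, 1, 4))  # 3
print(graph_distance(edges, 1, 2))  # 1
```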
- the device comprises a comparison device designed to compare a first object signature having first identification data with a second object signature having second identification data on the basis of the similarity measure.
- in some embodiments the features are treated equally; in modified embodiments, the features may also be weighted, so that more significant features carry more weight in the comparison than less significant features.
- Another object of the invention relates to a monitoring device, in particular a video surveillance system with a plurality of surveillance cameras, which are or can be arranged in a network for monitoring a surveillance area, and which comprises at least one device according to one of the preceding claims, the object signatures being transmitted via the network.
- here, the advantage according to the invention of minimizing the data to be transmitted in the object signature is particularly evident.
- preferably, a data memory is provided in which the feature spaces or the subregions, their mapping to the identification data, and/or the similarity measure are stored.
- during a comparison, the data memory can be accessed and the similarity between two object signatures can thus be determined.
- Another object of the invention relates to a method for generating and/or processing and/or searching or retrieving an object signature with the features of claim 11, which is preferably carried out on a device or monitoring device according to one of the preceding claims.
- the presented method allows efficient generation, processing and comparison of object signatures.
- a last subject of the invention relates to a computer program having the features of claim 12.
- Figure 1 is a schematic block diagram of a monitoring device as an embodiment of the invention
- Figure 2 is an illustration of a feature space.
- FIG. 1 shows a schematic block diagram of a monitoring system 1, which is used to monitor a complex surveillance area.
- the monitoring system 1 comprises a plurality of monitoring devices 2, which are networked together via a network 3.
- the network 3 can be of any desired design, e.g. wireless or wired, such as LAN, WLAN, etc.
- the monitoring devices 2 each have sensors 4, which detect part of the surveillance area as a scene.
- the sensors 4 are designed as surveillance cameras.
- the monitoring devices 2 with the sensors 4 may be designed as so-called smart cameras, since the invention supports the use of low-performance devices.
- One function of the surveillance system 1 is to track moving objects, such as persons, in the surveillance area. For this purpose, it is necessary to recognize moving objects that have been detected by a first monitoring device 2 when they are detected by another monitoring device 2.
- For the recognition, an object signature is created for each detected object in each monitoring device 2 and compared with object signatures of other monitoring devices 2 distributed over the network 3. If the current object signature matches an object signature of another monitoring device 2, the object is considered recognized. If no matching or corresponding object signature can be found, the object is considered new to the surveillance area. This situation may occur, for example, at entrance areas of the surveillance area.
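As a hedged illustration of the identity-based recognition step just described, before any similarity comparison, a signature lookup might read as follows (names and the tuple representation are assumptions, not from the patent):

```python
def recognize(current, known):
    """Recognition sketch: an object signature is a tuple of cluster IDs.
    If any monitoring device already reported an identical signature,
    the object is recognized; otherwise it is treated as new."""
    for obj_id, signature in known.items():
        if signature == current:
            return obj_id
    return None  # new object, e.g. one entering at an entrance area

known = {"person-17": (3, 12, 7), "car-02": (9, 1, 4)}
print(recognize((3, 12, 7), known))  # "person-17"
print(recognize((5, 5, 5), known))   # None: object is new
```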
- the sensor data of the sensors 4, in particular the images or image sequences of the surveillance cameras, are passed to a feature extraction device 5, which detects features of a current moving object.
- The feature extraction device 5 comprises means for object detection and tracking 6, which, as mentioned above, first separate moving objects from the essentially static scene background and track them over time. Starting from these detected objects, optical or kinematic features, such as color, brightness, texture, or else velocity or acceleration, are then extracted, with each of these features being assigned a feature value for the current object.
- the feature extraction device 5 may also include other modules 7 for the extraction of other features, such as acoustic features, etc.
- the feature values of the features are transferred to a coding device 8, which assigns an identification datum in the form of a cluster ID to each feature value of a feature.
- For the explanation of the cluster ID, reference is made to FIG. 2, which shows a schematic representation of a cluster diagram intended to illustrate a feature space.
- the feature space can, as shown, be two-dimensional, but can also be three- or higher-dimensional.
- possible or detected feature values are entered there and subsequently combined into clusters 1, 2, ...
- a meaningful clustering can be achieved, for example, by recording many test data and using this test data to set up and cluster the feature space.
- the feature space can also be extended during runtime by forming new clusters. Depending on how many clusters are present in a feature space, the size of the cluster ID (8-bit, 16-bit or 32-bit) of a cluster, and thus the storage or transfer requirement for this feature type in the object signature, is specified.
- a current feature value of a current object 9 is assigned to the cluster closest in feature space.
- the feature value 9 marked with an asterisk is assigned to the cluster ID 1 and the feature value 10 marked by a cross is assigned to the cluster ID 4.
- For the assignment of the current feature values to the cluster IDs, the monitoring device 2 has a data memory 11 for storing the feature space and the clusters.
- the object signature now consists largely or even exclusively of the cluster IDs, each designed as a multi-bit word. Thus, the majority of the information remains in the data memory 11, and only the references to that information are stored in the object signature. This implements a very efficient and compact description of objects.
- the monitoring device comprises a comparison device 12, which compares an object signature transmitted via the network 3 with the current object signature. Ideally, it would be sufficient to compare only the cluster IDs of the object signatures in order to identify identical objects. In practice, however, this procedure is susceptible to errors, because the feature values vary or are noisy. These variations can arise, for example, from changed recording conditions between the cameras.
- to compensate for this, the feature space in FIG. 2 is extended by a similarity graph, and the distance between different cluster IDs is given in the form of a path length d or the number of nodes to be traversed in the similarity graph.
- the similarity graph is stored in the data memory 11 or in a central memory in the network 3. In the event that the feature space and/or the similarity graph is changed during runtime, the changes are passed on to the other data memories 11 of the other monitoring devices 2 or to the central memory or memories.
- a similarity comparison between two object signatures takes place in addition to or instead of an identity comparison.
- For this purpose, the similarity is first calculated for each feature of the two object signatures. Subsequently, all similarities of all features of the object signatures are cumulated in order to obtain a global statement about the similarity of the objects. The accumulation takes place, for example, via a summation with different weightings of the features. With this procedure, a very robust recognition of objects is possible.
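The cumulation by weighted summation can be sketched as follows; the toy per-feature similarity function and the weights are hypothetical, standing in for similarities derived from the similarity graph:

```python
def signature_similarity(sig_a, sig_b, sims, weights):
    """Fuse per-feature similarities into a global statement: for each
    feature position, 'sims' gives the similarity of two cluster IDs,
    and the weighted sum cumulates them across all features."""
    return sum(w * sim(a, b)
               for a, b, w, sim in zip(sig_a, sig_b, weights, sims))

# Toy similarity: identical IDs score 1, neighbouring IDs 0.5, else 0.
def toy_sim(a, b):
    return 1.0 if a == b else 0.5 if abs(a - b) == 1 else 0.0

score = signature_similarity((3, 7), (3, 8), sims=(toy_sim, toy_sim),
                             weights=(2.0, 1.0))
print(score)  # 2*1.0 + 1*0.5 = 2.5
```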
- the method may be applied to any features, with the subsequent fusion of the similarity statement for each feature allowing robust recognition of objects.
- in the case of video surveillance as monitoring system 1, it is possible, for example, for the feature values to be calculated per frame of a camera and assigned to a subregion or cluster in order to determine the cluster ID. In order to stabilize the object signature, only those cluster IDs can be used which are determined more often, i.e. in several frames.
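This frame-based stabilisation can be sketched in a few lines; the minimum-frame threshold is a hypothetical choice, not specified in the text:

```python
from collections import Counter

def stable_cluster_ids(per_frame_ids, min_frames=3):
    """Stabilisation sketch: keep only the cluster IDs that were
    determined in at least min_frames frames of the camera."""
    counts = Counter(per_frame_ids)
    return {cid for cid, n in counts.items() if n >= min_frames}

frames = [4, 4, 4, 4, 9, 4, 9]  # cluster ID per frame for one feature
print(stable_cluster_ids(frames))  # {4}: ID 9 appeared in too few frames
```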
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Multimedia (AREA)
- Theoretical Computer Science (AREA)
- Image Analysis (AREA)
- Closed-Circuit Television Systems (AREA)
Abstract
The invention relates to a device (2) for generating and/or processing a signature of an object from a scene, the object signature being designed for describing and/or identifying the object. The device comprises a feature extraction device (5) designed to extract at least one feature value of the object from the scene, the feature value being representable in a feature space of the feature; and a coding device (8) designed to encode the feature value into an identification datum, the identification datum forming part of the object signature and referring to a subregion of the feature space.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN2009801465836A CN102224510A (zh) | 2008-11-21 | 2009-10-19 | 用于产生和/或处理对象签名的设备、监控设备、方法和计算机程序 |
EP09755853.0A EP2359308B1 (fr) | 2008-11-21 | 2009-10-19 | Dispositif de production et/ou de traitement d'une signature d'objet, dispositif de contrôle, procédé et produit-programme |
US12/922,225 US8670598B2 (en) | 2008-11-21 | 2009-10-19 | Device for creating and/or processing an object signature, monitoring device, method and computer program |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE102008043953.3 | 2008-11-21 | ||
DE102008043953A DE102008043953A1 (de) | 2008-11-21 | 2008-11-21 | Vorrichtung zur Erzeugung und/oder Verarbeitung einer Objektsignatur, Überwachungsvorrichtung, Verfahren und Computerprogramm |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2010057732A1 true WO2010057732A1 (fr) | 2010-05-27 |
Family
ID=42034511
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/EP2009/063666 WO2010057732A1 (fr) | 2008-11-21 | 2009-10-19 | Dispositif de production et/ou de traitement d'une signature d'objet, dispositif de contrôle, procédé et produit-programme |
Country Status (5)
Country | Link |
---|---|
US (1) | US8670598B2 (fr) |
EP (1) | EP2359308B1 (fr) |
CN (1) | CN102224510A (fr) |
DE (1) | DE102008043953A1 (fr) |
WO (1) | WO2010057732A1 (fr) |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8542981B2 (en) * | 2008-05-20 | 2013-09-24 | Honeywell International Inc. | Manual voice annotations for CCTV reporting and investigation |
US8953836B1 (en) * | 2012-01-31 | 2015-02-10 | Google Inc. | Real-time duplicate detection for uploaded videos |
DE102020205724A1 (de) | 2020-05-06 | 2021-11-11 | Robert Bosch Gesellschaft mit beschränkter Haftung | Verfahren, Computerprogramm, Speichermedium und Vorrichtung zur Erweiterung von Objektsignaturdaten |
Family Cites Families (22)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
SE502658C2 (sv) * | 1994-02-28 | 1995-12-04 | Non Stop Info Ab | Förfarande och kontrollanordning för avläsning av identitets -och värdehandlingar. |
US6590996B1 (en) * | 2000-02-14 | 2003-07-08 | Digimarc Corporation | Color adaptive watermarking |
US6298153B1 (en) * | 1998-01-16 | 2001-10-02 | Canon Kabushiki Kaisha | Digital signature method and information communication system and apparatus using such method |
US6583813B1 (en) * | 1998-10-09 | 2003-06-24 | Diebold, Incorporated | System and method for capturing and searching image data associated with transactions |
US6507912B1 (en) * | 1999-01-27 | 2003-01-14 | International Business Machines Corporation | Protection of biometric data via key-dependent sampling |
JP3679953B2 (ja) * | 1999-09-14 | 2005-08-03 | 富士通株式会社 | 生体情報を用いた個人認証システム |
US6483927B2 (en) * | 2000-12-18 | 2002-11-19 | Digimarc Corporation | Synchronizing readers of hidden auxiliary data in quantization-based data hiding schemes |
US20090231436A1 (en) * | 2001-04-19 | 2009-09-17 | Faltesek Anthony E | Method and apparatus for tracking with identification |
JP4177598B2 (ja) * | 2001-05-25 | 2008-11-05 | 株式会社東芝 | 顔画像記録装置、情報管理システム、顔画像記録方法、及び情報管理方法 |
US7113633B2 (en) * | 2001-07-02 | 2006-09-26 | Photoinaphoto.Com, Inc. | System and method for discovering and categorizing attributes of a digital image |
US20050063596A1 (en) * | 2001-11-23 | 2005-03-24 | Yosef Yomdin | Encoding of geometric modeled images |
CN1168044C (zh) * | 2001-12-13 | 2004-09-22 | 中国科学院自动化研究所 | 基于步态的远距离身份识别方法 |
JP2004048267A (ja) * | 2002-07-10 | 2004-02-12 | Sharp Corp | リライタブルメディアの改竄防止署名方法、この方法を実行する改竄防止署名装置、この装置を備えた改竄防止署名システム、この方法を実現するための改竄防止署名プログラムおよびこの改竄防止署名プログラムを記録したコンピュータ読み取り可能な記録媒体 |
JP2004348674A (ja) * | 2003-05-26 | 2004-12-09 | Noritsu Koki Co Ltd | 領域検出方法及びその装置 |
TWI288873B (en) * | 2004-02-17 | 2007-10-21 | Mitsubishi Electric Corp | Method for burying watermarks, method and device for inspecting watermarks |
CN100341732C (zh) * | 2004-11-03 | 2007-10-10 | 上海杰得微电子有限公司 | 一种基于人脸认证技术的汽车防盗方法 |
JP4772544B2 (ja) * | 2005-04-27 | 2011-09-14 | 富士フイルム株式会社 | 撮像装置、撮像方法、及びプログラム |
US7542610B2 (en) * | 2005-05-09 | 2009-06-02 | Like.Com | System and method for use of images with recognition analysis |
US7945099B2 (en) * | 2005-05-09 | 2011-05-17 | Like.Com | System and method for use of images with recognition analysis |
US7519200B2 (en) * | 2005-05-09 | 2009-04-14 | Like.Com | System and method for enabling the use of captured images through recognition |
US7970164B2 (en) * | 2005-08-04 | 2011-06-28 | Nippon Telegraph And Telephone Corporation | Digital watermark padding method, digital watermark padding device, digital watermark detecting method, digital watermark detecting device, and program |
DE102005053148B4 (de) | 2005-11-04 | 2022-12-22 | Robert Bosch Gmbh | Verfahren zur Handhabung von Inhaltsinformationen |
-
2008
- 2008-11-21 DE DE102008043953A patent/DE102008043953A1/de not_active Withdrawn
-
2009
- 2009-10-19 CN CN2009801465836A patent/CN102224510A/zh active Pending
- 2009-10-19 EP EP09755853.0A patent/EP2359308B1/fr active Active
- 2009-10-19 US US12/922,225 patent/US8670598B2/en active Active
- 2009-10-19 WO PCT/EP2009/063666 patent/WO2010057732A1/fr active Application Filing
Non-Patent Citations (3)
Title |
---|
ANTANI S ET AL: "A survey on the use of pattern recognition methods for abstraction, indexing and retrieval of images and video", PATTERN RECOGNITION, ELSEVIER, GB, vol. 35, no. 4, 1 April 2002 (2002-04-01), pages 945 - 965, XP004329464, ISSN: 0031-3203 * |
SERGIO VELASTIN & PAOLO REMAGNINO: "Intelligent Distributed Video Surveillance Systems", INTELLIGENT DISTRIBUTED SURVEILLANCE SYSTEMS, IEE, LONDON, GB, 1 January 2006 (2006-01-01), pages 1 - 30, XP008081222 * |
THEODORIDIS S ET AL: "Pattern Recognition, PASSAGE", 1 January 1999, PATTERN RECOGNITION, SAN DIEGO, CA : ACADEMIC PRESS, US, PAGE(S) 403-441,484-486, ISBN: 9780126861402, XP002575794 * |
Also Published As
Publication number | Publication date |
---|---|
EP2359308B1 (fr) | 2018-07-25 |
DE102008043953A1 (de) | 2010-05-27 |
EP2359308A1 (fr) | 2011-08-24 |
US20110019871A1 (en) | 2011-01-27 |
US8670598B2 (en) | 2014-03-11 |
CN102224510A (zh) | 2011-10-19 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
DE112017001311T5 (de) | System und Verfahren zum Trainieren eines Objektklassifikators durch maschinelles Lernen | |
DE102019113830A1 (de) | Informationsverarbeitungsverfahren, Informationsverarbeitungsgerät und Programm | |
DE102007010186A1 (de) | Vorrichtung, Verfahren und Computerprogramm zur bildgestützten Verfolgung von Überwachungsobjekten | |
DE4430016A1 (de) | System und Verfahren zur Bildauswertung | |
DE102017219282A1 (de) | Verfahren und Vorrichtung zum automatischen Erzeugen eines künstlichen neuronalen Netzes | |
DE102020117544A1 (de) | System und verfahren zur verarbeitung von videodaten aus dem archiv | |
EP2483834B1 (fr) | Methode et appareil pour la reconnaissance d'une detection fausse d'un objet dans un image | |
WO2010057732A1 (fr) | Dispositif de production et/ou de traitement d'une signature d'objet, dispositif de contrôle, procédé et produit-programme | |
DE112021005703T5 (de) | Informationsverarbeitungseinrichtung und informationsverarbeitungsverfahren | |
EP3584748A1 (fr) | Procédé de génération d'un ensemble de données d'essai, procédé d'essai, procédé de fonctionnement d'un système, dispositif, système de commande, produit programme d'ordinateur, support lisible par ordinateur, production et utilisation | |
DE112020007472T5 (de) | Lernnutzungssystem, nutzungsvorrichtung, lernvorrichtung, programm und lernnutzungsverfahren | |
DE102020207449A1 (de) | Verfahren, Computerprogramm und Vorrichtung zum Verarbeiten von Signalen | |
DE102009048118B4 (de) | Verfahren und Vorrichtung zum Verwalten von Objektansichtsdaten in einer Objektdatenbank | |
DE102010002312A1 (de) | Verfahren und Vorrichtung zur Analyse eines Bildes einer Bilderfassungseinrichtung für ein Fahrzeug | |
DE102018209898A1 (de) | Verfahren zur Bestimmung von zueinander korrespondierenden Bildpunkten, SoC zur Durchführung des Verfahrens, Kamerasystem mit dem SoC, Steuergerät und Fahrzeug | |
DE102018201909A1 (de) | Verfahren und Vorrichtung zur Objekterkennung | |
WO2021180547A1 (fr) | Procédé et dispositif de traitement d'images | |
DE102018201914A1 (de) | Verfahren zum Anlernen eines Modells zur Personen-Wiedererkennung unter Verwendung von Bildern einer Kamera und Verfahren zum Erkennen von Personen aus einem angelernten Modell zur Personen-Wiedererkennung durch eine zweite Kamera eines Kameranetzwerkes | |
DE102018208604A1 (de) | Ermitteln eines Aufnahmeverhaltens einer Aufnahmeeinheit | |
DE102009039568A1 (de) | Verfahren und Vorrichtung zur Erkennung von Objekten | |
WO2024099797A1 (fr) | Procédé d'apprentissage d'un réseau neuronal pour déterminer les caractéristiques d'objets à des fins de suivi d'objet | |
DE102020206350A1 (de) | Verfahren zur Detektion von Vergleichspersonen zu einer Suchperson, Überwachungsanordnung, insbesondere zur Umsetzung des Verfahrens, sowie Computerprogramm und computerlesbares Medium | |
DE102015011926A1 (de) | Verfahren zum Betrieb eines Kamerasystems in einem Kraftfahrzeug und Kraftfahrzeug | |
DE102022202583A1 (de) | Verfahren zum Überwachen eines Beobachtungsbereichs | |
DE102022201526A1 (de) | Verfahren zur Wiedererkennung von einem Verfolgungsobjekt, Überwachungsanordnung, Computerprogramm sowie Speichermedium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 200980146583.6 Country of ref document: CN |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 09755853 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2009755853 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 12922225 Country of ref document: US |