EP2201524A2 - Procédé pour créer et/ou actualiser des textures de modèles d'objets d'arrière-plan, système de vidéosurveillance pour la mise en œuvre du procédé, et programme informatique - Google Patents
Procédé pour créer et/ou actualiser des textures de modèles d'objets d'arrière-plan, système de vidéosurveillance pour la mise en œuvre du procédé, et programme informatique
- Publication number
- EP2201524A2 (application EP08804058A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- background
- background image
- scene
- textures
- object models
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T15/00—3D [Three Dimensional] image rendering
- G06T15/04—Texture mapping
Definitions
- The invention relates to a method for generating and/or updating textures of background object models in a three-dimensional scene model of a surveillance scene with background objects, as well as to a control device and a video surveillance system for carrying out the method, and to a computer program.
- Video surveillance systems are used for camera-based monitoring of relevant areas and usually comprise a plurality of surveillance cameras, which are set up in the relevant areas for recording surveillance scenes.
- The surveillance scenes can be, for example, parking lots, intersections, streets and squares, but also areas in buildings, factories, hospitals or the like.
- The image data streams recorded by the surveillance cameras are brought together in a monitoring center, where they are evaluated either automatically or by monitoring personnel.
- German patent application DE 10001252 A1 proposes a monitoring system which enables more efficient work with a monitoring system through an object-oriented presentation. To this end, the signals of the cameras used for each selected view are decomposed into objects and then passed to a display, with artificial objects being added and other objects deleted.
- The invention creates a possibility of displaying surveillance scenes at least partially in a virtual reality or a partially virtual reality in the form of a three-dimensional scene model, whereby a particularly realistic representation of the surveillance scene can be achieved by generating and/or updating textures of background object models in the three-dimensional scene model.
- The method thus allows an, in particular, real surveillance scene to be mapped.
- The surveillance scene can be streets, intersections, squares or areas in buildings, factories, prisons, hospitals, etc.
- The background objects are preferably defined as static and/or quasi-static objects, which do not change or change only slowly and which are mapped onto the background object models.
- Typical static objects are buildings, trees, panels, etc.
- The quasi-static objects include, for example, shadows, parked cars or the like.
- The static objects preferably have a retention period in the surveillance scene of more than several months; the quasi-static objects, in contrast, preferably of more than one or several days.
- The three-dimensional scene model includes the background object models, each of which is formed as a three-dimensional model.
- The three-dimensional scene model is "walkable", so that a user can move between the background object models in the three-dimensional scene model and/or change the view by adjusting the viewing direction or viewing angle. Optionally, the scene model comprises an overlap hierarchy (Z hierarchy) of the background object models.
- The background object models and optionally the rest of the background have textures, with textures preferably being in the form of colors, hatchings, patterns and/or features of the surface of the background objects.
- A background image of the surveillance scene is formed from one or more camera images of the surveillance scene, wherein it is preferably provided that foreground objects or other interfering objects are masked out or suppressed.
- The background image can be formed with the same extent as the camera images, i.e. identical in terms of the columns and rows of pixels.
- Alternatively, the background image is a section of one or more camera images. It is also possible for the background image to have an arbitrary edge contour, so that, for example, a background image represents exactly one background object.
- The background image is projected onto the scene model.
- The background image is mapped in such a way that a pixel of a background object coincides with a corresponding model point of the background object model.
- The projection can also take place pixel by pixel in the form of a mapping rule, wherein preferably only those pixels are mapped for which a corresponding model point is available.
- The textures of the background object models are each stored together with orientation information, so that the textures can in turn be distributed on the background object models in the representation of the scene model on a monitor or the like, in the same manner as in the projection.
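The patent does not prescribe a storage format for this orientation information. The sketch below is one possible reading, assuming texels are keyed by texture (UV) coordinates; the class name `ObjectTexture` and its methods are hypothetical.

```python
from dataclasses import dataclass, field

import numpy as np


@dataclass
class ObjectTexture:
    """Texture of a single background object model, stored together with
    orientation information (here: texture coordinates per texel), so that
    the texels can later be laid out on the model in the same way as they
    were obtained by the projection."""
    object_id: int
    texels: dict = field(default_factory=dict)  # (u, v) -> RGB value

    def set_texel(self, u: int, v: int, color) -> None:
        # Store or update the color observed at texture coordinate (u, v).
        self.texels[(u, v)] = np.asarray(color, dtype=np.uint8)

    def get_texel(self, u: int, v: int, default=(0, 0, 0)):
        # Return the stored color, or a neutral default for untextured texels.
        return self.texels.get((u, v), np.asarray(default, dtype=np.uint8))
```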
- The method makes it possible to provide a three-dimensional scene model with realistic textures, wherein the textures can be updated at regular or irregular intervals.
- The background image is formed by temporal filtering, for example by averaging, moving averaging or by elimination of foreground objects. It is also possible to form the median of several camera images or to cut out known objects. In principle, all known methods for creating a background image can be used.
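As a minimal sketch of the temporal filtering mentioned above, the following Python/NumPy function forms the background image as the per-pixel median over several camera images of the same view; averaging or a moving average would be implemented analogously.

```python
import numpy as np


def estimate_background(frames):
    """Form a background image by temporal filtering of several camera
    images of the same view; here the per-pixel median is used, which
    suppresses foreground objects that are only briefly visible.

    frames: sequence of HxWx3 uint8 images. Returns an HxWx3 uint8 image.
    """
    stack = np.stack(frames, axis=0).astype(np.float32)
    return np.median(stack, axis=0).astype(np.uint8)
```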
- The projection of the background image onto the scene model takes place using the parameters of a camera model of the surveillance camera from whose perspective the background image was created.
- With the parameters of the camera model it is possible to project a point in the coordinate system of the surveillance scene into the coordinate system of the camera image and vice versa.
- As camera model it is also possible to use a look-up table which provides a corresponding point in the coordinate system of the surveillance scene for each pixel in the camera image of the surveillance camera.
- With such an assignment rule between the surveillance scene and the camera image it is possible to project the background image resulting from the camera image onto the scene model in the correct position and/or perspective-corrected, so that misallocations are minimized.
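The patent leaves the concrete camera model open. The sketch below assumes a standard pinhole model with intrinsic matrix K and extrinsic pose (R, t) and shows the forward direction of the mapping; the function name is illustrative.

```python
import numpy as np


def project_scene_point(K, R, t, point_world):
    """Project a point from the coordinate system of the surveillance scene
    into the coordinate system of the camera image, using a pinhole camera
    model with intrinsic matrix K and extrinsic pose (R, t).

    Returns (u, v) pixel coordinates, or None if the point lies behind the
    camera. The inverse mapping (pixel -> model point) can be precomputed
    once per pixel and stored as the look-up table mentioned above.
    """
    p_cam = R @ np.asarray(point_world, dtype=float) + np.asarray(t, dtype=float)
    if p_cam[2] <= 0:
        return None
    u, v, w = K @ p_cam
    return u / w, v / w
```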
- Optionally, the background image is additionally corrected for distortions: on the one hand, distortions that arose unintentionally, e.g. as optical aberrations of the surveillance camera system, and on the other hand also intentional distortions, which are introduced for example by the use of 360° or fisheye cameras.
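One possible implementation of the correction of unwanted lens distortion, assuming a calibrated camera and using OpenCV's undistortion routine (a library choice not made by the patent); intentional distortions from 360° or fisheye optics would need a dedicated camera model and are not shown.

```python
import cv2
import numpy as np


def undistort_background(background, K, dist_coeffs):
    """Remove unwanted lens distortion from the background image before the
    projection. K is the 3x3 camera matrix, dist_coeffs the distortion
    coefficients (k1, k2, p1, p2[, k3]) of the calibrated camera."""
    return cv2.undistort(background,
                         np.asarray(K, dtype=np.float64),
                         np.asarray(dist_coeffs, dtype=np.float64))
```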
- A mutual occlusion of background object models is checked with the aid of a depth buffer, wherein pixels which would be assigned to a background object model that is concealed in the region of the corresponding model point are discarded.
- The depth buffer is based on a Z hierarchy as known from rendering.
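A depth buffer of this kind can be sketched as follows; the helper names are illustrative and not taken from the patent.

```python
import numpy as np


def build_depth_buffer(samples, height, width):
    """Build a depth buffer for one camera view: for every pixel only the
    smallest camera-space depth of all model points projecting there is kept.

    samples: iterable of ((u, v), depth) pairs obtained by projecting the
    background object models into the camera image."""
    zbuf = np.full((height, width), np.inf, dtype=np.float32)
    for (u, v), depth in samples:
        if depth < zbuf[v, u]:
            zbuf[v, u] = depth
    return zbuf


def pixel_is_visible(zbuf, u, v, depth, tol=1e-3):
    """A projected background pixel is kept only if its model point is the
    nearest surface at (u, v); otherwise it belongs to an occluded object
    model and is discarded."""
    return depth <= zbuf[v, u] + tol
```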
- The textures are formed on the basis of a plurality of camera images, which originate from a common surveillance camera with a common viewing angle or from different surveillance cameras with different viewing angles onto the surveillance scene.
- The camera images from the different angles are projected onto the scene model in the correct position in the manner described.
- Pixels of different background images which belong to a common texture point or a common texture of a background object model are merged in each case.
- The merging can be done, for example, by averaging.
- Optionally, a color balance of the pixels to be merged takes place.
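A minimal sketch of the merging with a simple color balance, assuming a per-camera gain factor as the balancing model; the patent does not specify how the color balance is computed.

```python
import numpy as np


def merge_texel(samples):
    """Merge pixel values from different background images that fall on the
    same texture point, here by averaging after a simple per-camera gain
    correction as a stand-in for the color balance mentioned above.

    samples: list of (rgb, gain) pairs, where gain is a 3-vector that
    roughly equalizes the color response of the contributing camera."""
    balanced = [np.asarray(rgb, np.float32) * np.asarray(gain, np.float32)
                for rgb, gain in samples]
    return np.clip(np.mean(balanced, axis=0), 0, 255).astype(np.uint8)
```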
- For monitoring scenes formed as surveillance areas, texture information from other sources, such as aerial photographs, can additionally be taken over.
- Another object of the invention relates to a video surveillance system which is connected and/or connectable to one or a plurality of surveillance cameras and which has a control device, wherein the control device is designed in terms of circuitry and/or programming to carry out the method described above and/or defined in the preceding claims.
- The video surveillance system is designed so that the described method preferably runs in the background at periodic intervals and thus keeps the textures up to date.
- A particular advantage of the video surveillance system is the fact that the textures can be created or updated on the basis of the camera images of the connected surveillance cameras.
- A final object of the invention relates to a computer program with program code means for carrying out all the steps of the described method when the program is executed on a computer and/or a video surveillance system.
- FIG. 1 is a flow chart illustrating a first embodiment of the method according to the invention;
- Figure 2 is a block diagram of a video surveillance system for performing the method.
- FIG. 1 shows, in a schematic flowchart, the sequence of a method for generating and/or updating textures of background object models in a three-dimensional scene model as an exemplary embodiment of the invention.
- As input, one or more video images 1 recorded by surveillance cameras 10 (Figure 2) are used. These video images 1 are converted in a first method step 2 into a background image with background pixels. The conversion is accomplished by methods known in image processing, such as averaging or median formation over multiple video images 1, clipping of known objects, long-term observation, or the like.
- In this way, a background image is generated which has as active pixels only background pixels from the one or more video images 1 and, optionally, deactivated pixels which are set at the positions of the video image 1 where an interfering object or foreground object is shown.
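One way to read this first method step in code, assuming NumPy and per-frame foreground masks supplied by some detector (both assumptions, not details given in the patent):

```python
import numpy as np


def masked_background(frames, foreground_masks):
    """Form a background image whose active pixels are averaged only from
    positions that are not covered by a detected interfering or foreground
    object; positions that are never free are returned as deactivated.

    frames: list of HxWx3 uint8 images; foreground_masks: list of HxW bool
    arrays (True where a foreground object was detected)."""
    stack = np.stack(frames).astype(np.float32)              # N x H x W x 3
    weights = (~np.stack(foreground_masks))[..., None].astype(np.float32)
    counts = weights.sum(axis=0)                             # H x W x 1
    summed = (stack * weights).sum(axis=0)
    background = np.divide(summed, counts,
                           out=np.zeros_like(summed), where=counts > 0)
    active = counts[..., 0] > 0                              # False = deactivated pixel
    return background.astype(np.uint8), active
```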
- The background image generated in this way is projected onto the scene model in a second method step 3.
- The scene model is designed as a three-dimensional scene model and has a multiplicity of background object models, for example representing buildings, furniture, streets or other stationary objects.
- The projection uses a camera model of the surveillance camera that supplied the video image 1 underlying the background image, whereby pixels of the background image in the image coordinate system are projected onto the respectively corresponding point of the three-dimensional scene model.
- In the process, distortions such as lens distortions or the like are corrected.
- In a third method step 4, a pixel-by-pixel check for occlusion from the camera's view takes place with the aid of a depth buffer.
- In this check it is determined whether a pixel of the background image, which was projected in method step 3 onto a background object model, is obscured by another background object model and/or a real, e.g. dynamic, object in the current camera view.
- If the test assesses the inspected pixel as hidden, it is discarded and no longer used. Otherwise, the pixel, i.e. the projected video pixel or background pixel, is used for creating and/or updating the textures.
- The textures 6 are created and output on the basis of the transferred background pixels.
- Several pixels of different background images which, after the projection, overlap at least in sections and thus relate to the same regions of the same background object models are merged into a common background pixel.
- When merging into a common background pixel, a color match can also be performed, for example.
- Any remaining gaps in the scene model may be filled with static textures, e.g. taken from aerial photographs.
- FIG. 2 shows a video surveillance system 100 which is designed to carry out the method described in connection with Figure 1.
- The video surveillance system is connected, wirelessly or by wire, to a plurality of surveillance cameras 10 for signal transmission and is designed, for example, as a computer system.
- The surveillance cameras 10 are directed at relevant areas which show surveillance scenes in the form of squares, intersections or the like.
- The image data streams of the surveillance cameras 10 are transferred to a background module 20, which is designed to carry out the first method step 2 in Figure 1.
- The background image(s) generated are forwarded to a projection module 30, which is designed to carry out the second method step 3.
- The projected background images are passed to a masking module 40, which is implemented for carrying out the third method step 4.
- The textures 6 are created or updated on the basis of the checked background images and forwarded to a texture memory 60.
- Based on the stored data, the three-dimensional scene model is displayed on a display unit 70, such as a monitor, as a virtual representation.
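Assuming the modules are realized in software, the module chain of Figure 2 could be wired roughly as follows; the class and attribute names are illustrative, and the reference numerals refer only to the figure, not to any real API.

```python
from dataclasses import dataclass, field
from typing import Callable, Dict, Iterable


@dataclass
class TextureUpdatePipeline:
    """Illustrative wiring of the modules of Figure 2: a background module
    (20), a projection module (30) and a masking module (40) feeding a
    texture memory (60). The stages are injected as callables, so the
    sketch does not presume any concrete implementation."""
    background_module: Callable      # camera images -> background image
    projection_module: Callable      # background image -> projected pixels
    masking_module: Callable         # projected pixels -> visible pixels
    texture_memory: Dict[int, dict] = field(default_factory=dict)

    def run_once(self, frames: Iterable) -> None:
        # One pass of the periodically running texture update described above.
        background = self.background_module(frames)
        projected = self.projection_module(background)
        for object_id, uv, color in self.masking_module(projected):
            self.texture_memory.setdefault(object_id, {})[uv] = color
```

A caller would instantiate the class with concrete implementations of the three stages and invoke `run_once` at the periodic update interval mentioned above.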
Landscapes
- Engineering & Computer Science (AREA)
- Computer Graphics (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Closed-Circuit Television Systems (AREA)
- Processing Or Creating Images (AREA)
- Image Analysis (AREA)
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE102007048857A DE102007048857A1 (de) | 2007-10-11 | 2007-10-11 | Verfahren zur Erzeugung und/oder Aktualisierung von Texturen von Hintergrundobjektmodellen, Videoüberwachungssystem zur Durchführung des Verfahrens sowie Computerprogramm |
PCT/EP2008/062093 WO2009049973A2 (fr) | 2007-10-11 | 2008-09-11 | Procédé pour créer et/ou actualiser des textures de modèles d'objets d'arrière-plan, système de vidéosurveillance pour la mise en œuvre du procédé, et programme informatique |
Publications (1)
Publication Number | Publication Date |
---|---|
EP2201524A2 true EP2201524A2 (fr) | 2010-06-30 |
Family
ID=40435390
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP08804058A Pending EP2201524A2 (fr) | Procédé pour créer et/ou actualiser des textures de modèles d'objets d'arrière-plan, système de vidéosurveillance pour la mise en œuvre du procédé, et programme informatique
Country Status (5)
Country | Link |
---|---|
US (1) | US20100239122A1 (fr) |
EP (1) | EP2201524A2 (fr) |
CN (1) | CN101999139A (fr) |
DE (1) | DE102007048857A1 (fr) |
WO (1) | WO2009049973A2 (fr) |
Families Citing this family (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101924748A (zh) * | 2009-06-11 | 2010-12-22 | 鸿富锦精密工业(深圳)有限公司 | 数字内容系统 |
EP2478708A2 (fr) * | 2009-09-18 | 2012-07-25 | Logos Technologies Inc. | Procédé pour la compression d'une image aérienne par la prédiction de cette image à partir d'un modèle de profondeur du terrain |
DE102010003336A1 (de) | 2010-03-26 | 2011-09-29 | Robert Bosch Gmbh | Verfahren zur Visualisierung von Aktivitätsschwerpunkten in Überwachungsszenen |
DE102012205130A1 (de) * | 2012-03-29 | 2013-10-02 | Robert Bosch Gmbh | Verfahren zum automatischen Betreiben einer Überwachungsanlage |
DE102012211298A1 (de) | 2012-06-29 | 2014-01-02 | Robert Bosch Gmbh | Anzeigevorrichtung für ein Videoüberwachungssystem sowie Videoüberwachungssystem mit der Anzeigevorrichtung |
CN105023274A (zh) * | 2015-07-10 | 2015-11-04 | 国家电网公司 | 输配电线路基建现场立体安全防护方法 |
US10419788B2 (en) * | 2015-09-30 | 2019-09-17 | Nathan Dhilan Arimilli | Creation of virtual cameras for viewing real-time events |
CN105787988B (zh) * | 2016-03-21 | 2021-04-13 | 联想(北京)有限公司 | 一种信息处理方法、服务器及终端设备 |
CN106204595B (zh) * | 2016-07-13 | 2019-05-10 | 四川大学 | 一种基于双目摄像机的机场场面三维全景监视方法 |
TWI622024B (zh) * | 2016-11-22 | 2018-04-21 | Chunghwa Telecom Co Ltd | 智慧影像式監測告警裝置 |
KR102676837B1 (ko) | 2016-12-16 | 2024-06-21 | 삼성전자주식회사 | 디스플레이장치 및 그 제어방법 |
CN111383340B (zh) * | 2018-12-28 | 2023-10-17 | 成都皓图智能科技有限责任公司 | 一种基于3d图像的背景过滤方法、装置及系统 |
US11430132B1 (en) | 2021-08-19 | 2022-08-30 | Unity Technologies Sf | Replacing moving objects with background information in a video scene |
CN117119148B (zh) * | 2023-08-14 | 2024-02-02 | 中南民族大学 | 基于三维场景的视频监控效果可视化评估方法和系统 |
Family Cites Families (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6924801B1 (en) * | 1999-02-09 | 2005-08-02 | Microsoft Corporation | Method and apparatus for early culling of occluded objects |
DE10001252B4 (de) | 2000-01-14 | 2007-06-14 | Robert Bosch Gmbh | Überwachungssystem |
US7148917B2 (en) * | 2001-02-01 | 2006-12-12 | Motorola Inc. | Method and apparatus for indicating a location of a person with respect to a video capturing volume of a camera |
US7161615B2 (en) * | 2001-11-30 | 2007-01-09 | Pelco | System and method for tracking objects and obscuring fields of view under video surveillance |
US6956566B2 (en) * | 2002-05-23 | 2005-10-18 | Hewlett-Packard Development Company, L.P. | Streaming of images with depth for three-dimensional graphics |
GB2392072B (en) * | 2002-08-14 | 2005-10-19 | Autodesk Canada Inc | Generating Image Data |
EP1576545A4 (fr) * | 2002-11-15 | 2010-03-24 | Sunfish Studio Llc | Systeme et procede de determination de surface visible, utilises en infographie, faisant appel a une analyse d'intervalle |
JP4307222B2 (ja) * | 2003-11-17 | 2009-08-05 | キヤノン株式会社 | 複合現実感提示方法、複合現実感提示装置 |
EP1705929A4 (fr) * | 2003-12-25 | 2007-04-04 | Brother Ind Ltd | Dispositif d'affichage d'images et dispositif de traitement de signaux |
US7542034B2 (en) * | 2004-09-23 | 2009-06-02 | Conversion Works, Inc. | System and method for processing video images |
JP4116648B2 (ja) * | 2006-05-22 | 2008-07-09 | 株式会社ソニー・コンピュータエンタテインメント | オクルージョンカリング方法および描画処理装置 |
US8009200B2 (en) * | 2007-06-15 | 2011-08-30 | Microsoft Corporation | Multiple sensor input data synthesis |
- 2007
- 2007-10-11 DE DE102007048857A patent/DE102007048857A1/de not_active Withdrawn
- 2008
- 2008-09-11 EP EP08804058A patent/EP2201524A2/fr active Pending
- 2008-09-11 CN CN2008801109019A patent/CN101999139A/zh active Pending
- 2008-09-11 WO PCT/EP2008/062093 patent/WO2009049973A2/fr active Application Filing
- 2008-09-11 US US12/682,069 patent/US20100239122A1/en not_active Abandoned
Non-Patent Citations (4)
Title |
---|
CAROLINA CRUZ-NEIRA ET AL: "The CAVE: audio visual experience automatic virtual environment", COMMUNICATIONS OF THE ACM, ASSOCIATION FOR COMPUTING MACHINERY, INC, UNITED STATES, vol. 35, no. 6, 1 June 1992 (1992-06-01), pages 64 - 72, XP058283534, ISSN: 0001-0782, DOI: 10.1145/129888.129892 * |
FAUGERAS O ET AL: "3-D Reconstruction of Urban Scenes from Image Sequences", COMPUTER VISION AND IMAGE UNDERSTAND, ACADEMIC PRESS, US, vol. 69, no. 3, 1 March 1998 (1998-03-01), pages 292 - 309, XP004448898, ISSN: 1077-3142, DOI: 10.1006/CVIU.1998.0665 * |
OTT R ET AL: "Advanced virtual reality technologies for surveillance and security applications", PROCEEDINGS - VRCIA 2006: ACM INTERNATIONAL CONFERENCE ON VIRTUAL REALITY CONTINUUM AND ITS APPLICATIONS - PROCEEDINGS - VRCIA 2006: ACM INTERNATIONAL CONFERENCE ON VIRTUAL REALITY CONTINUUM AND ITS APPLICATIONS 2006 ASSOCIATION FOR COMPUTING MACHINE, 2006, pages 163 - 170, DOI: 10.1145/1128923.1128949 * |
See also references of WO2009049973A2 * |
Also Published As
Publication number | Publication date |
---|---|
CN101999139A (zh) | 2011-03-30 |
US20100239122A1 (en) | 2010-09-23 |
WO2009049973A3 (fr) | 2010-01-07 |
DE102007048857A1 (de) | 2009-04-16 |
WO2009049973A2 (fr) | 2009-04-23 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
EP2201524A2 (fr) | Procédé pour créer et/ou actualiser des textures de modèles d'objets d'arrière-plan, système de vidéosurveillance pour la mise en œuvre du procédé, et programme informatique | |
DE102005061952A1 (de) | Verfahren und System zur Bestimmung einer Ungenauigkeitsinformation in einem Augmented Reality System | |
EP2766879A2 (fr) | Procédé d'intégration d'objets virtuels dans des affichages de véhicule | |
DE19825302A1 (de) | System zur Einrichtung einer dreidimensionalen Abfallmatte, welche eine vereinfachte Einstellung räumlicher Beziehungen zwischen realen und virtuellen Szeneelementen ermöglicht | |
DE102019116834B4 (de) | Augmentierte Fotoaufnahme | |
DE102011107458A1 (de) | Verfahren zum Evaluieren einer Objekterkennungseinrichtung eines Kraftfahrzeugs | |
DE102015220031A1 (de) | Verfahren zur Konfidenzabschätzung für optisch-visuelle Posenbestimmung | |
DE102004040372B4 (de) | Verfahren und Vorrichtung zur Darstellung einer dreidimensionalen Topographie | |
WO2006094637A1 (fr) | Procede de comparaison d'un objet reel a un modele numerique | |
DE19714915A1 (de) | Bilddarstellungsverfahren und Vorrichtung zur Durchführung des Verfahrens | |
DE102014219418B4 (de) | Verfahren zur Stereorektifizierung von Stereokamerabildern und Fahrerassistenzsystem | |
DE102015010264A1 (de) | Verfahren zur Erstellung einer 3D-Repräsentation und korrespondierende Bildaufnahmevorrichtung | |
WO2019121287A1 (fr) | Procédé de montage d'un système de tubes pour créer au moins une liaison par tube | |
EP3953862A1 (fr) | Procédé pour fournir une fonction de poursuite d'objet | |
DE102016006855B4 (de) | Verfahren zum Betreiben eines Anzeigesystems und Anzeigesystem | |
DE102013106280A1 (de) | Verfahren zur Visualisierung eines CAD-Modells | |
DE102015120927A1 (de) | Verfahren zur Darstellung einer Simulationsumgebung | |
WO2013178358A1 (fr) | Procédé de visualisation dans l'espace d'objets virtuels | |
DE102016011131B4 (de) | Rendern eines dreidimensionalen Objekts | |
DE10351577B4 (de) | Verfahren und Vorrichtung zum Ändern visueller Informationen in Bildern einer bewegten Bildfolge | |
DE102009001870A1 (de) | Bildverarbeitungsvorrichtung, Überwachungssystem mit der Bildverarbeitungsvorrichtung, Verfahren sowie Computerprogramm | |
DE202022104264U1 (de) | Kunstdarstellungssystem | |
DE102023102196A1 (de) | Computer-implementiertes verfahren und vorrichtung zum prüfen einer korrektheit eines zusammenbaus | |
DE102007001273A1 (de) | Verfahren zur selbsttätigen Analyse von Objektbewegungen | |
DE102015120929A1 (de) | Verfahren zur vorbereitenden Simulation eines militärischen Einsatzes in einem Einsatzgebiet |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| | PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase | Free format text: ORIGINAL CODE: 0009012 |
| | AK | Designated contracting states | Kind code of ref document: A2; Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MT NL NO PL PT RO SE SI SK TR |
| | AX | Request for extension of the european patent | Extension state: AL BA MK RS |
| | 17P | Request for examination filed | Effective date: 20100707 |
| | RBV | Designated contracting states (corrected) | Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MT NL NO PL PT RO SE SI SK TR |
| | DAX | Request for extension of the european patent (deleted) | |
| | 17Q | First examination report despatched | Effective date: 20121109 |
| | STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: EXAMINATION IS IN PROGRESS |
Free format text: STATUS: EXAMINATION IS IN PROGRESS |