EP2517149A2 - Device and method for monitoring video objects - Google Patents
Device and method for monitoring video objects
- Publication number
- EP2517149A2 (application EP10794955A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- tracking
- objects
- detection
- monitoring
- model
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/52—Surveillance or monitoring of activities, e.g. for recognising suspicious objects
Definitions
- the invention relates to a method for monitoring objects, in particular for monitoring scenes of video-detected objects, according to the preamble of claim 1.
- the invention relates to a computer program and a computer program product for carrying out the method according to claim 8 or claim 9.
- the invention relates to a device for monitoring objects, in particular for monitoring scenes of video-detected objects, according to the preamble of claim 10.
- the invention is based on a monitoring system, in particular a monitoring system for monitoring one surveillance area or several surveillance areas, surveillance cameras being directed onto the surveillance area or the surveillance areas.
- the video images recorded by the surveillance cameras are often forwarded to a central unit, for example a monitoring center, where they are evaluated by monitoring personnel or automatically. In other applications, the automatic evaluation is performed directly in the camera. Since such video surveillance systems often have a large number of surveillance cameras, e.g. 100 to 1000, the multitude of video images cannot be meaningfully monitored by a limited number of guards, so automated image analysis is gaining in importance.
- the invention proceeds from a method, a computer program, a computer program product and an apparatus for monitoring objects, in particular for monitoring scenes of video-detected objects as generically defined by the independent claims.
- the present invention relates to surveillance systems, in particular video surveillance systems with a method for video object tracking by a detector tracking system.
- Video surveillance systems are known in the art.
- a particular object such as a person or a car is tracked through a video sequence.
- the recognition and tracking of the object in a video sequence is generally realized by a two-part system.
- This consists partly of a detector or a detector device which locates the object, based on an object model, in an image or a restricted image area.
- the other part of the system is an object tracking module (tracking module, tracking device) which tracks the position of the object over time.
- the tracking module determines associated tracking parameters, such as a direction of movement or a movement speed, for the object to be monitored.
- object detectors are used, inter alia, in video surveillance systems and similar applications, such as face detection in autofocus digital cameras or person detection in vehicle collision-avoidance systems, to detect objects of a given object class, such as persons, faces or cars, in images.
- the object model is designed as an automatically learned classifier, which is trained on the recognition of images of this predetermined object class.
- monitoring systems such as video surveillance systems are known. These surveillance systems serve to monitor one or more surveillance areas, with surveillance cameras being directed at the surveillance area(s). The video images recorded by the surveillance cameras are forwarded, for example, to a central unit such as a monitoring center, where they are evaluated by monitoring personnel or automatically.
- for this purpose, configuration modules are used for the monitoring system, the monitoring system being designed to classify objects with object properties in a monitoring scene as monitoring objects on the basis of object property ranges. For comprehensive object monitoring with several objects to be monitored, for example moving in different directions at different speeds, correspondingly high computing power is required.
- the method according to the invention, the computer program according to the invention, the computer program product according to the invention and the device according to the invention, with the features of the corresponding main claim or independent claim, have the advantage that they realize improved object monitoring and require less computing power.
- a tracking device is fed back to a device for object model selection, so that during repeated detection the tracking parameters determined while tracking the object are fed to the selection device and can be taken into account for the detection.
- one or more object models are selected for detection on the basis of the fed-back tracking parameters, each describing a smaller model or variation range of the object and thus representing a more specific representation of the object.
- an object based on an object model with a narrower variation range can be detected more accurately and more quickly and can be tracked more easily and thus monitored.
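The fed-back selection described in the points above can be sketched as a minimal loop step. All names here (the model list, `select_models`, the direction strings) are illustrative assumptions, not terms from the patent:

```python
# Sketch: tracking parameters determined by the tracker are fed back to the
# model selection, which picks a narrower object model for the next detection.
MODELS = [
    {"name": "person-any",   "direction": None},     # wide variation range
    {"name": "person-right", "direction": "right"},  # narrower sub-model
    {"name": "person-left",  "direction": "left"},   # narrower sub-model
]

def select_models(tracked_direction):
    """Feedback step: choose the specific sub-model once the tracker knows
    the movement direction; fall back to the wide model otherwise."""
    specific = [m for m in MODELS if m["direction"] == tracked_direction]
    return specific or [m for m in MODELS if m["direction"] is None]
```

Before any tracking information exists, `select_models(None)` yields only the wide-variation model; once the tracker reports a direction, the matching narrower sub-model is used instead.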
- the detector device can be divided into a plurality of detector devices or detection modules, which have more specific models for object detection, which can then be used in a next monitoring step, for example a next image of a scene.
- a selection module determines, based on the detected tracking parameters and general scene parameters, which detection module is used for the detection of a specific object.
- the entire system is optionally designed as a learning or self-learning system in which the parameter ranges for the selection are optimized and readjusted with each repetition step of the method.
- the detector device can use a more specific object model, which leads to more robust recognition results as well as fewer misdetections (caused, for example, by a blurred model).
- the feedback of information from the long-term observation of the object helps to determine the appropriate object model. Simple testing of all object models would not increase the recognition results to an effective extent, as the higher overall variance of the models would also increase the number of misdetections.
- the advantage of the invention is thus to increase the recognition performance of the detector device. Furthermore, the more specific object models, that is, the narrower range models, can reduce computation time, as opposed to the more complex, more general models, that is, the wider range models.
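The computational argument above can be made concrete with a back-of-the-envelope estimate: testing every model over every image window multiplies the false-alarm opportunities, whereas feedback-based selection tests only the matching model. The rates and counts below are invented for illustration:

```python
def expected_false_alarms(per_model_fp_rate, n_models, n_windows):
    """Expected number of misdetections when n_models are each evaluated on
    n_windows image windows (a simplifying independence assumption)."""
    return per_model_fp_rate * n_models * n_windows

# Testing all 10 models everywhere vs. one selected model per object:
all_models = expected_false_alarms(1e-4, 10, 100_000)
selected = expected_false_alarms(1e-4, 1, 100_000)
```

Under these assumed numbers, exhaustively applying all models yields ten times the expected misdetections of the feedback-selected single model, which matches the patent's claim that simply testing all object models would raise the overall variance and the number of misdetections.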
- Possible tracking parameters by which a subclass or a submodel is determined for the objects are, for example, the velocity of the object, whereby a detector for moving objects can quickly discard static image content, or the orientation of the object.
- Possible scene parameters include the object density, which influences, for example, the number of expected objects per unit area; the expected object occlusion; the lighting situation, such as effects like fog, light sources and the like; and the scene setup, such as knowledge of where objects are obscured.
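One way to attach such tracking- and scene-parameter ranges to an object model is sketched below; the class and field names are assumptions chosen for illustration, not terminology from the patent:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class ObjectModel:
    """An object model with a tracking-parameter range (speed, directions)
    and a scene-parameter range (maximum expected object density)."""
    name: str
    speed_range: tuple       # tracking parameter: (min, max) speed
    directions: frozenset    # tracking parameter: admissible directions
    max_density: float       # scene parameter: expected objects per unit area

    def matches(self, speed, direction, density):
        """True if the object's parameters fall within this model's ranges."""
        lo, hi = self.speed_range
        return (lo <= speed <= hi
                and direction in self.directions
                and density <= self.max_density)
```

A narrow sub-model such as `ObjectModel("person-right", (0.5, 5.0), frozenset({"right"}), 0.2)` then accepts only objects whose fed-back parameters fall inside all three ranges.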
- the detection and/or the tracking are performed model-based, that is, they comprise a selection of at least one of the models. At least one of the predefined models is selected for detection and, analogously, for tracking. Preferably, the number of models is progressively reduced in the course of the monitoring. In a first step, at least one of the n models is no longer taken into account in the next step, so that then at most n-1 models are considered. This can be continued until only one model has to be considered.
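The stepwise reduction from n models to at most n-1 per repetition could look like the following sketch; the scoring function, which would rate how well each model matched the object in the last step, is an assumption:

```python
def prune_step(models, score):
    """One monitoring repetition: drop the worst-scoring model, so at most
    n-1 of the previous n models remain, but never go below one model."""
    if len(models) <= 1:
        return models
    return sorted(models, key=score, reverse=True)[:-1]
```

Applied repeatedly, for example with `score = {"wide": 0.2, "right": 0.9, "left": 0.5}.get`, the model set shrinks one model per step until only the best-matching model remains.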
- the models may be adjusted in terms of their tracking parameter range and / or scene parameter range.
- the selection takes place taking into account the tracking parameters and/or scene parameters. Especially at the beginning of monitoring there is little data about the objects to be monitored. Accordingly, the group boundaries for detecting the objects are initially defined broadly (blurred). The more specific a model and the sharper its boundaries, the higher the detection performance.
- the objects to be detected are compared for detection with the parameter ranges of the respective object models. If the tracking parameters and scene parameters of an object fall within a parameter range of an object model, then this object model is used to detect the object. In the next step, the detected object can be tracked. During tracking, tracking parameters are again recorded. These are used in a new detection to further facilitate detection. This is done in particular by a new selection of a suitable model with a small variation range and by changing the parameter ranges for the selection of the object models.
- the detection and / or the selection is carried out on the basis of predetermined models.
- specifying the model comprises predetermining models with different tracking parameter ranges and / or scene parameter ranges.
- Detecting and / or tracking includes selecting at least one of the models.
- the selection of a model is carried out on the basis of tracking parameters and / or scene parameters.
- the selection, at least at the beginning of a monitoring, comprises an adaptation of the parameter ranges, whereby the models used for the detection are specified with regard to the object to be monitored. Accordingly, the selecting and/or detecting comprises changing the tracking parameter ranges and/or scene parameter ranges.
- if an object cannot be assigned exactly to one model or sub-model, the detection is carried out on the basis of several models with different parameter ranges. For example, an object that moves diagonally to the top right is not reliably detected by the model for right-moving objects alone or by the model for upward-moving objects alone. Rather, the object falls into both parameter ranges, so both models are used for reliable detection.
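The diagonal-movement case can be illustrated with overlapping velocity ranges; the model names and range values below are invented for illustration:

```python
# Models with overlapping velocity ranges (illustrative values, px/frame).
MODELS = {
    "moves-right": {"vx": (0.5, 5.0), "vy": (-5.0, 5.0)},
    "moves-up":    {"vx": (-5.0, 5.0), "vy": (0.5, 5.0)},
}

def models_for(vx, vy):
    """Every model whose velocity range contains the object's velocity."""
    return sorted(name for name, r in MODELS.items()
                  if r["vx"][0] <= vx <= r["vx"][1]
                  and r["vy"][0] <= vy <= r["vy"][1])
```

A purely rightward velocity selects only the right-movement model, while a diagonal velocity falls inside both parameter ranges, so both models are used for detection.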
- the predetermined object models encompass both models with a large variation width and also subordinate models with a smaller variation range.
- a model with a wide range of variation may include the detection of persons in all views, whereas a subordinate model is specialized in the detection of persons with a particular walking direction. This can be further refined so that, for example, separate models with different leg positions or arm positions are given.
- the selection module will initially select an object model with a large variation range for the detection. If more accurate tracking parameters are known in a later step, a more specific object model with a smaller variation range can be selected, which also increases the detection speed.
- the method is preferably implemented as a computer program and can be distributed as a computer program product and used at different locations. In addition, the process can be easily retrofitted.
- the device according to the invention for monitoring objects, comprising at least one detector device for detecting the object and at least one tracking device for tracking the object, is characterized in that the tracking device has means for detecting tracking parameters and is fed back to the detector device, so that during repeated detection the tracking parameters are supplied to the detector device and taken into account for the detection.
- the device also includes sensors for imaging real objects as video objects, for example cameras.
- the device comprises a total of means which are necessary for carrying out the method according to the invention.
- FIGURE shows: schematically as a block diagram a device for monitoring objects.
- the figure shows schematically as a block diagram an apparatus 1 for monitoring objects, which implements a method according to the invention for monitoring objects.
- the device 1 comprises an imaging unit designed as a video system 3 with which an image of the real monitoring area is generated.
- the video system 3 generates different images 2 in a short time sequence which, joined together, result in a scene.
- the images 2, which are present as input images 4, are analyzed in a detector device 8.
- the detector device 8 comprises detection modules 6 and a model selection module 5 with which the detection modules 6, 6a, 6b, 6c (that is, the modules for detecting the objects, in short detection modules) are selected. In the process, existing objects are detected in the scene and various object parameters are recorded.
- an object is classified, that is, the object parameters are compared with the parameter ranges of the given object models and assigned to a suitable model or a group of models.
- the detection of the objects takes place in the detection modules 6, 6a, 6b, 6c in section 10. These detect an object based on a given model description and, after detecting the objects, generate a list of detected objects 11 that is passed to the tracking device 7.
- the tracking device 7 performs an object tracking 12 over time.
- tracking parameters or track information are recorded and updated (13).
- the list of objects 11, which has been forwarded by the detector device 8 to the tracking device 7, is supplemented by the tracking parameters. In this case, for example, a motion or object trajectory is assigned to each object of the object list 11.
- the data resulting from the tracking device 7 is fed back to the model selection 5 so that one or more object detectors 6, 6a, 6b, 6c are selected for detection.
- a plurality of detection modules 6, 6a, 6b, 6c are provided, which are selected on the basis of the feedback of the information obtained with the tracking device 7 and each have narrower parameter ranges than, for example, a single detection module 6 for all objects.
- the detection module 6a has, for example, a model for people walking straight ahead.
- the detection module 6b has a model for people going to the right.
- the detection module 6c has a model for people going to the left.
- a respective list 11a, 11b, 11c is created in the corresponding section with the objects that meet the criteria of the model or group.
- the parameter ranges used by the detection module 6, which was selected via the corresponding selection module 5, can be adapted to the respective situation on the basis of the tracking parameters and/or the scene parameters, resulting in an increasingly selective choice of the model to be used for detection. By narrowing down to a reduced object model, the performance of the device 1 increases.
- multiple detection modules 6a, 6b, 6c, each having a smaller variation width and each specialized for an object subclass of the original detector device, are used; this increases the monitoring effectiveness.
- the subclass selection is carried out by the model selection module 5, which is controlled by tracking parameters of the tracking device 7, also referred to as a tracking module.
- the device 1 is formed in the illustrated embodiment as a system for tracking persons in a video sequence. Unlike a prior-art system, which would have only one detector device 8 that responds to all possible types of persons, the system according to the invention has a detector device 8 with a main detection module 6 and a plurality of sub-detection modules 6a to 6c.
- the main detection module 6 has a model with a wide variation range. This is used in particular at the beginning of a monitoring, because at the beginning of a monitoring, a large amount of undefined objects is present in a scene.
- the detection modules 6a to 6c are provided which include models with smaller variation widths adapted to specific objects.
- the model of the main detection module 6 is thus subdivided into submodels of the sub detection modules 6a to 6c.
- a detection module 6 with a wide model range is subdivided into specific detection or detector modules 6a to 6c, for example for people walking straight ahead (6a), to the right (6b), and to the left (6c).
- a model selection module 5 selects the appropriate module 6a-6c or a group of modules that will best describe the object in the next image. Then only this particular detection module 6, 6a, 6b, 6c is used to recognize the object in the next image.
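The interplay of the model selection module 5 with the main detection module 6 and the sub-modules 6a to 6c, as described for the figure, might be sketched as follows. The class names mirror the reference numerals, but the matching logic is an assumed simplification, not the patent's actual classifiers:

```python
class DetectionModule:
    """A detection module holding a model of a given variation range."""
    def __init__(self, name, directions):
        self.name = name
        self.directions = directions  # admissible walking directions

    def detect(self, objects):
        """Return this module's object list (cf. lists 11a, 11b, 11c)."""
        return [o for o in objects if o["direction"] in self.directions]

# Main module 6 (wide range) and sub-modules 6a-6c (narrow ranges).
MODULE_6 = DetectionModule("6", {"straight", "right", "left"})
MODULE_6A = DetectionModule("6a", {"straight"})
MODULE_6B = DetectionModule("6b", {"right"})
MODULE_6C = DetectionModule("6c", {"left"})

def model_selection_5(tracked_direction):
    """Module 5: pick the sub-module matching the fed-back track, else 6."""
    return {"straight": MODULE_6A, "right": MODULE_6B,
            "left": MODULE_6C}.get(tracked_direction, MODULE_6)
```

With no track information, `model_selection_5(None)` returns the wide main module 6; once the tracking device 7 reports a direction, the narrow sub-module is used for the next image.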
- the detection can also be carried out with a plurality of detection modules 6, 6a, 6b, 6c, for example when a plurality of detection modules apply to the object to be monitored, for example during a diagonal movement.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Multimedia (AREA)
- Theoretical Computer Science (AREA)
- Image Analysis (AREA)
- Closed-Circuit Television Systems (AREA)
Abstract
Description
Claims
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE102009055127A DE102009055127A1 (en) | 2009-12-22 | 2009-12-22 | Apparatus and method for monitoring video objects |
PCT/EP2010/069566 WO2011076609A2 (en) | 2009-12-22 | 2010-12-14 | Device and method for monitoring video objects |
Publications (1)
Publication Number | Publication Date |
---|---|
EP2517149A2 true EP2517149A2 (en) | 2012-10-31 |
Family
ID=44196185
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP10794955A Ceased EP2517149A2 (en) | 2009-12-22 | 2010-12-14 | Device and method for monitoring video objects |
Country Status (5)
Country | Link |
---|---|
US (1) | US8977001B2 (en) |
EP (1) | EP2517149A2 (en) |
CN (1) | CN102713933B (en) |
DE (1) | DE102009055127A1 (en) |
WO (1) | WO2011076609A2 (en) |
Families Citing this family (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9378576B2 (en) | 2013-06-07 | 2016-06-28 | Faceshift Ag | Online modeling for real-time facial animation |
CN105208343B (en) * | 2015-09-25 | 2016-09-07 | 珠海安联锐视科技股份有限公司 | Can be used for intelligent monitor system and the method for video monitoring equipment |
US10791285B2 (en) * | 2015-10-05 | 2020-09-29 | Woncheol Choi | Virtual flying camera system |
US10063790B2 (en) * | 2015-10-05 | 2018-08-28 | Woncheol Choi | Virtual flying camera system |
JP6942472B2 (en) * | 2017-01-13 | 2021-09-29 | キヤノン株式会社 | Video recognition device, video recognition method and program |
DE102018201909A1 (en) | 2018-02-07 | 2019-08-08 | Robert Bosch Gmbh | Method and device for object recognition |
DE102018201914A1 (en) | 2018-02-07 | 2019-08-08 | Robert Bosch Gmbh | A method of teaching a person recognition model using images from a camera and method of recognizing people from a learned model for person recognition by a second camera of a camera network |
JP7208713B2 (en) * | 2018-02-13 | 2023-01-19 | キヤノン株式会社 | Image analysis device and image analysis method |
CN111310526B (en) * | 2018-12-12 | 2023-10-20 | 杭州海康威视数字技术股份有限公司 | Parameter determination method and device for target tracking model and storage medium |
DE102022200831A1 (en) * | 2022-01-26 | 2023-07-27 | Robert Bosch Gesellschaft mit beschränkter Haftung | Surveillance device and method for image-based surveillance of a surveillance area, computer program and storage medium |
DE102022200834A1 (en) | 2022-01-26 | 2023-07-27 | Robert Bosch Gesellschaft mit beschränkter Haftung | Device arrangement, client-server system, procedures, computer programs and storage medium |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20080231709A1 (en) * | 2007-03-20 | 2008-09-25 | Brown Lisa M | System and method for managing the interaction of object detection and tracking systems in video surveillance |
US20090296989A1 (en) * | 2008-06-03 | 2009-12-03 | Siemens Corporate Research, Inc. | Method for Automatic Detection and Tracking of Multiple Objects |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP1862969A1 (en) * | 2006-06-02 | 2007-12-05 | Eidgenössische Technische Hochschule Zürich | Method and system for generating a representation of a dynamically changing 3D scene |
JP4764273B2 (en) * | 2006-06-30 | 2011-08-31 | キヤノン株式会社 | Image processing apparatus, image processing method, program, and storage medium |
DE102007058959A1 (en) | 2007-12-07 | 2009-06-10 | Robert Bosch Gmbh | Configuration module for a monitoring system, monitoring system, method for configuring the monitoring system and computer program |
WO2009111499A2 (en) * | 2008-03-03 | 2009-09-11 | Videoiq, Inc. | Dynamic object classification |
- 2009
  - 2009-12-22 DE DE102009055127A patent/DE102009055127A1/en not_active Withdrawn
- 2010
  - 2010-12-14 EP EP10794955A patent/EP2517149A2/en not_active Ceased
  - 2010-12-14 CN CN201080058504.9A patent/CN102713933B/en active Active
  - 2010-12-14 WO PCT/EP2010/069566 patent/WO2011076609A2/en active Application Filing
  - 2010-12-14 US US13/518,580 patent/US8977001B2/en active Active
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20080231709A1 (en) * | 2007-03-20 | 2008-09-25 | Brown Lisa M | System and method for managing the interaction of object detection and tracking systems in video surveillance |
US20090296989A1 (en) * | 2008-06-03 | 2009-12-03 | Siemens Corporate Research, Inc. | Method for Automatic Detection and Tracking of Multiple Objects |
Non-Patent Citations (2)
Title |
---|
See also references of WO2011076609A2 * |
VIVEK K SINGH ET AL: "Coopetitive multi-camera surveillance using model predictive control", 25 July 2007, MACHINE VISION AND APPLICATIONS, SPRINGER, BERLIN, DE, PAGE(S) 375 - 393, ISSN: 1432-1769, XP019651730 * |
Also Published As
Publication number | Publication date |
---|---|
WO2011076609A2 (en) | 2011-06-30 |
DE102009055127A1 (en) | 2011-06-30 |
US8977001B2 (en) | 2015-03-10 |
CN102713933A (en) | 2012-10-03 |
WO2011076609A3 (en) | 2011-10-13 |
CN102713933B (en) | 2016-08-17 |
US20120328153A1 (en) | 2012-12-27 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2011076609A2 (en) | Device and method for monitoring video objects | |
DE112014007249T5 (en) | Image processing apparatus, vehicle display system, display apparatus, image processing method and image processing program | |
DE602005003926T2 (en) | SYSTEM AND METHOD FOR RECOGNIZING AN INDICATIVE VEHICLE FROM DYNAMIC BACKGROUND USING ROBUST INFORMATION FUSION | |
DE102009015142B4 (en) | Vehicle surroundings recognition device and control system for tracking a preceding vehicle | |
DE102017217056A1 (en) | Method and device for operating a driver assistance system and driver assistance system and motor vehicle | |
WO2018177484A1 (en) | Method and system for predicting sensor signals from a vehicle | |
DE102005034597A1 (en) | Method and device for generating a depth map | |
WO2009003793A2 (en) | Device for identifying and/or classifying movement patterns in an image sequence of a surveillance scene, method and computer program | |
WO2020025091A1 (en) | Detecting the movement intention of a pedestrian on the basis of camera images | |
EP2034461A2 (en) | Method for detecting and/or tracking moved objects in a monitoring zone with stoppers, device and computer program | |
DE102006053286A1 (en) | Method for detecting movement-sensitive image areas, apparatus and computer program for carrying out the method | |
WO2018215242A2 (en) | Method for determining a driving instruction | |
WO2012110654A1 (en) | Method for evaluating a plurality of time-offset pictures, device for evaluating pictures, and monitoring system | |
EP3655299B1 (en) | Method and device for determining an optical flow on the basis of an image sequence captured by a camera of a vehicle | |
DE102016223106A1 (en) | Method and system for detecting a raised object located within a parking lot | |
DE102016223094A1 (en) | Method and system for detecting a raised object located within a parking lot | |
EP2483834B1 (en) | Method and apparatus for the recognition of a false object detection in an image | |
WO2010139495A1 (en) | Method and apparatus for classifying situations | |
DE102010003669A1 (en) | Method and device for locating persons in a given area | |
DE102014219829A1 (en) | Smoke detection device, method for detecting smoke and computer program | |
EP3576013A1 (en) | Estimation of a path of a rail path | |
DE102019204187A1 (en) | Classification and temporal recognition of tactical driving maneuvers by road users | |
DE102017207958B4 (en) | A method of generating training data for a machine learning based pattern recognition method for a motor vehicle, motor vehicle, method of operating a computing device, and system | |
DE102019118607A1 (en) | ANOMALY DETECTOR FOR VEHICLE CONTROL SIGNALS | |
DE102018201914A1 (en) | A method of teaching a person recognition model using images from a camera and method of recognizing people from a learned model for person recognition by a second camera of a camera network |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase |
Free format text: ORIGINAL CODE: 0009012 |
|
17P | Request for examination filed |
Effective date: 20120723 |
|
AK | Designated contracting states |
Kind code of ref document: A2 Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
DAX | Request for extension of the european patent (deleted) | ||
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: EXAMINATION IS IN PROGRESS |
|
17Q | First examination report despatched |
Effective date: 20180518 |
|
RAP1 | Party data changed (applicant data changed or rights of an application transferred) |
Owner name: ROBERT BOSCH GMBH |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: EXAMINATION IS IN PROGRESS |
|
REG | Reference to a national code |
Ref country code: DE Ref legal event code: R003 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE APPLICATION HAS BEEN REFUSED |
|
18R | Application refused |
Effective date: 20210327 |