EP1897032A1 - Procédé et unité d'évaluation d'images pour analyse de scène - Google Patents

Method and image evaluation unit for scene analysis (Procédé et unité d'évaluation d'images pour analyse de scène)

Info

Publication number
EP1897032A1
Authority
EP
European Patent Office
Prior art keywords
scene
change
local
optical sensor
determined
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP06741041A
Other languages
German (de)
English (en)
Inventor
Martin Litzenberger
Bernhard Kohn
Peter Schön
Michael HOFSTÄTTER
Nikolaus Donath
Christoph Posch
Nenad Milosevic
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
AIT Austrian Institute of Technology GmbH
Original Assignee
Austrian Research Centers GmbH ARC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Austrian Research Centers GmbH ARC filed Critical Austrian Research Centers GmbH ARC
Publication of EP1897032A1 (fr)
Legal status: Withdrawn (current)

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/40Analysis of texture
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements

Definitions

  • The invention relates to a method according to the characterizing part of claim 1 and to an image evaluation unit according to the preamble of patent claim 7.
  • The invention relates to the processing of information recorded by means of optical sensors.
  • The subject of the invention is a method based on a special optical semiconductor sensor with asynchronous, digital data transmission to a processing unit in which special algorithms for scene analysis are implemented.
  • The method provides selected scene-content information which is evaluated and can be used, for example, to control machines or installations.
  • The sensors used forward or output the preprocessed scene information asynchronously in the form of signals, and do so only if the scene undergoes changes or individual picture elements of the sensor detect certain features in the scene.
  • This principle considerably reduces the amount of data compared with an image representation and at the same time increases the information content of the data, since properties of the scene are already extracted.
  • Scene capture with conventional digital image processing relies on the evaluation of image information provided by an image sensor.
  • Usually the image is read out from the image sensor pixel by pixel, sequentially, at a predetermined clock rate (synchronously), many times per second, and the information about the scene contained in the data is evaluated. Because of the large amounts of data and the complex evaluation methods, this principle encounters the following difficulties even when correspondingly powerful processor systems are used:
  • Powerful processors have too high an energy consumption for many, especially mobile, applications; powerful processors require active cooling, so systems that use them cannot be built compactly enough for many applications; and powerful processors are too expensive for many applications.
  • FIG. 1 shows schematically the differences between the usual procedure and the procedure of the invention.
  • FIG. 2 shows a diagram of an image evaluation unit according to the invention.
  • FIGS. 3a and 3b, like FIGS. 4 and 5, show the procedure according to the invention schematically on the basis of recorded images.
  • The processing of the image signals of the optical sensor takes place in a specific manner, namely such that in the picture elements of the optical sensor the brightness information recorded by a photosensor is preprocessed by means of an analog electronic circuit.
  • The processing of the signals of several neighboring photosensors can be combined.
  • The output signals of the picture elements are transmitted via an interface of the sensor asynchronously to a digital data evaluation unit, in which a scene analysis is performed and the result of the evaluation is provided at an interface of the device (FIG. 1b).
  • A scene is imaged onto the image plane of the optical sensor 1 via an optical recording arrangement, not shown.
  • The visual information is captured by the picture elements of the sensor and continuously processed in electronic circuits in the picture elements. This processing recognizes certain features in the scene content in real time.
  • Features to be detected in the image content may include static edges, local intensity changes, optical flow, etc.
  • The detection of a feature is hereafter referred to as an "event".
  • Each time an event occurs, the picture element generates in real time a digital output on the asynchronous data bus containing the address of the pixel, and thus the coordinates in the image field at which the feature was detected. This datum will be referred to as an "Address Event" (AE).
  • Further properties of the feature, in particular the time of occurrence, are encoded in the data.
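  • As an illustration of what such a datum might carry, the following sketch decodes a hypothetical AE word into pixel coordinates, polarity, and a timestamp. The bit layout and field names are assumptions made for illustration only; the patent does not define a concrete bus encoding.

```python
from dataclasses import dataclass

# Hypothetical AE record; every field name and bit width below is an
# illustrative assumption, not the encoding defined by the patent.
@dataclass(frozen=True)
class AddressEvent:
    x: int            # pixel column (part of the pixel address)
    y: int            # pixel row (part of the pixel address)
    polarity: int     # +1 for an "on" event, -1 for an "off" event
    timestamp: float  # time of occurrence in seconds

def decode_ae(word: int) -> AddressEvent:
    """Decode an assumed 32-bit AE word: 1 polarity bit, 9 bits of y,
    9 bits of x, and 13 bits of coarse timestamp ticks."""
    polarity = 1 if (word >> 31) & 1 else -1
    y = (word >> 22) & 0x1FF
    x = (word >> 13) & 0x1FF
    timestamp = (word & 0x1FFF) * 1e-3  # assumed 1 ms tick
    return AddressEvent(x, y, polarity, timestamp)
```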
  • The sensor 1 sends this information as relevant data to the processing unit (CPU) via the asynchronous data channel.
  • A bus controller 2 prevents data collisions on the transmission channel.
  • The procedure according to the invention is based on the combination of the specially designed sensor, the data transmission, and the statistical-mathematical methods provided for data processing.
  • The sensor provided detects changes in light intensity and therefore responds, for example, to moving edges or light-dark boundary lines in a scene.
  • The sensor tracks the changes in the photocurrent of a photosensor in each picture element. These changes are summed for each pixel in an integrator. If the sum of the changes exceeds a threshold, the pixel immediately sends this event asynchronously over the data bus to the processing unit. After each event, the value of the integrator is cleared. Positive and negative changes of the photocurrent are processed separately and generate events of different polarity (so-called "on" and "off" events). The sensor used does not generate images in the conventional sense.
  • In the following, an AE frame is defined as the AEs stored in a buffer which were generated within a defined period of time.
  • An AE image is the representation of an AE frame as an image in which the polarity and frequency of events can be assigned colors or gray-scale values.
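  • A minimal sketch of this pixel principle and of rendering an AE image, assuming NumPy arrays for the photocurrents and an arbitrary threshold value (the patent fixes neither):

```python
import numpy as np

THRESHOLD = 0.1  # assumed integrator threshold; the patent gives no value

def generate_events(prev_photo, cur_photo, integrator):
    """Sum photocurrent changes per pixel in an integrator; where the sum
    crosses the threshold, emit an "on" (+1) or "off" (-1) event and
    clear that pixel's integrator, as described above."""
    integrator += cur_photo - prev_photo
    on = integrator > THRESHOLD
    off = integrator < -THRESHOLD
    events = [(x, y, +1) for y, x in zip(*np.nonzero(on))]
    events += [(x, y, -1) for y, x in zip(*np.nonzero(off))]
    integrator[on | off] = 0.0  # reset after each event
    return events, integrator

def ae_image(events, shape):
    """Render an AE frame (the events of one time window) as an image,
    mapping polarity to gray values around mid-gray."""
    img = np.full(shape, 0.5)
    for x, y, polarity in events:
        img[y, x] = 1.0 if polarity > 0 else 0.0
    return img
```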
  • FIG. 3 shows (a) a video image of a scene and (b) an AE image of the same scene produced by a sensor responsive to changes in light intensity.
  • The features from the scene are examined by means of statistical mathematical methods, and higher-level, abstract information about the scene content is obtained.
  • Such information can be, for example, the number of persons in a scene or the speed and distance of vehicles on a road.
  • A room occupancy counter can be realized by mounting the image sensor, for example, on the ceiling in the middle of a room.
  • The processing unit assigns the individual events to corresponding square areas in the image field which are approximately the size of a person.
  • Simple statistical methods and a correction mechanism allow an easy estimate of the area covered by moving objects, which is proportional to the number of people in the field of view of the sensor. The computational effort for determining the number of people is low, so this system can be implemented with simple and inexpensive microprocessors. If no people or objects move in the sensor's field of view, no events are generated and the microprocessor can switch to a power-saving mode, which significantly reduces system power consumption. This is not possible in state-of-the-art image processing systems, because the sensor image has to be processed and searched for people at all times.
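  • A rough sketch of such an area-based estimate; the cell size and the calibration factor relating covered cells to persons are invented here for illustration:

```python
import numpy as np

CELL = 16               # assumed side length of a square area (pixels)
CELLS_PER_PERSON = 4.0  # assumed calibration factor

def estimate_person_count(events, sensor_shape):
    """Assign events to square areas of the image field and estimate the
    person count from the area covered by moving objects."""
    h, w = sensor_shape
    rows = (h + CELL - 1) // CELL  # ceil division so edge pixels fit
    cols = (w + CELL - 1) // CELL
    grid = np.zeros((rows, cols), dtype=bool)
    for x, y, _polarity in events:
        grid[y // CELL, x // CELL] = True
    covered = grid.sum()               # area covered by moving objects
    return covered / CELLS_PER_PERSON  # proportional to the person count

# With no events the estimate is zero and the processor may idle in a
# power-saving mode, matching the behavior described above.
```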
  • For a passage counter, the image sensor is mounted above the door or other entrance or exit of a room.
  • There the persons are not distorted in perspective; as they pass through the observation area, the AEs are projected onto axes (e.g. vertical axes) and summed in a histogram (FIG. 4). If a person moves under the sensor through the door, one or more maxima running in the direction of movement can be detected in the histogram. By means of statistical weighting, the calculation of the maximum and of the direction of movement can be made robust against disturbances.
  • The index of the histogram bin containing the largest number of events is determined and compared with that of the previous AE frame.
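  • A sketch of this histogram comparison, under the assumptions that events are (x, y, polarity) tuples, that the projection is onto the vertical image axis, and that the direction labels are arbitrarily chosen:

```python
import numpy as np

def peak_index(events, n_bins, extent):
    """Project the AEs of one AE frame onto one axis (here: y) and
    return the index of the histogram bin with the most events."""
    ys = [y for _x, y, _polarity in events]
    hist, _ = np.histogram(ys, bins=n_bins, range=(0, extent))
    return int(np.argmax(hist))

def passage_direction(prev_events, cur_events, n_bins, extent):
    """Compare the peak index of the current AE frame with that of the
    previous one; a moving maximum indicates the walking direction."""
    prev_peak = peak_index(prev_events, n_bins, extent)
    cur_peak = peak_index(cur_events, n_bins, extent)
    if cur_peak > prev_peak:
        return "entering"  # label mapping is an assumption
    if cur_peak < prev_peak:
        return "leaving"
    return "undetermined"
```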
  • The processing unit is capable of segmenting and tracking in the data stream the AEs of people and vehicles near and on the pedestrian crossing (FIG. 5).
  • The system recognizes the size and speed of the objects and allows them to be categorized as pedestrians or vehicles. FIG. 5 shows a scene captured by the sensor at two points in time, the corresponding AE images, and the result of the mathematical-statistical evaluation, which recognizes the individual objects and determines their direction of movement.
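  • A sketch of such a size-and-speed categorization; the thresholds are invented for illustration, since the patent states no concrete values:

```python
# Assumed thresholds; the patent does not quantify "pedestrian-sized".
MAX_PEDESTRIAN_SIZE = 400     # segmented cluster size in pixels
MAX_PEDESTRIAN_SPEED = 10.0   # image-plane speed in pixels per AE frame

def categorize(cluster_size: int, speed: float) -> str:
    """Assign a tracked AE cluster to the pedestrian or vehicle category
    from its recognized size and speed."""
    if cluster_size <= MAX_PEDESTRIAN_SIZE and speed <= MAX_PEDESTRIAN_SPEED:
        return "pedestrian"
    return "vehicle"
```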
  • After a certain observation period, the system is able, through the use of learning methods based on statistical concepts, to recognize the location and orientation of roads, footpaths, and crossings. As a result, a warning can then be given of any pedestrian who is moving toward the crossing or along it.
  • Pedestrians moving, for example, on footpaths parallel to the roadway do not trigger a warning, owing to their detected direction of movement.
  • Systems with simple sensors are only able to detect the presence of persons in the vicinity of the crossing, but cannot detect their direction of movement, and thus cannot specifically warn of pedestrians moving directly toward the crossing.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Image Analysis (AREA)
  • Studio Devices (AREA)
  • Image Processing (AREA)

Abstract

The invention relates to a method for scene analysis in which the scene, or objects located in the scene, and an optical sensor execute a relative movement, and the scene information obtained is evaluated. According to the invention, the visual information of the scene is captured by the individual picture elements of the optical sensor; the pixel coordinates of the detected intensity changes are determined; the detected intensity changes are determined as a function of time; statistical methods are used to determine local accumulations of the intensity changes of the pixels; the local accumulations are evaluated by means of statistical methods to determine their number and/or position, as well as by means of methods for eliminating irrelevant data regions; the values determined are regarded as parameters of a region of a captured scene; at least one of the parameters is compared with a predefined parameter regarded as characteristic of an object; and, when the predefined comparison criteria are met, the evaluated local accumulation associated with the respective scene region is regarded as an image of that object.
EP06741041A 2005-06-15 2006-06-14 Procédé et unité d'évaluation d'images pour analyse de scène Withdrawn EP1897032A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
AT0101105A AT502551B1 (de) 2005-06-15 2005-06-15 Verfahren und bildauswertungseinheit zur szenenanalyse
PCT/AT2006/000245 WO2006133474A1 (fr) 2005-06-15 2006-06-14 Procédé et unité d'évaluation d'images pour analyse de scène

Publications (1)

Publication Number Publication Date
EP1897032A1 true EP1897032A1 (fr) 2008-03-12

Family

ID=36933426

Family Applications (1)

Application Number Title Priority Date Filing Date
EP06741041A Withdrawn EP1897032A1 (fr) 2005-06-15 2006-06-14 Procede et unite d'evaluation d'images pour analyse de scene

Country Status (8)

Country Link
US (1) US20080144961A1 (fr)
EP (1) EP1897032A1 (fr)
JP (1) JP2008547071A (fr)
KR (1) KR20080036016A (fr)
CN (1) CN101258512A (fr)
AT (1) AT502551B1 (fr)
CA (1) CA2610965A1 (fr)
WO (1) WO2006133474A1 (fr)

Families Citing this family (23)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8065197B2 (en) * 2007-03-06 2011-11-22 Portrait Innovations, Inc. System, method, and computer program product for evaluating photographic performance
US8103056B2 (en) * 2008-10-15 2012-01-24 Honeywell International Inc. Method for target geo-referencing using video analytics
DE102009005920A1 (de) * 2009-01-23 2010-07-29 Hella Kgaa Hueck & Co. Verfahren und Vorrichtung zum Steuern mindestens einer Lichtzeichenanlage eines Fußgängerüberwegs
US8452599B2 (en) * 2009-06-10 2013-05-28 Toyota Motor Engineering & Manufacturing North America, Inc. Method and system for extracting messages
CN101931789A (zh) * 2009-06-26 2010-12-29 上海宝康电子控制工程有限公司 关键区域中的高清晰人像自动记录及比对系统及其方法
US8269616B2 (en) * 2009-07-16 2012-09-18 Toyota Motor Engineering & Manufacturing North America, Inc. Method and system for detecting gaps between objects
US8744131B2 (en) * 2009-09-29 2014-06-03 Panasonic Corporation Pedestrian-crossing marking detecting method and pedestrian-crossing marking detecting device
US8337160B2 (en) * 2009-10-19 2012-12-25 Toyota Motor Engineering & Manufacturing North America, Inc. High efficiency turbine system
US8237792B2 (en) 2009-12-18 2012-08-07 Toyota Motor Engineering & Manufacturing North America, Inc. Method and system for describing and organizing image data
US8424621B2 (en) 2010-07-23 2013-04-23 Toyota Motor Engineering & Manufacturing North America, Inc. Omni traction wheel system and methods of operating the same
CN102739919A (zh) * 2011-04-14 2012-10-17 江苏中微凌云科技股份有限公司 动态监测的方法及设备
FR2985065B1 (fr) * 2011-12-21 2014-01-10 Univ Paris Curie Procede d'estimation de flot optique a partir d'un capteur asynchrone de lumiere
EP2720171B1 (fr) * 2012-10-12 2015-04-08 MVTec Software GmbH Reconnaissance et détermination de la pose d'objets en 3D dans des scènes multimodales
FR3020699A1 (fr) * 2014-04-30 2015-11-06 Centre Nat Rech Scient Procede de suivi de forme dans une scene observee par un capteur asynchrone de lumiere
CN106991418B (zh) * 2017-03-09 2020-08-04 上海小蚁科技有限公司 飞虫检测方法、装置及终端
KR102103521B1 (ko) 2018-01-12 2020-04-28 상명대학교산학협력단 인공지능 심층학습 기반의 영상물 인식 시스템 및 방법
KR102027878B1 (ko) 2018-01-25 2019-10-02 상명대학교산학협력단 딥러닝 기술과 이미지 특징 추출 기술을 결합한 영상물 내 미술품 인식 방법
JP2020053827A (ja) * 2018-09-27 2020-04-02 ソニーセミコンダクタソリューションズ株式会社 固体撮像素子、および、撮像装置
JP2022532014A (ja) * 2019-04-25 2022-07-13 プロフェシー エスエー 振動のイメージングおよび感知のためのシステムおよび方法
JP7393851B2 (ja) * 2019-05-31 2023-12-07 慎太朗 芝 撮像装置、撮像方法及びプログラム
CN111166366A (zh) * 2019-12-31 2020-05-19 杭州美诺瓦医疗科技股份有限公司 基于束光器光野的屏蔽装置、屏蔽方法及x光检查装置
KR102694598B1 (ko) 2021-12-07 2024-08-13 울산과학기술원 테스트 영상 특징 반영에 의한 영상 분석 개선 시스템 및 방법
US11558542B1 (en) * 2022-01-03 2023-01-17 Omnivision Technologies, Inc. Event-assisted autofocus methods and apparatus implementing the same

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0330269B1 (fr) * 1988-02-23 1993-09-22 Koninklijke Philips Electronics N.V. Procédé et dispositif pour évaluer le degré de mouvement d'un élément d'image d'une image de télévision
US5341439A (en) * 1989-09-21 1994-08-23 Hsu Shin Yi System for texture-based automatic detection of man-made objects in representations of sensed natural environmental scenes
JPH096957A (ja) * 1995-06-23 1997-01-10 Toshiba Corp 濃度画像の2値化方法および画像2値化装置
US5956424A (en) * 1996-12-23 1999-09-21 Esco Electronics Corporation Low false alarm rate detection for a video image processing based security alarm system
JP3521109B2 (ja) * 1997-02-17 2004-04-19 シャープ株式会社 動き検出用固体撮像装置
GB2368021A (en) * 2000-10-21 2002-04-24 Roy Sennett Mouth cavity irrigation device
US20020131643A1 (en) * 2001-03-13 2002-09-19 Fels Sol Sidney Local positioning system
US7327393B2 (en) * 2002-10-29 2008-02-05 Micron Technology, Inc. CMOS image sensor with variable conversion gain
US7796173B2 (en) * 2003-08-13 2010-09-14 Lettvin Jonathan D Imaging system
JP4193812B2 (ja) * 2005-05-13 2008-12-10 カシオ計算機株式会社 撮像装置、撮像方法及びそのプログラム
US7755672B2 (en) * 2006-05-15 2010-07-13 Zoran Corporation Techniques for modifying image field data obtained using illumination sources

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See references of WO2006133474A1 *

Also Published As

Publication number Publication date
US20080144961A1 (en) 2008-06-19
AT502551B1 (de) 2010-11-15
JP2008547071A (ja) 2008-12-25
KR20080036016A (ko) 2008-04-24
WO2006133474A1 (fr) 2006-12-21
CN101258512A (zh) 2008-09-03
AT502551A1 (de) 2007-04-15
CA2610965A1 (fr) 2006-12-21

Similar Documents

Publication Publication Date Title
AT502551B1 (de) Verfahren und bildauswertungseinheit zur szenenanalyse
US6985172B1 (en) Model-based incident detection system with motion classification
DE60020420T2 (de) Situationsdarstellungs-Anzeigesystem
Alpatov et al. Vehicle detection and counting system for real-time traffic surveillance
EP1854083B1 (fr) Camera servant a poursuivre des objets
DE102017217056A1 (de) Verfahren und Einrichtung zum Betreiben eines Fahrerassistenzsystems sowie Fahrerassistenzsystem und Kraftfahrzeug
Low et al. Simple robust road lane detection algorithm
DE102005026876B4 (de) Fahrzeugumgebungs-Überwachungsvorrichtung
DE602004011650T2 (de) Fahrassistenzsystem für ein Kraftfahrzeug
DE602005001627T2 (de) Vorrichtung zur Extraktion von Fussgängern
CN107122765B (zh) 一种高速公路服务区全景监控方法及系统
EP2973211A1 (fr) Analyse de flux vidéo
DE112013001424T5 (de) Objekterkennungsvorrichtung
DE10325762A1 (de) Bildverarbeitungssystem für ein Fahrzeug
EP2174260A2 (fr) Dispositif pour identifier et/ou classifier des modèles de mouvements dans une séquence d'images d'une scène de surveillance, procédé et programme informatique
DE102018212655A1 (de) Erkennung der Bewegungsabsicht eines Fußgängers aus Kamerabildern
EP3520023B1 (fr) Détection et validation d'objets provenant d'images séquentielles d'une caméra
CN115841651B (zh) 基于计算机视觉与深度学习的施工人员智能监测系统
DE19937928B4 (de) Einrichtung zum Erkennen eines beweglichen Körpers und Einrichtung zum Überwachen eines Kraftfahrzeugs
EP2483834B1 (fr) Methode et appareil pour la reconnaissance d'une detection fausse d'un objet dans un image
DE102009024066A1 (de) Verfahren und Vorrichtung zum Klassifizieren von Situationen
DE102019122015A1 (de) Verfahren zum betreiben eines automatischen bremssystems
EP2254104A2 (fr) Procédé de reconnaissance automatique d'une modification de situation
Lagorio et al. Automatic detection of adverse weather conditions in traffic scenes
JP3905774B2 (ja) パターン推定方法、パターン推定装置、パターン推定方法のプログラムおよびこのプログラムを記録した記録媒体

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20071231

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LI LT LU LV MC NL PL PT RO SE SI SK TR

AX Request for extension of the european patent

Extension state: HR

RAX Requested extension states of the european patent have changed

Extension state: HR

Payment date: 20071231

17Q First examination report despatched

Effective date: 20080401


STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20110103