EP1897032A1 - Method and image evaluation unit for scene analysis - Google Patents
Method and image evaluation unit for scene analysis
- Publication number
- EP1897032A1 (application EP06741041A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- scene
- change
- local
- optical sensor
- determined
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/40—Analysis of texture
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
Definitions
- The invention relates to a method according to the characterizing part of claim 1 and to an image evaluation unit according to the preamble of patent claim 7.
- The invention relates to the processing of information recorded by means of optical sensors.
- The subject of the invention is a method based on a special optical semiconductor sensor with asynchronous, digital data transmission to a processing unit in which special algorithms for scene analysis are implemented.
- The method provides selected scene content information which is evaluated and can be used, for example, to control machines or installations.
- The sensors used forward the preprocessed scene information asynchronously in the form of signals, and only when the scene undergoes changes or individual picture elements of the sensor detect certain features in the scene.
- This principle considerably reduces the amount of data required compared with a conventional image representation and at the same time increases the information content of the data, since properties of the scene have already been extracted.
- Scene capture with conventional digital image processing relies on the evaluation of image information provided by an image sensor.
- Usually the image is read out from the image sensor pixel by pixel, sequentially and at a predetermined clock rate (synchronously), many times per second, and the information about the scene contained in the data is then evaluated. Due to the large amounts of data and the complex evaluation methods, this principle encounters the following difficulties even when correspondingly powerful processor systems are used:
- Powerful processors have too high an energy consumption for many, especially mobile, applications.
- Powerful processors require active cooling. Systems that use such processors therefore cannot be built compactly enough for many applications.
- Powerful processors are too expensive for many applications.
- FIG. 1 shows schematically differences between the usual procedure and the procedure of the invention.
- FIG. 2 shows a diagram of an image evaluation unit according to the invention.
- FIGS. 3a and 3b, like FIGS. 4 and 5, show the procedure according to the invention schematically on the basis of recorded images.
- The processing of the image signals of the optical sensor takes place in a specific manner: in the pixels of the optical sensor, the brightness information recorded by a photosensor is preprocessed by means of an analog electronic circuit.
- The processing of the signals of several neighboring photosensors can be combined.
- The output signals of the picture elements are transmitted via an interface of the sensor asynchronously to a digital data evaluation unit, in which a scene analysis is performed and the result of the evaluation is provided at an interface of the device (FIG. 1b).
- A scene is imaged onto the image plane of the optical sensor 1 via an optical recording arrangement, not shown.
- The visual information is captured by the picture elements of the sensor and continuously processed in electronic circuits in the picture elements. This processing recognizes certain features in the scene content in real time.
- Features to be detected in the image content may include static edges, local intensity changes, optical flow, etc.
- The detection of a feature is hereafter referred to as an "event.”
- Each time an event occurs, the pixel generates in real time a digital output on the asynchronous data bus containing the address of the pixel and thus the coordinates in the image field where the feature was detected. This datum will be referred to as the "Address Event” (AE).
- Further properties of the feature, in particular the time of occurrence, are encoded in the data.
- The sensor 1 sends this information to the processing unit CPU as relevant data via the asynchronous data channel.
- A bus controller 2 prevents data collisions on the transmission channel.
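To make the address-event data concrete, the following is a minimal sketch of how such an AE could be represented on the receiving side. The field names, the microsecond timestamp and the bit layout in the decoder are illustrative assumptions, not the encoding of the sensor described here.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class AddressEvent:
    """One address event (AE): pixel coordinates, polarity and time of occurrence."""
    x: int             # pixel column in the image field
    y: int             # pixel row in the image field
    polarity: int      # +1 for an "on" event, -1 for an "off" event
    timestamp_us: int  # time of occurrence, here in microseconds (assumed unit)

def decode_ae(word: int, timestamp_us: int) -> AddressEvent:
    """Hypothetical decoder for a packed AE word received from the asynchronous bus.
    Assumed layout: bits 0-8 = x, bits 9-17 = y, bit 18 = polarity flag."""
    x = word & 0x1FF
    y = (word >> 9) & 0x1FF
    polarity = 1 if (word >> 18) & 0x1 else -1
    return AddressEvent(x=x, y=y, polarity=polarity, timestamp_us=timestamp_us)
```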
- The procedure according to the invention is based on the combination of the specially designed sensor, the data transmission and the statistical-mathematical methods provided for data processing.
- The intended sensor detects changes in light intensity and therefore responds, for example, to moving edges or light-dark boundaries in a scene.
- The sensor tracks the changes in the photocurrent of a photosensor in each pixel. These changes are summed for each pixel in an integrator. If the sum of the changes exceeds a threshold, the pixel immediately sends this event asynchronously over the data bus to the processing unit. After each event, the value of the integrator is cleared. Positive and negative changes of the photocurrent are processed separately and generate events of different polarity (so-called "on” and "off” events). The sensor used does not generate images in the conventional sense.
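The per-pixel behaviour just described (summing photocurrent changes in an integrator, thresholding, separate "on"/"off" polarities, clearing the integrator after each event) can be modelled in software roughly as follows. The logarithmic intensity response and the numeric threshold are assumptions for illustration only.

```python
import math

class ChangeDetectingPixel:
    """Software model of one pixel: sums changes of the (log) photocurrent in an
    integrator and emits an event when the accumulated change crosses a threshold."""

    def __init__(self, threshold: float = 0.15):
        self.threshold = threshold  # contrast threshold (illustrative value)
        self.reference = None       # log intensity at the time of the last event

    def update(self, intensity: float):
        """Feed a new intensity sample (> 0); return +1 ("on"), -1 ("off") or None."""
        log_i = math.log(intensity)
        if self.reference is None:
            self.reference = log_i
            return None
        delta = log_i - self.reference   # accumulated change since the last event
        if delta >= self.threshold:
            self.reference = log_i       # integrator cleared after each event
            return +1                    # "on" event: intensity increased
        if delta <= -self.threshold:
            self.reference = log_i
            return -1                    # "off" event: intensity decreased
        return None
```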
- An AE frame is defined as the AEs stored in a buffer which have been generated within a defined period of time.
- An AE image is the representation of an AE frame as an image in which the polarity and frequency of events can be assigned to colors or gray-scale values.
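Based on these definitions, a minimal sketch of forming an AE frame from a buffered event stream and rendering it as an AE image, reusing the AddressEvent sketch above. The gray-level mapping (mid-gray background, brighter for "on" events, darker for "off" events) is one possible choice, not prescribed by the description.

```python
import numpy as np

def build_ae_frame(events, t_start_us, window_us):
    """AE frame: the AEs from the buffer that were generated within the time window."""
    t_end = t_start_us + window_us
    return [e for e in events if t_start_us <= e.timestamp_us < t_end]

def render_ae_image(ae_frame, width, height):
    """AE image: polarity and frequency of events mapped to gray-scale values
    (mid-gray background; "on" events brighten a pixel, "off" events darken it)."""
    img = np.full((height, width), 128, dtype=np.int32)
    for e in ae_frame:
        img[e.y, e.x] += 16 * e.polarity  # repeated events shift the gray value further
    return np.clip(img, 0, 255).astype(np.uint8)
```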
- Fig. 3 shows (a) a video image of a scene and (b) an AE image of the same scene produced by a sensor responsive to changes in light intensity.
- The features from the scene are examined by means of statistical mathematical methods, and higher-level, abstract information about the scene content is obtained.
- Such information can be, for example, the number of persons in a scene or the speed and distance of vehicles on a road.
- A room occupancy counter can be realized by mounting the image sensor, for example, on the ceiling in the middle of a room.
- The processing unit assigns the individual events to corresponding square areas of the image field which are approximately the size of a person.
- Simple statistical methods and a correction mechanism allow easy estimation of the area covered by moving objects. This is proportional to the number of people in the field of view of the sensor. The calculation effort for the number of people is low, so that this system can be implemented with simple and inexpensive microprocessors. If no people or objects move in the sensor's field of view, no events are generated and the microprocessor can switch to a power-saving mode, which significantly reduces system power consumption. This is not possible in state-of-the-art image processing systems because the sensor image has to be processed and searched for people at all times.
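One possible reading of this occupancy estimate as code, again reusing the AddressEvent sketch above. The cell size, the per-cell event threshold used as a simple correction against noise, and the persons-per-cell factor are hypothetical calibration parameters.

```python
from collections import Counter

def estimate_person_count(ae_frame, cell_px=40, min_events_per_cell=20,
                          persons_per_active_cell=0.5):
    """Assign the events of one AE frame to square, roughly person-sized areas of the
    image field and estimate the number of people from the covered area.
    All numeric parameters are illustrative and would need calibration."""
    events_per_cell = Counter((e.x // cell_px, e.y // cell_px) for e in ae_frame)
    # Noise correction: a cell counts as covered only above a minimum event count.
    active_cells = sum(1 for n in events_per_cell.values() if n >= min_events_per_cell)
    # The covered area is taken to be proportional to the number of people in view.
    return round(active_cells * persons_per_active_cell)
```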
- The image sensor is mounted above the door or another entrance or exit of a room.
- In this position the persons are not distorted in perspective; as they pass through the observation area, the AEs are projected onto axes (e.g. vertical axes) and summed up in a histogram (FIG. 4). If a person moves under the sensor through the door, one or more maxima 1 moving in the direction of movement can be detected in the histogram. By means of statistical weighting, the calculation of the maximum and of the direction of movement can be made robust against disturbances.
- The index of the histogram bin containing the largest number of events is determined and compared with the corresponding index of the previous AE frame.
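The projection and histogram comparison could look roughly like the sketch below: projecting the events onto one image axis and comparing the peak position of consecutive AE frames is one simple way to obtain the direction of movement. The bin size, the activity threshold and the "in"/"out" labels are assumed values.

```python
import numpy as np

def axis_projection(ae_frame, extent_px, bin_px=4):
    """Project the AEs of one frame onto one image axis (here the vertical axis):
    count events per row bin, giving a histogram as in FIG. 4."""
    hist = np.zeros(extent_px // bin_px + 1, dtype=np.int64)
    for e in ae_frame:
        hist[e.y // bin_px] += 1
    return hist

def movement_direction(prev_hist, curr_hist, min_events=30):
    """Compare the index of the largest maximum between two consecutive AE frames.
    Returns 'in', 'out' or None (too little activity or no displacement of the peak)."""
    if prev_hist.sum() < min_events or curr_hist.sum() < min_events:
        return None
    shift = int(np.argmax(curr_hist)) - int(np.argmax(prev_hist))
    if shift > 0:
        return "in"    # peak moved toward higher bin indices
    if shift < 0:
        return "out"   # peak moved toward lower bin indices
    return None
```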
- The processing unit is capable of segmenting and tracking, in the data stream, the AEs of people and vehicles near and on the pedestrian crossing (FIG. 5).
- The system recognizes the size and speed of the objects and allows them to be categorized as pedestrians or vehicles. FIG. 5 shows a scene captured by the sensor at two points in time, the corresponding AE images and the result of the mathematical-statistical evaluation, which recognizes the individual objects and determines their direction of movement.
- After a certain period of observation, it is possible for the system to recognize the location and orientation of roads, footpaths and walkways through the use of learning methods based on statistical concepts. As a result, a warning can then be given for any pedestrian who is moving towards the pedestrian crossing or along it.
- Pedestrians who move, for example, on footpaths parallel to the roadway do not trigger a warning because of their detected direction of movement.
- Systems with simple sensors are only able to detect the presence of persons in the vicinity of the pedestrian crossing; they cannot detect the direction of movement and thus cannot give a specific warning of pedestrians moving directly towards the crossing.
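To illustrate the size/speed categorization and the direction-dependent warning described above, a sketch with a hypothetical tracked-object representation and illustrative thresholds; the real segmentation and tracking step that produces such objects from the AE stream is not shown.

```python
from dataclasses import dataclass

@dataclass
class TrackedObject:
    """Hypothetical result of segmenting and tracking one cluster of AEs."""
    width_m: float      # estimated object width in metres
    speed_mps: float    # estimated speed in metres per second
    heading_deg: float  # direction of movement; 0 deg = straight toward the crossing

def categorize(obj: TrackedObject) -> str:
    """Rough pedestrian/vehicle split by size and speed (illustrative thresholds)."""
    if obj.width_m < 1.2 and obj.speed_mps < 3.0:
        return "pedestrian"
    return "vehicle"

def should_warn(obj: TrackedObject, heading_tolerance_deg: float = 30.0) -> bool:
    """Warn only for pedestrians whose detected direction of movement points toward
    the crossing; pedestrians walking parallel to the roadway are ignored."""
    return (categorize(obj) == "pedestrian"
            and abs(obj.heading_deg) <= heading_tolerance_deg)
```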
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Multimedia (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Image Analysis (AREA)
- Image Processing (AREA)
- Studio Devices (AREA)
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
AT0101105A AT502551B1 (en) | 2005-06-15 | 2005-06-15 | METHOD AND PICTURE EVALUATION UNIT FOR SCENE ANALYSIS |
PCT/AT2006/000245 WO2006133474A1 (en) | 2005-06-15 | 2006-06-14 | Method and image evaluation unit for scene analysis |
Publications (1)
Publication Number | Publication Date |
---|---|
EP1897032A1 true EP1897032A1 (en) | 2008-03-12 |
Family
ID=36933426
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP06741041A Withdrawn EP1897032A1 (en) | 2005-06-15 | 2006-06-14 | Method and image evaluation unit for scene analysis |
Country Status (8)
Country | Link |
---|---|
US (1) | US20080144961A1 (en) |
EP (1) | EP1897032A1 (en) |
JP (1) | JP2008547071A (en) |
KR (1) | KR20080036016A (en) |
CN (1) | CN101258512A (en) |
AT (1) | AT502551B1 (en) |
CA (1) | CA2610965A1 (en) |
WO (1) | WO2006133474A1 (en) |
Families Citing this family (22)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8065197B2 (en) * | 2007-03-06 | 2011-11-22 | Portrait Innovations, Inc. | System, method, and computer program product for evaluating photographic performance |
US8103056B2 (en) * | 2008-10-15 | 2012-01-24 | Honeywell International Inc. | Method for target geo-referencing using video analytics |
DE102009005920A1 (en) * | 2009-01-23 | 2010-07-29 | Hella Kgaa Hueck & Co. | Method and device for controlling at least one traffic light system of a pedestrian crossing |
US8452599B2 (en) * | 2009-06-10 | 2013-05-28 | Toyota Motor Engineering & Manufacturing North America, Inc. | Method and system for extracting messages |
CN101931789A (en) * | 2009-06-26 | 2010-12-29 | 上海宝康电子控制工程有限公司 | High-resolution human figure automatic recording and comparing system and method in key region |
US8269616B2 (en) * | 2009-07-16 | 2012-09-18 | Toyota Motor Engineering & Manufacturing North America, Inc. | Method and system for detecting gaps between objects |
WO2011039977A1 (en) * | 2009-09-29 | 2011-04-07 | パナソニック株式会社 | Pedestrian-crossing marking detecting method and pedestrian-crossing marking detecting device |
US8337160B2 (en) * | 2009-10-19 | 2012-12-25 | Toyota Motor Engineering & Manufacturing North America, Inc. | High efficiency turbine system |
US8237792B2 (en) | 2009-12-18 | 2012-08-07 | Toyota Motor Engineering & Manufacturing North America, Inc. | Method and system for describing and organizing image data |
US8424621B2 (en) | 2010-07-23 | 2013-04-23 | Toyota Motor Engineering & Manufacturing North America, Inc. | Omni traction wheel system and methods of operating the same |
CN102739919A (en) * | 2011-04-14 | 2012-10-17 | 江苏中微凌云科技股份有限公司 | Method and equipment for dynamic monitoring |
FR2985065B1 (en) * | 2011-12-21 | 2014-01-10 | Univ Paris Curie | OPTICAL FLOAT ESTIMATING METHOD FROM LIGHT ASYNCHRONOUS SENSOR |
EP2720171B1 (en) * | 2012-10-12 | 2015-04-08 | MVTec Software GmbH | Recognition and pose determination of 3D objects in multimodal scenes |
FR3020699A1 (en) * | 2014-04-30 | 2015-11-06 | Centre Nat Rech Scient | METHOD OF FOLLOWING SHAPE IN A SCENE OBSERVED BY AN ASYNCHRONOUS LIGHT SENSOR |
CN106991418B (en) * | 2017-03-09 | 2020-08-04 | 上海小蚁科技有限公司 | Winged insect detection method and device and terminal |
KR102103521B1 (en) | 2018-01-12 | 2020-04-28 | 상명대학교산학협력단 | Artificial intelligence deep-learning based video object recognition system and method |
KR102027878B1 (en) | 2018-01-25 | 2019-10-02 | 상명대학교산학협력단 | Method for recognizing art objects in video combining deep learning technology and image feature extraction technology |
JP2020053827A (en) * | 2018-09-27 | 2020-04-02 | ソニーセミコンダクタソリューションズ株式会社 | Solid-state imaging element and imaging apparatus |
JP2022532014A (en) * | 2019-04-25 | 2022-07-13 | プロフェシー エスエー | Systems and methods for vibration imaging and sensing |
JP7393851B2 (en) * | 2019-05-31 | 2023-12-07 | 慎太朗 芝 | Imaging device, imaging method and program |
KR20230085509A (en) | 2021-12-07 | 2023-06-14 | 울산과학기술원 | System and method of improving predictions of images by adapting features of test images |
US11558542B1 (en) * | 2022-01-03 | 2023-01-17 | Omnivision Technologies, Inc. | Event-assisted autofocus methods and apparatus implementing the same |
Family Cites Families (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE68909271T2 (en) * | 1988-02-23 | 1994-03-24 | Philips Nv | Method and arrangement for estimating the extent of movement in a picture element of a television picture. |
US5341439A (en) * | 1989-09-21 | 1994-08-23 | Hsu Shin Yi | System for texture-based automatic detection of man-made objects in representations of sensed natural environmental scenes |
JPH096957A (en) * | 1995-06-23 | 1997-01-10 | Toshiba Corp | Binarization method for density image and image binarization device |
US5956424A (en) * | 1996-12-23 | 1999-09-21 | Esco Electronics Corporation | Low false alarm rate detection for a video image processing based security alarm system |
JP3521109B2 (en) * | 1997-02-17 | 2004-04-19 | シャープ株式会社 | Solid-state imaging device for motion detection |
GB2368021A (en) * | 2000-10-21 | 2002-04-24 | Roy Sennett | Mouth cavity irrigation device |
US20020131643A1 (en) * | 2001-03-13 | 2002-09-19 | Fels Sol Sidney | Local positioning system |
US7327393B2 (en) * | 2002-10-29 | 2008-02-05 | Micron Technology, Inc. | CMOS image sensor with variable conversion gain |
US7796173B2 (en) * | 2003-08-13 | 2010-09-14 | Lettvin Jonathan D | Imaging system |
JP4193812B2 (en) * | 2005-05-13 | 2008-12-10 | カシオ計算機株式会社 | Imaging apparatus, imaging method, and program thereof |
US7755672B2 (en) * | 2006-05-15 | 2010-07-13 | Zoran Corporation | Techniques for modifying image field data obtained using illumination sources |
-
2005
- 2005-06-15 AT AT0101105A patent/AT502551B1/en not_active IP Right Cessation
-
2006
- 2006-06-14 CA CA002610965A patent/CA2610965A1/en not_active Abandoned
- 2006-06-14 CN CNA2006800212545A patent/CN101258512A/en active Pending
- 2006-06-14 WO PCT/AT2006/000245 patent/WO2006133474A1/en active Application Filing
- 2006-06-14 EP EP06741041A patent/EP1897032A1/en not_active Withdrawn
- 2006-06-14 JP JP2008516063A patent/JP2008547071A/en not_active Withdrawn
- 2006-06-14 KR KR1020077030584A patent/KR20080036016A/en not_active Application Discontinuation
-
2007
- 2007-12-17 US US11/957,709 patent/US20080144961A1/en not_active Abandoned
Non-Patent Citations (1)
Title |
---|
See references of WO2006133474A1 * |
Also Published As
Publication number | Publication date |
---|---|
CN101258512A (en) | 2008-09-03 |
KR20080036016A (en) | 2008-04-24 |
WO2006133474A1 (en) | 2006-12-21 |
AT502551A1 (en) | 2007-04-15 |
AT502551B1 (en) | 2010-11-15 |
CA2610965A1 (en) | 2006-12-21 |
US20080144961A1 (en) | 2008-06-19 |
JP2008547071A (en) | 2008-12-25 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
AT502551B1 (en) | METHOD AND PICTURE EVALUATION UNIT FOR SCENE ANALYSIS | |
US6985172B1 (en) | Model-based incident detection system with motion classification | |
DE60020420T2 (en) | Presentation situation display system | |
Alpatov et al. | Vehicle detection and counting system for real-time traffic surveillance | |
EP1854083B1 (en) | Object tracking camera | |
DE102017217056A1 (en) | Method and device for operating a driver assistance system and driver assistance system and motor vehicle | |
Low et al. | Simple robust road lane detection algorithm | |
DE602004011650T2 (en) | Driver assistance system for a motor vehicle | |
DE602005001627T2 (en) | Device for extraction of pedestrians | |
DE102005026876A1 (en) | Vehicle environment monitoring device | |
CN107122765B (en) | Panoramic monitoring method and system for expressway service area | |
EP2973211A1 (en) | Video stream evaluation | |
DE112013001424T5 (en) | Object detection device | |
DE10325762A1 (en) | Image processing system for a vehicle | |
WO2009003793A2 (en) | Device for identifying and/or classifying movement patterns in an image sequence of a surveillance scene, method and computer program | |
DE102018212655A1 (en) | Detection of the intention to move a pedestrian from camera images | |
CN115841651B (en) | Constructor intelligent monitoring system based on computer vision and deep learning | |
DE19937928B4 (en) | Device for detecting a movable body and device for monitoring a motor vehicle | |
EP2483834B1 (en) | Method and apparatus for the recognition of a false object detection in an image | |
DE102009024066A1 (en) | Method and device for classifying situations | |
DE102019122015A1 (en) | METHOD FOR OPERATING AN AUTOMATIC BRAKING SYSTEM | |
EP2254104A2 (en) | Method for automatic recognition of a change in a situation | |
Lagorio et al. | Automatic detection of adverse weather conditions in traffic scenes | |
JP3905774B2 (en) | PATTERN ESTIMATION METHOD, PATTERN ESTIMATION DEVICE, PATTERN ESTIMATION METHOD PROGRAM, AND RECORDING MEDIUM CONTAINING THE PROGRAM | |
DE102019220009A1 (en) | Procedure for recognizing road users |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase |
Free format text: ORIGINAL CODE: 0009012 |
|
17P | Request for examination filed |
Effective date: 20071231 |
|
AK | Designated contracting states |
Kind code of ref document: A1 Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LI LT LU LV MC NL PL PT RO SE SI SK TR |
|
AX | Request for extension of the european patent |
Extension state: HR |
|
RAX | Requested extension states of the european patent have changed |
Extension state: HR Payment date: 20071231 |
|
17Q | First examination report despatched |
Effective date: 20080401 |
|
RAX | Requested extension states of the european patent have changed |
Extension state: HR Payment date: 20071231 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN |
|
18D | Application deemed to be withdrawn |
Effective date: 20110103 |