EP1989680A2 - Bewegungsanalyse in digitalen bildfolgen - Google Patents
- Publication number
- EP1989680A2 (application EP07712336A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- image
- pixels
- pixel
- predecessor
- current
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/14—Picture signal circuitry for video frequency region
- H04N5/144—Movement detection
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
- G06T7/246—Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
Definitions
- The invention relates to the analysis of movements of real objects in digital image sequences.
- In augmented-reality applications it makes sense for real objects visible in the image to influence the image content.
- A simple example of such an application is described in the article by V. Paelke, Ch. Reimann and D. Stichling, "Foot-based Mobile Interaction with Games", ACE 2004, Singapore, June 2004, in which a virtual football is to be kicked by the real foot of the player.
- One of the methods known for this purpose is the determination of edges in the video image and, based on this, the motion analysis of the extracted edges. To determine the edge movement, the edges are first approximated by polygons. This also applies to the above-mentioned article; see p. 2, left column, bottom paragraph.
- In video compression, the movement of fixed-size pixel blocks is determined. Whether this corresponds to the movement of image objects is irrelevant there; these methods are therefore unusable in the context of augmented reality.
- The methods described in more detail below, however, are much simpler, faster and more robust than the previously known methods. They need no model of the object wholly or partly visible in the image and do not require vectorization of edges; they are also relatively insensitive to image noise and other disturbances that hamper conventional edge detection.
- Each individual image of the sequence is first pre-processed by known filters. These filters reduce the colors of the image pixels, reduce noise, and emphasize contours or edges. The type and extent of pre-processing depends on the application. In an application on a handheld device such as a mobile phone with a camera, it proved advantageous to use all of the following filters.
- Colored source images are first converted into grayscale (for example by averaging all color channels of each pixel).
- Highly noisy images can optionally be smoothed by a Gaussian filter; this can be triggered, for example, when a sensor detects low ambient brightness.
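As an illustrative sketch (not taken from the patent itself), the two pre-processing steps above could look as follows in Python with NumPy; the function names and the 3×3 Gaussian kernel are assumptions chosen for brevity:

```python
import numpy as np

def to_grayscale(rgb):
    """Average all color channels of each pixel (H x W x C -> H x W)."""
    return rgb.mean(axis=2)

def gaussian_smooth(gray):
    """Smooth with a small 3x3 Gaussian kernel (an illustrative choice)."""
    kernel = np.array([[1, 2, 1],
                       [2, 4, 2],
                       [1, 2, 1]], dtype=float) / 16.0
    padded = np.pad(gray, 1, mode="edge")
    out = np.zeros_like(gray, dtype=float)
    for dy in range(3):
        for dx in range(3):
            # accumulate the shifted image weighted by the kernel entry
            out += kernel[dy, dx] * padded[dy:dy + gray.shape[0],
                                           dx:dx + gray.shape[1]]
    return out
```

On a real handheld device one would of course use an optimized library routine instead of the explicit loops.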
- An edge image is then created from the grayscale image by a contour filter.
- The Sobel filter is often used for this purpose.
- Alternatively, the Prewitt filter, the Laplace filter or similar filters can be used to generate the edge image.
- Finally, a pure black-and-white image with 1 bit per pixel is produced, i.e. the brightness values are reduced to one bit, so that each pixel is binary: either white (0, "no edge") or black (1, "edge").
- The threshold for this conversion can either be fixed or be determined relative to the mean or median of the gray levels. Pixels of value 1 will hereafter simply be called edge pixels, even though the invention does not vectorize edges but determines the movement without reconstructing edges from pixel movements. Instead of explicitly determining edges, the movement of an image section across two successive images (e.g. for implicit collision detection with a virtual object) is calculated by two interleaved steps that refer only to the pixels of the image, preferably the edge pixels defined above: 1. A motion is calculated for each edge pixel (see step 2). Subsequently, the movements of all edge pixels of the image section are averaged; this mean is the movement of the entire section and thus of an object that lies wholly or partly within the image section.
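A minimal sketch of the edge-image and binarization steps, assuming NumPy, a Sobel gradient magnitude, and the mean of the magnitudes as the relative threshold (one of the options named above); the function names are illustrative:

```python
import numpy as np

def sobel_edges(gray):
    """Gradient magnitude via the Sobel operator."""
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
    ky = kx.T
    p = np.pad(gray, 1, mode="edge")
    h, w = gray.shape
    gx, gy = np.zeros((h, w)), np.zeros((h, w))
    for dy in range(3):
        for dx in range(3):
            win = p[dy:dy + h, dx:dx + w]
            gx += kx[dy, dx] * win
            gy += ky[dy, dx] * win
    return np.hypot(gx, gy)

def binarize(edge_img):
    """1 bit per pixel: 1 ('edge') where the magnitude exceeds the mean."""
    return (edge_img > edge_img.mean()).astype(np.uint8)
```

A fixed threshold or the median of the gray levels could be substituted in `binarize` just as well.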
- Since edge pixels have no attributes (such as brightness, pattern, etc.), there can be no unique association between an edge pixel in the current image and an edge pixel in the previous image. The movement of an edge pixel is therefore calculated relative to the neighboring edge pixels, by determining displacement vectors to the neighboring edge pixels and averaging them.
- The (two-dimensional) displacement vector is the vector from the position of the pixel in the current image to the position of a neighborhood pixel in the previous image.
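The two interleaved steps and the displacement-vector definition above can be sketched as follows. This is one possible reading under stated assumptions (binary edge images, a 3×3 neighborhood), not the patent's reference implementation:

```python
import numpy as np

def section_motion(curr_edges, prev_edges, radius=1):
    """Average motion of an image section from two binary edge images.

    For each edge pixel of the current image, displacement vectors to all
    edge pixels of the previous image within the (2*radius+1)^2 neighborhood
    are averaged; these per-pixel motions are then averaged over the section.
    """
    motions = []
    h, w = curr_edges.shape
    for y, x in zip(*np.nonzero(curr_edges)):
        vecs = []
        for dy in range(-radius, radius + 1):
            for dx in range(-radius, radius + 1):
                py, px = y + dy, x + dx
                if 0 <= py < h and 0 <= px < w and prev_edges[py, px]:
                    # vector from the current position to the position
                    # occupied in the previous image (the text's convention)
                    vecs.append((dx, dy))
        if vecs:
            motions.append(np.mean(vecs, axis=0))
    return np.mean(motions, axis=0) if motions else np.zeros(2)
```

Note that, per the definition above, the vector points from the current position toward the predecessor position; the object's forward motion is its negation.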
- Fig. 1a shows the input image, given as gray levels.
- After applying an edge filter, four pixels remain, as shown in Fig. 1b.
- The pixels are numbered consecutively.
- Fig. 1c shows the situation in the current image, with the positions occupied in the previous image marked by circles.
- The movement is now calculated for each edge pixel in the current image (1', 2', 3' and 4').
- The 9-neighborhood is used, that is, all positions directly or diagonally adjacent to the current position, plus the current position itself; in general, pixels within a given distance are considered.
- Edge pixel 1' has two adjacent edge pixels in the previous image (pixels 1 and 2).
- The average motion M1' of pixel 1' is thus:
- Fig. 2a serves as illustration; for clarity, the black blocks of Fig. 1c have been removed and the circles made smaller. The points whose pixel values have changed are, in this example, points 1 to 3, which are not circled. Point 4 is not taken into account because its pixel value has not changed.
- A distance vector is now formed for each of the (changed) points 1 to 3 and every point set in the previous image within the neighborhood; this is indicated in Fig. 2a for point 1 by arrows and listed in the following table. The mean of these vectors is formed by averaging the x and y values and yields the last column, labeled MW for mean value:
- An overall mean is then formed from the per-point means in the same way, which already represents the result.
- A net upward movement results.
- The value of the actual displacement is (0, -1).
- In the example above, black-and-white images were used in which the black pixels corresponded to filtered edges, and only those black pixels were considered.
- The invention is not limited thereto. If greater accuracy is needed and more processing power is available, the method can also be applied to grayscale or color images. In this case, for a pixel in the current image, those pixels of the predecessor image that are equivalent to it are determined first.
- Equivalent means, for example, that the pixel has the same gray level within a given bound of deviation; for 8-bit images with 256 gray levels, for example, a bound of 8 gray values.
- Alternatively, the grayscale image may be quantized in advance, for example by using only 16 of the 256 possible gray levels and rounding all values to these 16 levels, and then requiring exact equality of the pixel values. The two approaches yield slightly different equivalences because the quantization differs.
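Both equivalence tests can be sketched in a few lines; the bound of 8 and the 16-level quantization follow the examples above, while the function names are illustrative:

```python
import numpy as np

def equivalent_by_bound(a, b, bound=8):
    """Equivalence of two 8-bit gray values within a deviation bound."""
    return abs(int(a) - int(b)) <= bound

def quantize16(gray):
    """Round 256 gray levels down to 16 representative levels."""
    return (np.asarray(gray) // 16) * 16
```

With `quantize16`, equivalence is then simply exact equality of the quantized values.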
- In the black-and-white example, quantization to 1 bit was performed before the determination of the equivalent pixels, and the white pixels were not used: the image was first quantized, and then only pixels within a predetermined interval, here only the black pixels, were used. Since the color or gray value is only one bit here, only exact equality of the pixel values is meaningful.
- The invention can be used in an augmented-reality application to effect interaction between real and virtual objects with little computational effort.
- For example, a mobile phone is used that has a camera on the back and a screen on the front, and the virtual object is, as in the article mentioned above, a ball.
- The invention thus provides a substantially improved method for detecting the movement of a real foot and for detecting a kick toward the virtual ball.
- The known methods described in the above-mentioned article could only run in real time by delegating the work to more powerful computers connected via a network.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Image Analysis (AREA)
Abstract
Description
Claims
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE102006009774A DE102006009774B4 (de) | 2006-03-01 | 2006-03-01 | Bewegungsanalyse in digitalen Bildfolgen |
PCT/EP2007/051847 WO2007099099A2 (de) | 2006-03-01 | 2007-02-27 | Bewegungsanalyse in digitalen bildfolgen |
Publications (1)
Publication Number | Publication Date |
---|---|
EP1989680A2 true EP1989680A2 (de) | 2008-11-12 |
Family
ID=38329206
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP07712336A Withdrawn EP1989680A2 (de) | 2006-03-01 | 2007-02-27 | Bewegungsanalyse in digitalen bildfolgen |
Country Status (4)
Country | Link |
---|---|
US (1) | US20090169059A1 (de) |
EP (1) | EP1989680A2 (de) |
DE (1) | DE102006009774B4 (de) |
WO (1) | WO2007099099A2 (de) |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2011233039A (ja) * | 2010-04-28 | 2011-11-17 | Sony Corp | 画像処理装置、画像処理方法、撮像装置、およびプログラム |
JP2013041387A (ja) * | 2011-08-15 | 2013-02-28 | Sony Corp | 画像処理装置、画像処理方法、撮像装置、電子機器、及び、プログラム |
KR102369802B1 (ko) * | 2017-07-13 | 2022-03-04 | 한화디펜스 주식회사 | 영상 처리 시스템 및 그에 의한 영상 처리 방법 |
US10262220B1 (en) * | 2018-08-20 | 2019-04-16 | Capital One Services, Llc | Image analysis and processing pipeline with real-time feedback and autocapture capabilities, and visualization and configuration system |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6826294B1 (en) * | 1999-03-05 | 2004-11-30 | Koninklijke Philips Electronics N.V. | Block matching motion estimation using reduced precision clustered predictions |
EP1395061A1 (de) * | 2002-08-27 | 2004-03-03 | Mitsubishi Electric Information Technology Centre Europe B.V. | Verfahren und Vorrichtung zur Kompensation von falschen Bewegunsvektoren in Videodaten |
-
2006
- 2006-03-01 DE DE102006009774A patent/DE102006009774B4/de not_active Expired - Fee Related
-
2007
- 2007-02-27 EP EP07712336A patent/EP1989680A2/de not_active Withdrawn
- 2007-02-27 US US12/224,520 patent/US20090169059A1/en not_active Abandoned
- 2007-02-27 WO PCT/EP2007/051847 patent/WO2007099099A2/de active Application Filing
Non-Patent Citations (1)
Title |
---|
See references of WO2007099099A2 * |
Also Published As
Publication number | Publication date |
---|---|
DE102006009774B4 (de) | 2007-10-18 |
DE102006009774A1 (de) | 2007-09-06 |
US20090169059A1 (en) | 2009-07-02 |
WO2007099099A3 (de) | 2008-03-20 |
WO2007099099A2 (de) | 2007-09-07 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
DE102020131265A1 (de) | Segmentieren von video-rahmen unter verwendung eines neuronalen netzes mit verringerter auflösung und von masken aus vorhergehenden rahmen | |
DE112017002799T5 (de) | Verfahren und system zum generieren multimodaler digitaler bilder | |
DE102020201787A1 (de) | Erzeugen von blickkorrigierten bildern unter verwendung eines bidirektional trainierten netzwerks | |
DE60111851T2 (de) | Videobildsegmentierungsverfahren unter verwendung von elementären objekten | |
DE102018130924A1 (de) | Systeme und Verfahren zur dynamischen Gesichtsanalyse mittels eines rekurrenten neuronalen Netzes | |
DE112016004731T5 (de) | Erweitern von Mehrfachansicht-Bilddaten mit synthetischen Objekten unter Verwendung von IMU und Bilddaten | |
EP2284795A2 (de) | Quantitative Analyse, Visualisierung und Bewegungskorrektur in dynamischen Prozessen | |
DE102006030709A1 (de) | Verfahren für die kenntnisbasierte Bildsegmentierung unter Verwendung von Formmodellen | |
DE112014006439B4 (de) | Kantenerkennungsvorrichtung, Kantenerkennungsverfahren und Programm | |
DE112013002200T5 (de) | Automatische Anpassung von Bildern | |
EP2430614A1 (de) | Verfahren zur echtzeitfähigen, rechnergestützten analyse einer eine veränderliche pose enthaltenden bildsequenz | |
DE102010046507A1 (de) | Berechnung der Detailstufe für die anisotrope Filterung | |
DE102015009820A1 (de) | Bildsegmentierung für eine Kamera-Liveeinspielung | |
CN109685045A (zh) | 一种运动目标视频跟踪方法及系统 | |
DE102010016251A1 (de) | Erkennungsverfahren für ein bewegliches Objekt und das der Erkennung des beweglichen Objekts zugrunde liegende Befehlseingabeverfahren | |
DE112007001789T5 (de) | Bestimmung und Verwendung einer dominanten Linie eines Bildes | |
DE102020133245A1 (de) | Tiefenbasierte 3D-Rekonstruktion unter Verwendung einer a-priori-Tiefenszene | |
Malik et al. | Llrnet: A multiscale subband learning approach for low light image restoration | |
DE102006009774B4 (de) | Bewegungsanalyse in digitalen Bildfolgen | |
DE102016109660A1 (de) | Clusterungsverfahren und -system, entsprechende Vorrichtung und entsprechendes Computerprogrammprodukt | |
EP2893510B1 (de) | Verfahren und bildverarbeitungsanlage zum entfernen eines visuellen objektes aus einem bild | |
DE102004026782A1 (de) | Verfahren und Vorrichtung zur rechnergestützten Bewegungsschätzung in mindestens zwei zeitlich aufeinander folgenden digitalen Bildern, computerlesbares Speichermedium und Computerprogramm-Element | |
Cai et al. | Image Blur Assessment with Feature Points. | |
DE112017007162T5 (de) | Gesichtsermittlungsvorrichtung, dazugehöriges Steuerungsverfahren und Programm | |
EP4118620A1 (de) | Verfahren und vorrichtung zum verarbeiten von bildern |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase |
Free format text: ORIGINAL CODE: 0009012 |
|
17P | Request for examination filed |
Effective date: 20080828 |
|
AK | Designated contracting states |
Kind code of ref document: A2 Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LI LT LU LV MC NL PL PT RO SE SI SK TR |
|
RIN1 | Information on inventor provided before grant (corrected) |
Inventor name: REIMANN, CHRISTIAN Inventor name: STICHLING, DIRK Inventor name: KLEINJOHANN, BERND Inventor name: SCHNEIDER, CHRISTIAN |
|
RAP1 | Party data changed (applicant data changed or rights of an application transferred) |
Owner name: UNIVERSITAET PADERBORN Owner name: SIEMENS AKTIENGESELLSCHAFT |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN |
|
18D | Application deemed to be withdrawn |
Effective date: 20110901 |